【DSSM】Learning Deep Structured Semantic Models for Web Search using Clickthrough Data

Latent semantic models, such as LSA, intend to map a query to its relevant documents at the semantic level where keyword-based matching often fails. In this study we strive to develop a series of new latent semantic models with a deep structure that project queries and documents into a common low-dimensional space where the relevance of a document given a query is readily computed as the distance between them. The proposed deep structured semantic models are discriminatively trained by maximizing the conditional likelihood of the clicked documents given a query using the clickthrough data. To make our models applicable to large-scale Web search applications, we also use a technique called word hashing, which is shown to effectively scale up our semantic models to handle large vocabularies which are common in such tasks. The new models are evaluated on a Web document ranking task using a real-world data set. Results show that our best model significantly outperforms other latent semantic models, which were considered state-of-the-art in the performance prior to the work presented in this paper.
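The word hashing mentioned in the abstract represents each word as a bag of letter n-grams (letter trigrams in the paper), which shrinks the input layer from a vocabulary of hundreds of thousands of words to roughly 30k trigrams. A minimal sketch of this idea, assuming a pre-built trigram-to-index vocabulary (the helper names here are illustrative, not from the paper):

```python
def letter_trigrams(word):
    """Break a word into letter trigrams after adding boundary marks,
    e.g. 'good' -> ['#go', 'goo', 'ood', 'od#']."""
    marked = f"#{word}#"
    return [marked[i:i + 3] for i in range(len(marked) - 2)]

def word_hash(text, vocab):
    """Bag-of-letter-trigrams vector for a piece of text.
    `vocab` maps each trigram to an index (an assumed, pre-built vocabulary)."""
    vec = [0] * len(vocab)
    for word in text.lower().split():
        for tri in letter_trigrams(word):
            if tri in vocab:
                vec[vocab[tri]] += 1
    return vec
```

This trigram vector is what the query tower and the document tower take as input before the fully connected layers.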

Cosine similarity is the measure most commonly used to compute the relevance between the query and the document in the learned semantic space.
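Concretely, the paper defines the relevance of a document D to a query Q as the cosine similarity of the semantic vectors y_Q and y_D produced by the two deep networks:

$$
R(Q, D) = \cos(y_Q, y_D) = \frac{y_Q^{\top} y_D}{\lVert y_Q \rVert \, \lVert y_D \rVert}
$$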

For each query, a few unclicked documents are randomly sampled as negatives (the paper uses four), and the model is trained to maximize the probability of the clicked (positive) document over this candidate set, as shown below.
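In the paper, the relevance score is turned into a posterior probability of a document given the query through a softmax, where γ is a smoothing factor and the candidate set **D** contains the clicked document D⁺ plus the sampled unclicked documents; training minimizes the negative log-likelihood of the clicked documents over the clickthrough data:

$$
P(D \mid Q) = \frac{\exp\left(\gamma \, R(Q, D)\right)}{\sum_{D' \in \mathbf{D}} \exp\left(\gamma \, R(Q, D')\right)},
\qquad
L(\Lambda) = -\log \prod_{(Q, D^{+})} P(D^{+} \mid Q)
$$

A minimal NumPy sketch of this objective for a single query, assuming y_q, y_pos, and y_negs are the semantic vectors already produced by the query and document towers (the function names and the default gamma value are illustrative, not from the paper):

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two vectors.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def dssm_loss(y_q, y_pos, y_negs, gamma=10.0):
    """Negative log-likelihood of the clicked document, given the query vector,
    the clicked document vector, and a list of sampled unclicked document vectors."""
    scores = np.array([cosine(y_q, y_pos)] + [cosine(y_q, y_n) for y_n in y_negs])
    logits = gamma * scores
    # Softmax over the candidate set; index 0 is the clicked document.
    log_prob_pos = logits[0] - np.log(np.sum(np.exp(logits)))
    return -log_prob_pos
```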

Origin: blog.csdn.net/WitsMakeMen/article/details/131523643