1 Introduction

Word2vec is an open-source toolkit for learning word vectors, released by Google in 2013. It comprises a family of models for producing word embeddings.

A Word2vec model is typically trained as a shallow (two-layer) neural network. It takes a large-scale corpus as input and produces a vector space, usually of a few hundred dimensions. Each word in the vocabulary is assigned a unique vector in this space, and words that share common contexts in the corpus are mapped close to one another.
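To make this concrete, here is a minimal sketch of training word vectors with the gensim library (an assumption; the post itself does not name a toolkit beyond Google's original word2vec). The toy corpus and parameter values are illustrative only; a real run would use a large corpus and a few hundred dimensions.

```python
# Minimal sketch of training Word2vec-style vectors with gensim
# (gensim is an assumption; the post does not prescribe a specific library).
from gensim.models import Word2Vec

# Toy corpus: each "sentence" is a list of tokens. In practice the input
# would be a large-scale corpus, as described above.
sentences = [
    ["the", "cat", "sits", "on", "the", "mat"],
    ["the", "dog", "sits", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "animals"],
]

# vector_size: dimensionality of the vector space (a few hundred in practice,
# kept small here for the toy corpus). sg=1 selects skip-gram; sg=0 is CBOW.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Each word in the vocabulary maps to a unique vector in the space.
vec = model.wv["cat"]
print(vec.shape)  # (50,)

# Words that share contexts in the corpus end up close in the vector space.
print(model.wv.most_similar("cat", topn=3))
```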



