[Notes] Stanford Engineering CS224n Lecture 2

Concentrate on understanding (deep, multi-layer) neural networks and how they can be trained (learn from data) using backpropagation (the judicious application of matrix calculus).
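The "matrix calculus" point can be made concrete with a minimal sketch (my own illustration, not the course's code): for a one-hidden-layer network, the gradient of the loss with respect to the weight matrix is an outer product of an error vector and the input, and it can be checked numerically.

```python
import numpy as np

# Minimal backprop-as-matrix-calculus sketch (illustrative assumption:
# one hidden tanh layer, scalar squared-error-style loss).
np.random.seed(0)
x = np.random.randn(4)        # input vector
W = np.random.randn(3, 4)     # hidden-layer weights
u = np.random.randn(3)        # output weights

def loss(W):
    h = np.tanh(W @ x)        # hidden activations
    return 0.5 * (u @ h) ** 2 # scalar loss

# Backward pass: chain rule written with vectors and an outer product.
h = np.tanh(W @ x)
s = u @ h
delta = s * u * (1 - h ** 2)  # dL/dz at the pre-activation z = W @ x
grad_W = np.outer(delta, x)   # dL/dW, shape (3, 4)

# Sanity check one entry against a finite-difference gradient.
eps = 1e-6
Wp = W.copy(); Wp[1, 2] += eps
num = (loss(Wp) - loss(W)) / eps
print(abs(num - grad_W[1, 2]) < 1e-4)
```

The numerical check is the standard way to verify a hand-derived gradient before trusting it in training.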

Look at an NLP classifier that adds context by taking in windows around a word and classifying the center word (not just representing it across all windows).
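A window classifier can be sketched as follows (a hypothetical toy setup; the vocabulary, dimensions, and label count are my illustrative assumptions): concatenate the word vectors in a fixed window around the center word and feed them through a softmax layer.

```python
import numpy as np

# Toy window-based classifier sketch: the vectors of the words in a
# window are concatenated and scored with a softmax over labels.
np.random.seed(0)
vocab = {"museums": 0, "in": 1, "paris": 2, "are": 3, "amazing": 4}
d, num_classes, radius = 5, 2, 2              # embedding dim, labels, window radius
E = np.random.randn(len(vocab), d)            # word-embedding matrix
W = np.random.randn(num_classes, d * (2 * radius + 1))
b = np.zeros(num_classes)

def classify_center(words, center):
    # Gather the window [center-radius, ..., center+radius] of word ids
    # and concatenate their embeddings into one long feature vector.
    idxs = [vocab[words[i]] for i in range(center - radius, center + radius + 1)]
    x = np.concatenate([E[i] for i in idxs])  # shape (d * (2*radius+1),)
    scores = W @ x + b
    probs = np.exp(scores - scores.max())
    return probs / probs.sum()                # softmax over the labels

sent = ["museums", "in", "paris", "are", "amazing"]
p = classify_center(sent, center=2)           # classify "paris" in context
print(p.shape, float(p.sum()))
```

The weights here are random, so the output distribution is meaningless until trained; the point is only the shape of the computation.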

We learn about putting syntactic structure (dependency parses) over sentences.
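A dependency parse is often represented simply as a head index per word (0 standing for ROOT); a tiny hand-built example of that representation, with an invented sentence:

```python
# Illustrative: a dependency parse stored as head indices, 0 = ROOT.
sent = ["I", "ate", "fish"]
heads = [2, 0, 2]   # "ate" heads both "I" and "fish"; "ate" attaches to ROOT
deps = [(sent[h - 1] if h else "ROOT", sent[i]) for i, h in enumerate(heads)]
print(deps)  # [('ate', 'I'), ('ROOT', 'ate'), ('ate', 'fish')]
```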

Develop the notion of the probability of a sentence (a probabilistic language model) and why it is really useful.
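The idea of "the probability of a sentence" can be sketched with a toy count-based bigram model (the corpus is my invented example, and no smoothing is applied, so unseen bigrams get probability zero): score a sentence with the chain rule, P(w1..wn) ≈ ∏ P(w_i | w_{i-1}).

```python
from collections import Counter

# Toy bigram language model: estimate P(word | prev) from counts,
# then multiply conditional probabilities along the sentence.
corpus = "the cat sat on the mat . the cat ran .".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def p_bigram(prev, word):
    # Maximum-likelihood estimate: count(prev, word) / count(prev).
    return bigrams[(prev, word)] / unigrams[prev]

def sentence_prob(words):
    p = 1.0
    for prev, word in zip(words, words[1:]):
        p *= p_bigram(prev, word)
    return p

# P(cat | the) = 2/3, P(sat | cat) = 1/2, so the product is 1/3.
print(sentence_prob(["the", "cat", "sat"]))  # 0.333...
```

Real language models smooth these estimates (or replace counts with a neural network) precisely because raw counts assign zero to anything unseen.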


Reposted from blog.csdn.net/deardeerluluu/article/details/89280970