Summary of miscellaneous machine learning knowledge points

  1. Cross-entropy can be used as a loss function in neural networks (and in machine learning generally). Here p denotes the distribution of the true labels and q the label distribution predicted by the model; the cross-entropy between them measures how closely q matches p. A further benefit of cross-entropy over mean squared error, when the output unit is a sigmoid, is that the gradient no longer contains the sigmoid's derivative: with MSE that derivative term shrinks toward zero when the neuron saturates, so learning slows down even when the error is large, whereas with cross-entropy the gradient is proportional to the output error, so larger errors produce larger updates (see the sketch below). In feature engineering, cross-entropy can also be used to measure the similarity between two random variables (distributions).
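
To make the gradient argument concrete, here is a minimal NumPy sketch (not from the original article; the helper names `sigmoid` and `cross_entropy` are illustrative). It compares the gradient of the MSE loss and the binary cross-entropy loss with respect to the pre-activation z of a single sigmoid neuron, under the standard derivations dL/dz = (a - y) * sigmoid'(z) for MSE and dL/dz = (a - y) for cross-entropy.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(y, a, eps=1e-12):
    # Binary cross-entropy between true label y and prediction a.
    a = np.clip(a, eps, 1 - eps)
    return -(y * np.log(a) + (1 - y) * np.log(1 - a))

# Gradients w.r.t. the pre-activation z of one sigmoid neuron:
#   MSE loss (a - y)^2 / 2   -> dL/dz = (a - y) * sigmoid'(z)
#   cross-entropy loss       -> dL/dz = (a - y)
# When the neuron saturates (|z| is large), sigmoid'(z) ~ 0 and the MSE
# gradient vanishes even though the prediction is badly wrong; the
# cross-entropy gradient stays proportional to the output error.
y, z = 1.0, -5.0            # true label is 1, but the neuron is saturated low
a = sigmoid(z)
grad_mse = (a - y) * a * (1 - a)
grad_ce = a - y

print(f"prediction a                  = {a:.4f}")
print(f"MSE gradient dL/dz            = {grad_mse:.6f}")  # tiny: learning stalls
print(f"cross-entropy gradient dL/dz  = {grad_ce:.6f}")   # large: learns quickly
```

Running this prints a prediction of roughly 0.0067, an MSE gradient on the order of -0.007, and a cross-entropy gradient of about -0.993, illustrating why cross-entropy recovers faster from confident wrong predictions.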
