Understanding loss functions: MSE and cross-entropy error

Loss function and cost function: my current understanding is that the loss function is the cost function, and gradient descent is performed on the loss function to find the optimal solution.

Loss functions: depending on the target model, they are divided into regression loss functions and classification loss functions (such as those used for logistic regression).

            MSE loss function: the mean square error measures the distance between features; the goal of feature extraction is that the extracted features be consistent with the target. MAE (mean absolute error) is a related loss function that follows the same idea.
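The two regression losses above can be sketched as follows; this is a minimal NumPy illustration, and the function names are my own, not from the original post:

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean square error: average of squared differences between targets and predictions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    """Mean absolute error: average of absolute differences; less sensitive to outliers."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred))
```

Because MSE squares each error, a single large deviation dominates the loss, while MAE weights all deviations linearly.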

            Cross-entropy loss function: cross entropy evolved from the expected coding length under the true distribution (see https://www.cnblogs.com/ljy2013/p/6432269.html). Cross-entropy error measures the similarity between two probability distributions p and q: H(p, q) = -Σ p(x) · log q(x). This property makes it suitable for measuring the importance of variables, so cross entropy is commonly used in classification. Its expression is a sum over categories of each category label multiplied by the log of the corresponding predicted probability. Other classification loss functions, such as the 0-1 loss, can after modification be expressed in the same form, giving the cross-entropy loss.
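The formula H(p, q) = -Σ p(x) · log q(x) can be sketched as a short NumPy function; this is an illustrative implementation (the `eps` clipping term is my own addition to avoid log(0)):

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross entropy H(p, q) = -sum_i p_i * log(q_i).

    p: the true distribution (often a one-hot label vector in classification).
    q: the predicted probability distribution.
    eps: small constant added inside the log to avoid log(0).
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q + eps))

# With a one-hot true label, only the log-probability of the correct
# class contributes, so the loss reduces to -log(q[true_class]).
```

The closer q places its probability mass to the true class, the smaller the loss, which is exactly the "similarity between p and q" the text describes.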


Origin www.cnblogs.com/xiaoheizi-12345/p/12129947.html