Machine Learning: What is Cross Entropy?

There are many accurate, professional explanations of Cross Entropy on the Internet, but can you explain what Cross Entropy is in a simple way?

PS: This article is just a record and discussion of my learning process.

Cross Entropy: a loss function generally used to quantify the difference between two probability distributions (mostly in classification problems).

For example, in a multi-classification problem, the true distribution p of an object belonging to different classes is as follows:

(figure: true classification probability p)

That is, the true label says the object belongs to Class B.

Then you hand the object to a machine learning prediction model, and its predicted distribution q is as follows:

(figure: predicted classification probability q)

You frown: how far is my prediction from the actual result? (This is exactly what Cross Entropy measures.)

So you pull out a formula (for the full derivation, see reference [2]):

$$H(p, q) = -\sum_{i} p(x_i)\log q(x_i)$$
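In case the formula feels abstract, here is a minimal sketch of it in Python (assuming the natural logarithm, which is what most ML libraries use; a different base only rescales the result):

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p(x_i) * log(q(x_i)).

    p is the true distribution (often one-hot), q is the predicted one.
    Terms with p_i == 0 contribute nothing, so they are skipped to
    avoid evaluating log(0).
    """
    return -sum(p_i * math.log(q_i) for p_i, q_i in zip(p, q) if p_i > 0)
```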

Plug the two distributions above into the formula and calculate by hand. Since the true distribution p puts all of its probability on Class B, every term in the sum vanishes except the Class B one, so the whole calculation collapses to $H(p, q) = -\log q(B)$.

So the total Cross Entropy loss of your prediction model comes out to 0.479, which quantifies how far your prediction is from the true distribution.
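The two distribution tables above were images in the original post, so the exact numbers are not preserved here. But since p puts probability 1 on Class B, the loss depends only on q(B): e^(-0.479) ≈ 0.62, so the model must have assigned Class B a probability of about 0.62. The q values below are assumptions chosen to reproduce the quoted figure:

```python
import math

p = [0.0, 1.0, 0.0]          # true distribution: the object is Class B
q = [0.228, 0.619, 0.153]    # assumed prediction, consistent with the stated loss

# p is one-hot, so only the Class B term of the sum survives:
loss = -math.log(q[1])       # -log(0.619) ≈ 0.48, the 0.479 above up to rounding
print(loss)
```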

That covers the basic concept and usage of Cross Entropy. If you want to dig deeper into the underlying principles, have a look at the reference materials (Cross-Entropy - rtygbwwwerr's column - CSDN Blog).

In addition, Cross Entropy is only one kind of loss function (it is commonly used in classification problems); which one to use depends on your model and task. After computing the Cross Entropy loss, you can apply Gradient Descent to update the model parameters, as sketched below.
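For a feel of what that update looks like, here is a hedged sketch in numpy (all values are hypothetical). For softmax followed by cross entropy, the gradient of the loss with respect to the raw scores (logits) takes the well-known simple form q - p. In a real network this gradient would be backpropagated into the weights; updating the logits directly below is just to illustrate a single descent step.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())      # subtract the max for numerical stability
    return e / e.sum()

z = np.array([1.0, 2.0, 0.5])    # hypothetical raw network outputs (logits)
p = np.array([0.0, 1.0, 0.0])    # one-hot target: the true class is B

q = softmax(z)
loss = -np.sum(p * np.log(q))    # cross-entropy loss, ~0.464 here

grad = q - p                     # d(loss)/dz for softmax + cross entropy
z -= 0.1 * grad                  # one gradient-descent step (learning rate 0.1)

print(loss, -np.log(softmax(z)[1]))   # the loss decreases: ~0.464 -> ~0.444
```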

Finally, a note on Softmax and one-hot encoding, two techniques also commonly used in multi-classification problems. Softmax normalizes the outputs of a neural network into the (0, 1) interval (and makes them sum to 1), so the Softmax output can be treated as a probability distribution and plugged straight into the Cross Entropy calculation above.
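A quick, self-contained illustration of both ideas (the raw scores are made up):

```python
import numpy as np

logits = np.array([2.0, 1.0, 0.1])             # hypothetical raw network outputs
probs = np.exp(logits) / np.exp(logits).sum()  # softmax: each value in (0, 1)
print(probs, probs.sum())                      # ≈ [0.659 0.242 0.099], sums to 1.0

one_hot = np.eye(3)[1]                         # one-hot label for Class B: [0. 1. 0.]
loss = -np.sum(one_hot * np.log(probs))        # cross entropy, ≈ 1.417 here
```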

References:

[1] Cross-entropy cost function - wepon's column - CSDN Blog

[2] Cross-Entropy - rtygbwwwerr's column - CSDN Blog

[3] A Brief Talk on Cross Entropy Loss

[4] What is cross-entropy?

[5] Weekend Q&A 0: What is Cross Entropy?

Original: Notes | What is Cross Entropy? (blog.csdn.net/u013288190/article/details/124442886)

Welcome to follow my WeChat official account: Microprogram School
