Notes on sorting out the main threads of deep learning: Part 2, loss functions

  • Main thread
1. Architecture
2. Loss
3. Optimization

Questions:
Why use softmax?
What problems does softmax have?
What is the meaning of the cross-entropy loss? (See the sketch below.)
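A minimal NumPy sketch of both ideas, assuming the standard definitions (the numbers here are my own example): softmax turns the raw scores (logits) of the last layer into a probability distribution over classes, and cross-entropy is simply the negative log of the probability assigned to the true class.

```python
import numpy as np

def softmax(logits):
    """Turn raw scores into a probability distribution that sums to 1."""
    shifted = logits - np.max(logits)        # subtract the max for numerical stability
    exp = np.exp(shifted)
    return exp / np.sum(exp)

def cross_entropy(probs, true_class):
    """Negative log-probability of the true class: small when the model is confident and right."""
    return -np.log(probs[true_class])

logits = np.array([2.0, 1.0, 0.1])           # raw outputs of the last layer for 3 classes
probs = softmax(logits)                      # about [0.66, 0.24, 0.10]
print(probs, cross_entropy(probs, true_class=0))
```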

  • Digression: once the principle is clear, it can be improved.
When using softmax, there is an implicit assumption that each object belongs to exactly one class.
For example, in face recognition, each face corresponds to exactly one person.

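To make that single-label assumption concrete, here is a small illustrative sketch (my own, under the standard definitions): softmax makes the class probabilities compete and sum to 1, so it suits "exactly one class per sample", while an independent sigmoid per class lets several labels be active at once.

```python
import numpy as np

logits = np.array([3.0, 2.5, -1.0])     # scores for three classes/attributes

# Softmax: probabilities compete and sum to 1 -> suited to "exactly one class per sample".
softmax_probs = np.exp(logits) / np.sum(np.exp(logits))
print(softmax_probs, softmax_probs.sum())        # sums to 1.0

# Sigmoid: each class gets an independent probability -> several can exceed 0.5 (multi-label).
sigmoid_probs = 1.0 / (1.0 + np.exp(-logits))
print(sigmoid_probs)                             # about [0.95, 0.92, 0.27]
```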

Things to think about:

  1. The difference between the sigmoid loss for binary classification and the softmax loss for multi-class classification (see the sketch after this list).
  2. The difference between sigmoid and softmax.
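One way to approach question 1 (a sketch under the standard formulations, not the post's own derivation): the sigmoid binary cross-entropy on a single logit x gives exactly the same loss as a two-class softmax cross-entropy on the logits [0, x], so the softmax loss can be read as the multi-class generalization of the sigmoid loss.

```python
import numpy as np

def sigmoid_bce(x, y):
    """Binary cross-entropy on a single logit x with label y in {0, 1}."""
    p = 1.0 / (1.0 + np.exp(-x))
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def softmax_ce(logits, true_class):
    """Multi-class cross-entropy on a vector of logits."""
    probs = np.exp(logits - np.max(logits))
    probs /= probs.sum()
    return -np.log(probs[true_class])

x, y = 1.7, 1
print(sigmoid_bce(x, y))                              # about 0.168
print(softmax_ce(np.array([0.0, x]), true_class=y))   # same value: sigmoid == 2-way softmax
```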

Euclidean loss

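A minimal sketch of the Euclidean (L2) loss, assuming the usual half-squared-distance definition: it penalizes the squared difference between the prediction and the target, which is why it is typically used for regression rather than for classification over probabilities.

```python
import numpy as np

def euclidean_loss(pred, target):
    """Euclidean (L2) loss: half the squared distance between prediction and target."""
    diff = pred - target
    return 0.5 * np.sum(diff ** 2)

pred   = np.array([0.8, 0.1, 0.1])
target = np.array([1.0, 0.0, 0.0])
print(euclidean_loss(pred, target))     # 0.5 * (0.04 + 0.01 + 0.01) = 0.03
```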

  • Loss weighting function

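As a hedged sketch of what loss weighting typically looks like (my own example; the exact scheme the post has in mind may differ), each class gets a weight inside the cross-entropy so that rare or important classes contribute more to the total loss.

```python
import numpy as np

def weighted_cross_entropy(logits, true_class, class_weights):
    """Cross-entropy scaled by a per-class weight, e.g. to counter class imbalance."""
    probs = np.exp(logits - np.max(logits))
    probs /= probs.sum()
    return class_weights[true_class] * (-np.log(probs[true_class]))

logits = np.array([2.0, 0.5, 0.3])
class_weights = np.array([1.0, 5.0, 5.0])    # rare classes 1 and 2 count five times as much
print(weighted_cross_entropy(logits, true_class=1, class_weights=class_weights))
```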

  • Sample weights

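Sample weighting works the same way but per example rather than per class; a small illustrative sketch (my own, under common conventions): each training sample carries its own weight, e.g. to emphasize hard or trusted examples, and the batch loss is the weighted average.

```python
import numpy as np

def batch_weighted_loss(per_sample_losses, sample_weights):
    """Weighted average of per-sample losses; a higher weight gives a sample more influence."""
    w = np.asarray(sample_weights, dtype=float)
    return np.sum(w * per_sample_losses) / np.sum(w)

losses  = np.array([0.2, 1.5, 0.7])     # per-sample cross-entropy values in one batch
weights = np.array([1.0, 2.0, 0.5])     # e.g. the hard second sample counts double
print(batch_weighted_loss(losses, weights))
```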

  • Improving the softmax loss
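One commonly used improvement (my example here, not necessarily the one the post has in mind) is label smoothing: the one-hot target is softened so the softmax loss stops pushing the model toward arbitrarily confident predictions.

```python
import numpy as np

def label_smoothing_ce(logits, true_class, eps=0.1):
    """Cross-entropy against a smoothed target instead of a hard one-hot vector."""
    n = len(logits)
    probs = np.exp(logits - np.max(logits))
    probs /= probs.sum()
    target = np.full(n, eps / n)        # spread eps of the probability mass uniformly
    target[true_class] += 1.0 - eps     # the rest stays on the true class
    return -np.sum(target * np.log(probs))

logits = np.array([4.0, 1.0, 0.5])
print(label_smoothing_ce(logits, true_class=0))   # slightly higher than the plain CE, by design
```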

---

Source: blog.csdn.net/chen_holy/article/details/91492766