Deep learning (21) - loss and acc not budging during training
1. Background description

Recently I've been working on time series, and my thinking was perfectly clear. The models were all set up, but the loss and acc wouldn't budge at all. It's still a classification task, built on code I had debugged before, except this time I wrote it out by hand (very confident in myself!). And then this happened.

2. Troubleshooting

Trust me, the cause of a bug like this can always be found. I stripped out the time-series part and ran the classification with just my handwritten code, and the same thing happened. I was numb. Fine, debug it! I stepped through the loss again and again, checked and checked, found nothing, and a whole day passed. This morning I went through the code one more time, and oh no, how stupid of me!
My loss: I was computing the loss between the predicted label and the real label.

_, output = model(images.to(device))
pred_class = torch.sigmoid(output).gt(0.5).int()
accu_num += torch.eq(pred_class.squeeze_(), labels.to(device)).sum()  # number of correct predictions
loss = loss_function(pred_class.float(), labels.to(device).float())   # the bug: loss on hard 0/1 predictions


loss = loss_function(pred_class.float(), labels.to(device).float())
Ask yourself: can the loss really be written like this??? Isn't this just bullying the machine??? The machine's inner monologue: My lord! All you give me are labels of 0 and 1, what can I possibly learn?? Aren't you making things hard for me?? The thresholded 0/1 predictions come out of a comparison, so no gradient can flow back through them, and the model has nothing to learn from.
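A minimal standalone sketch of what the machine is complaining about (made-up tensors stand in for the model's output here): the gt(0.5) comparison detaches its result from the autograd graph, so a loss built on pred_class carries no gradient at all.

import torch

output = torch.randn(4, requires_grad=True)   # stands in for the model's raw output
prob = torch.sigmoid(output)                  # still differentiable
pred_class = prob.gt(0.5).float()             # hard 0/1 decisions

print(prob.requires_grad)         # True  -- gradients can flow back to the model
print(pred_class.requires_grad)   # False -- the comparison cuts the graph here

labels = torch.tensor([0., 1., 1., 0.])
loss = torch.nn.functional.binary_cross_entropy(pred_class, labels)
print(loss.requires_grad)         # False -- backward() would have nothing to update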

3. Come on! Solve the problem!

The predicted classification results (discrete values like 0, 1, 2, 3, 4...) should only be used to count how many predictions the model got right. The loss has to be backpropagated so the model can learn how to change, so it must be the difference between the raw output and the true values! The correct way to write it:
loss = loss_function(output.float(), labels.to(device).float())
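Putting it together, here is a minimal self-contained sketch of the corrected training step. The tiny nn.Linear model, the random images/labels, the Adam optimizer, and nn.BCEWithLogitsLoss are stand-ins I've assumed so the snippet runs on its own; BCEWithLogitsLoss matches passing the raw output directly, whereas with a plain BCELoss you would pass torch.sigmoid(output) instead.

import torch
import torch.nn as nn

# Assumed stand-ins so the sketch runs by itself; in the real code these come
# from the surrounding training loop and model definition.
device = torch.device("cpu")
model = nn.Linear(8, 1)                                    # hypothetical tiny binary classifier
images, labels = torch.randn(4, 8), torch.randint(0, 2, (4,))
loss_function = nn.BCEWithLogitsLoss()                     # assumed: takes raw logits
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
accu_num = torch.zeros(1)

output = model(images.to(device)).squeeze(1)               # raw, differentiable output
pred_class = torch.sigmoid(output).gt(0.5).int()           # hard 0/1, for counting only
accu_num += torch.eq(pred_class, labels.to(device)).sum()  # number of correct predictions

# The loss is computed on the raw output, never on pred_class,
# so gradients can reach the model's parameters.
loss = loss_function(output.float(), labels.to(device).float())
optimizer.zero_grad()
loss.backward()
optimizer.step()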
886, I just hope I'll make fewer low-level mistakes like this in the future!

Origin blog.csdn.net/qq_43368987/article/details/129633594