Continuously reducing the learning rate to improve training accuracy

       After reading papers related to DCGAN, I came across a section explaining that training accuracy can be improved by continuously decaying the learning rate. This post briefly demonstrates the effect through experiments. It does not use the decay formula from the paper; instead, the learning rate is simply shrunk by a fixed factor.

       The figure above shows the decay formula from the paper, where decay_rate is a hyperparameter; the value given in the paper is 0.95. Next, a brief overview of the overall setup: the dataset is CIFAR-10, which has ten classes, and training runs for 5 epochs in total. The code is adapted from another user's Kaggle notebook.
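Since the figure itself is not reproduced here, the formula it most likely shows (given the decay_rate = 0.95 hyperparameter mentioned above) is the standard exponential-decay rule, sketched below; the symbols $\eta_0$ (initial rate) and decay_steps are my labels, not the post's:

```latex
\eta_t = \eta_0 \cdot \text{decay\_rate}^{\,t / \text{decay\_steps}}
```

With decay_rate = 0.95, the learning rate shrinks by 5% every decay_steps steps.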

       The figure above shows the final training result. The effect is not particularly good, so I tested whether accuracy improves when the learning rate is gradually reduced. Two experiments were run: in the first, the learning rate was reduced twice, each time by a factor of 10; in the second, it was reduced twice, each time by a factor of 5.
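A minimal sketch of the stepwise schedule used in both experiments: divide the learning rate by a fixed factor once each milestone epoch is reached. The base rate 2e-4 and the milestone epochs here are illustrative assumptions, since the post does not state its exact values:

```python
import math

def stepped_lr(base_lr, epoch, milestones, factor):
    """Learning rate at `epoch` when the rate is divided by `factor`
    at each epoch listed in `milestones` (1-indexed epochs)."""
    lr = base_lr
    for m in milestones:
        if epoch >= m:
            lr /= factor
    return lr

# Experiment 1: divide by 10 at epochs 4 and 5.
lr_exp1 = stepped_lr(2e-4, 5, milestones=[4, 5], factor=10)   # 2e-4 / 100 = 2e-6
# Experiment 2: divide by 5 at epochs 4 and 6.
lr_exp2 = stepped_lr(2e-4, 7, milestones=[4, 6], factor=5)    # 2e-4 / 25 = 8e-6
print(lr_exp1, lr_exp2)
```

This makes it easy to compare the two schedules epoch by epoch before committing to a full training run.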

 The figure above shows the loss when the learning rate is left unchanged.

 The figure above shows the loss when the learning rate is divided by 10 at the fourth and fifth epochs. The loss drops slightly.

 The three figures above show the loss when the learning rate is divided by 5 at the fourth and sixth epochs. The loss drops considerably, and the final accuracy improves by about 10%. Note that this run trained for 7 epochs, though by the fifth epoch the learning rate was already much lower than in the ten-fold experiment.

 Finally, here is the method for changing the learning rate, using PyTorch. The screenshot is not the exact code of this experiment; only the parameters inside need to be changed.
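Since the screenshot is not reproduced here, below is a minimal sketch of the two usual ways to lower the learning rate in PyTorch. The model, optimizer, base rate, milestones, and factors are illustrative placeholders, not the post's actual values:

```python
import torch

# Placeholder model and optimizer; substitute the DCGAN networks here.
model = torch.nn.Linear(4, 2)
opt = torch.optim.Adam(model.parameters(), lr=2e-4)

# Option 1: a scheduler that divides the rate by 10 after epochs 4 and 5.
sched = torch.optim.lr_scheduler.MultiStepLR(opt, milestones=[4, 5], gamma=0.1)
for epoch in range(5):
    # ... one epoch of training goes here ...
    sched.step()  # rate becomes 2e-5 after epoch 4, 2e-6 after epoch 5

# Option 2: set the rate by hand on the optimizer's parameter groups.
for group in opt.param_groups:
    group["lr"] = group["lr"] / 5
```

Option 1 keeps the schedule declarative; option 2 is what you would use to shrink the rate ad hoc mid-training, as this post does.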



Origin blog.csdn.net/qq_45710342/article/details/121658709