PyTorch Learning (II): Learning Rate Adjustment

PyTorch offers six learning rate adjustment methods, which fall into three categories:

1. Ordered adjustment;
2. Adaptive adjustment;
3. Custom adjustment.

The first category adjusts the learning rate on a fixed, predetermined schedule; this is the most common type. It includes interval-based decay (StepLR), decay at user-specified milestones (MultiStepLR), exponential decay (ExponentialLR), and cosine annealing (CosineAnnealingLR). With these four methods the timing of every adjustment is controlled by the user in advance, and they are the ones most often used in training.
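As a minimal sketch of this first category, the snippet below drives a StepLR schedule through 90 epochs (the model, optimizer, and hyperparameter values are hypothetical choices for illustration); the other three schedulers are constructed the same way:

```python
import torch
from torch import nn, optim

# Hypothetical tiny model, just so the optimizer has parameters to manage.
model = nn.Linear(10, 2)
optimizer = optim.SGD(model.parameters(), lr=0.1)

# StepLR: multiply the learning rate by `gamma` every `step_size` epochs.
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

lrs = []
for epoch in range(90):
    lrs.append(optimizer.param_groups[0]["lr"])  # record lr used this epoch
    optimizer.step()      # placeholder for the real training step
    scheduler.step()      # advance the schedule once per epoch

# lr is 0.1 for epochs 0-29, 0.01 for 30-59, 0.001 for 60-89.

# The other schedulers in this category follow the same step() pattern:
#   optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
#   optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95)
#   optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50, eta_min=0.0)
```

Note that `scheduler.step()` is called once per epoch, after the optimizer step, and the current value can always be read back from `optimizer.param_groups`.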
The second category adjusts the learning rate in response to the state of training; this is the ReduceLROnPlateau method. It monitors a chosen metric and lowers the learning rate when that metric stops improving, so the adjustment is adaptive rather than scheduled in advance.
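A small sketch of the adaptive category, using made-up validation losses to stand in for a real evaluation loop (the factor and patience values are arbitrary choices for illustration):

```python
import torch
from torch import nn, optim

model = nn.Linear(4, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Cut the lr by `factor` when the monitored metric has failed to improve
# for more than `patience` consecutive epochs.
scheduler = optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=2)

# Simulated validation losses that plateau after epoch 2 (made-up numbers).
val_losses = [1.0, 0.8, 0.7, 0.7, 0.7, 0.7, 0.7]
lrs = []
for loss in val_losses:
    optimizer.step()        # placeholder for the real training step
    scheduler.step(loss)    # unlike the ordered schedulers, step() takes the metric
    lrs.append(optimizer.param_groups[0]["lr"])

# The lr stays at 0.1 until the loss has stalled for 3 epochs, then drops to 0.01.
```

The key difference from the first category is that `scheduler.step()` must be passed the monitored value; the schedule itself never knows the epoch number.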
The third category is custom adjustment via LambdaLR. The Lambda approach is extremely flexible: we can assign a different adjustment function to each parameter group, which is especially useful for fine-tuning. Not only can different layers be given different learning rates, each can also follow its own learning rate adjustment strategy.
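The fine-tuning scenario described above might be sketched as follows; the "backbone"/"head" split and all the numbers are hypothetical, chosen only to show one lambda per parameter group:

```python
import torch
from torch import nn, optim

# Hypothetical two-part model: a pretrained "backbone" and a freshly added "head".
backbone = nn.Linear(8, 8)
head = nn.Linear(8, 2)

# Separate parameter groups so each part gets its own lr and its own schedule.
optimizer = optim.SGD([
    {"params": backbone.parameters(), "lr": 0.001},  # small lr for pretrained layers
    {"params": head.parameters(),     "lr": 0.01},   # larger lr for the new head
])

# One lambda per group: each lambda maps the epoch index to a multiplier
# applied to that group's initial lr.
scheduler = optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=[
        lambda epoch: 1.0,            # backbone: keep its lr constant
        lambda epoch: 0.95 ** epoch,  # head: exponential decay
    ],
)

for epoch in range(3):
    optimizer.step()
    scheduler.step()

backbone_lr = optimizer.param_groups[0]["lr"]  # still 0.001
head_lr = optimizer.param_groups[1]["lr"]      # 0.01 * 0.95 ** 3
```

Because the lambdas are arbitrary Python functions, any schedule expressible as a function of the epoch (warmup, piecewise, etc.) can be attached per layer group this way.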

Origin blog.csdn.net/j879159541/article/details/91958280