Keras learning rate

http://machinelearningmastery.com/using-learning-rate-schedules-deep-learning-models-python-keras/


https://stackoverflow.com/questions/39779710/setting-up-a-learningratescheduler-in-keras    (code that prints the learning rate lr at each epoch)


https://github.com/fchollet/keras/issues/898

Set the learning rate directly with a command:

import keras.backend as K
from keras.optimizers import SGD

sgd = SGD(lr=0.1, decay=0, momentum=0.9, nesterov=True)
# Halve the optimizer's current learning rate in place.
K.set_value(sgd.lr, 0.5 * K.get_value(sgd.lr))
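
A minimal sketch of how this direct approach might be used between two fit calls; model, x_train and y_train are hypothetical names assumed to exist:

model.compile(optimizer=sgd, loss='categorical_crossentropy')
model.fit(x_train, y_train, epochs=5)

# Halve the learning rate, then resume training with the same model state.
K.set_value(model.optimizer.lr, 0.5 * K.get_value(model.optimizer.lr))
model.fit(x_train, y_train, epochs=5)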


Or write a callback class:

import keras.backend as K
from keras.callbacks import Callback

class decay_lr(Callback):
    '''
        n_epoch: decay the learning rate every n_epoch epochs.
        decay:   multiplicative decay factor.
    '''
    def __init__(self, n_epoch, decay):
        super(decay_lr, self).__init__()
        self.n_epoch = n_epoch
        self.decay = decay

    def on_epoch_begin(self, epoch, logs=None):
        old_lr = K.get_value(self.model.optimizer.lr)
        if epoch > 1 and epoch % self.n_epoch == 0:
            # Multiply the current learning rate by the decay factor.
            K.set_value(self.model.optimizer.lr, self.decay * old_lr)



decaySchedule = decay_lr(10, 0.95)
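
The callback is then passed to fit like any other Keras callback; a usage sketch, where model and the training arrays are placeholder names:

model.fit(x_train, y_train,
          epochs=100, batch_size=32,
          callbacks=[decaySchedule])  # lr is multiplied by 0.95 every 10 epochs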


With the TF backend, I did this (for Inception-V3):

import keras.backend as K
from keras.callbacks import LearningRateScheduler

def scheduler(epoch):
    # Every second epoch (after the first), multiply the learning rate by 0.9.
    if epoch % 2 == 0 and epoch != 0:
        lr = K.get_value(model.optimizer.lr)
        K.set_value(model.optimizer.lr, lr * 0.9)
        print("lr changed to {}".format(lr * 0.9))
    return K.get_value(model.optimizer.lr)

lr_decay = LearningRateScheduler(scheduler)

model.fit_generator(train_gen, (nb_train_samples // batch_size) * batch_size,
                    nb_epoch=100, verbose=1,
                    validation_data=valid_gen, nb_val_samples=val_size,
                    callbacks=[lr_decay])


All of the above schedule the learning rate per epoch. Note, however, that every minibatch counts as one update (e.g. a call to model.train_on_batch()), and each update increments the optimizer's iteration counter by 1. It is this iteration count, not the epoch count, that the optimizer's decay argument acts on: an epoch contains as many iterations as it has minibatches.
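
For reference, the classic Keras SGD implementation applies its decay per update as lr_t = lr0 / (1 + decay * iterations). A small sketch to illustrate how quickly this differs from per-epoch decay; the batch size and sample count below are made-up numbers:

def effective_lr(lr0, decay, iterations):
    # Learning rate actually used at a given update, per the SGD decay formula.
    return lr0 / (1.0 + decay * iterations)

# With batch_size=32 and 3200 training samples, one epoch = 100 iterations,
# so after 10 epochs the decay has already been applied 1000 times.
print(effective_lr(0.1, 1e-4, 1000))  # ~0.0909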



Reprinted from blog.csdn.net/xiaojiajia007/article/details/79265867