keras.callbacks.ModelCheckpoint(filepath, monitor='val_loss', verbose=0, save_best_only=False, save_weights_only=False, mode='auto', period=1)
Saves the model to filepath after every epoch.
Parameters:
- filepath: path to which the model is saved.
- monitor: the quantity to monitor, e.g. val_acc or val_loss.
- verbose: verbosity mode, 0 or 1; 0 prints nothing, 1 prints a message each time a checkpoint is saved.
- save_best_only: if True, the model is saved only when the monitored quantity improves on the validation set.
- mode: one of {auto, min, max}. When save_best_only=True, the decision to overwrite the saved file depends on whether the monitored quantity should be maximized or minimized. For val_acc the mode should be max; for val_loss, min. In auto mode, the direction is inferred automatically from the name of the monitored quantity.
- save_weights_only: if True, only the model's weights are saved (model.save_weights(filepath)); otherwise the whole model is saved (model.save(filepath)).
- period: the interval between checkpoints, in epochs.
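To make the interaction between save_best_only and mode concrete, here is a minimal plain-Python sketch (not the actual Keras source) of how the callback decides whether a new monitored value beats the best seen so far, including how auto mode infers the direction from the metric name:

```python
import math

def make_monitor_op(mode, monitor):
    """Return (comparison, initial_best) deciding whether a new monitored
    value is an improvement. A sketch of ModelCheckpoint's mode logic."""
    if mode == 'auto':
        # In auto mode the direction is inferred from the monitored name:
        # accuracy-like metrics are maximized, everything else minimized.
        mode = 'max' if 'acc' in monitor else 'min'
    if mode == 'max':
        return lambda current, best: current > best, -math.inf
    return lambda current, best: current < best, math.inf

# Simulate save_best_only=True with monitor='val_loss': the checkpoint
# file would be overwritten only when val_loss reaches a new minimum.
is_improvement, best = make_monitor_op('auto', 'val_loss')
for current in [0.9, 0.7, 0.8, 0.5]:
    if is_improvement(current, best):
        best = current  # this is where the file would be overwritten
print(best)  # lowest val_loss seen so far
```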
Training process:
1. Import the ModelCheckpoint class from keras.callbacks:
from keras.callbacks import ModelCheckpoint
2. After model.compile, create the callback so that the best weights are saved every epoch (period=1):
checkpoint = keras.callbacks.ModelCheckpoint(filepath, monitor='val_loss', save_weights_only=True, verbose=1, save_best_only=True, period=1)
3. Before calling model.fit, load any previously saved weights:
import os

if os.path.exists(filepath):
    model.load_weights(filepath)
    # If previously saved weights were loaded successfully, report it
    print("checkpoint_loaded")
4. Pass callbacks=[checkpoint] to model.fit (or model.fit_generator) so the callback runs during training:
model.fit_generator(data_generator_wrap(lines[:num_train], batch_size, input_shape, anchors, num_classes),
    steps_per_epoch=max(1, num_train//batch_size),
    validation_data=data_generator_wrap(lines[num_train:], batch_size, input_shape, anchors, num_classes),
    validation_steps=max(1, num_val//batch_size),
    epochs=50,  # set to the desired number of epochs
    initial_epoch=0,
    callbacks=[checkpoint])
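The full save-and-resume flow of steps 1-4 can be simulated without Keras. The sketch below (all names hypothetical; weights are a plain dict, the "model" a stand-in function) checkpoints the best "weights" to a file whenever val_loss improves, mirroring save_best_only=True with mode='min', and reloads them if the file already exists:

```python
import json
import os
import tempfile

# Hypothetical checkpoint path; in the tutorial this is `filepath`.
filepath = os.path.join(tempfile.mkdtemp(), "best_weights.json")

def train_epoch(epoch):
    # Stand-in for one epoch of training: returns new weights and a val_loss.
    losses = [0.9, 0.6, 0.7, 0.4, 0.5]
    return {"w": epoch}, losses[epoch]

weights = {"w": -1}
best = float("inf")

# Step 3: resume from a previous checkpoint if one exists.
if os.path.exists(filepath):
    with open(filepath) as f:
        weights = json.load(f)
    print("checkpoint_loaded")

for epoch in range(5):
    weights, val_loss = train_epoch(epoch)
    if val_loss < best:              # save_best_only=True, mode='min'
        best = val_loss
        with open(filepath, "w") as f:
            json.dump(weights, f)    # analogue of save_weights_only=True

with open(filepath) as f:
    restored = json.load(f)
print(restored)  # weights from the epoch with the lowest val_loss
```

Rerunning the script picks up the saved file and prints "checkpoint_loaded", which is exactly what the os.path.exists guard in step 3 provides around model.fit.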