TensorFlow Usage Notes (IV): Saving Models

Model file

A saved TensorFlow training model consists of two parts: the network structure and the parameter values.

.meta

The .meta file stores the entire graph structure of the model in protocol buffer format, i.e. the information about the operations defined in the model.

 .data & .index

The .data and .index files together make up the ckpt checkpoint, which stores the values of all weights and biases in the network.

The .data file stores the variable values; the .index file records the correspondence between the data in the .data file and the graph structure described by the .meta file.
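For example, after saving a model with the prefix model_test.ckpt (as in the tf.train.Saver example below), the save directory typically contains files like the following (the number of data shards can vary):

checkpoint                              # text file listing the most recent checkpoint paths
model_test.ckpt.meta                    # graph structure
model_test.ckpt.index                   # index mapping variables to their data
model_test.ckpt.data-00000-of-00001     # variable values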

 

To view the tensor information saved in a ckpt model file:

import tensorflow as tf

checkpoint_path = 'cnn_mnist.ckpt'
reader = tf.pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
var_to_shape_map = reader.get_variable_to_shape_map()

# Print tensor name and values
for key in var_to_shape_map:
    print("tensor_name: ", key)
    print(reader.get_tensor(key))
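As an alternative sketch (assuming the same cnn_mnist.ckpt prefix), tf.train.list_variables and the bundled inspect_checkpoint tool can list the saved tensors without using the low-level reader directly:

import tensorflow as tf
from tensorflow.python.tools.inspect_checkpoint import print_tensors_in_checkpoint_file

checkpoint_path = 'cnn_mnist.ckpt'

# List (name, shape) pairs for every variable stored in the checkpoint
for name, shape in tf.train.list_variables(checkpoint_path):
    print(name, shape)

# Print every tensor's name and values with the bundled inspection tool
print_tensors_in_checkpoint_file(checkpoint_path, tensor_name='', all_tensors=True)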

Saving a model: tf.train.Saver

saver = tf.train.Saver()
saver.save(sess, "model_test.ckpt")
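A minimal end-to-end sketch (the variable here is illustrative, not from the original post): build the graph, initialize and train inside a session, then save:

import tensorflow as tf

w = tf.Variable(tf.random_normal([784, 10]), name='w')   # toy variable, illustrative only

saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... training steps would go here ...
    save_path = saver.save(sess, "model_test.ckpt")
    print("Model saved to:", save_path)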

The Saver class constructor is defined as:

def __init__(self,
             var_list=None,                          # sequence or dict of variables to save; default None saves all variables
             reshape=False,
             sharded=False,
             max_to_keep=5,                          # keep at most this many of the most recent checkpoint files
             keep_checkpoint_every_n_hours=10000.0,
             name=None,
             restore_sequentially=False,
             saver_def=None,
             builder=None,
             defer_build=False,
             allow_empty=False,
             write_version=saver_pb2.SaverDef.V2,
             pad_step_number=False,
             save_relative_paths=False,
             filename=None):
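For example (a sketch with hypothetical variable names), var_list restricts which variables are saved and max_to_keep caps the number of retained checkpoints:

import tensorflow as tf

# Save only the listed variables, keeping at most the 3 most recent checkpoints
w = tf.get_variable('w', shape=[784, 10])
b = tf.get_variable('b', shape=[10])
saver = tf.train.Saver(var_list=[w, b], max_to_keep=3)

# var_list may also be a dict mapping names in the checkpoint to variables,
# which helps when restoring into differently named variables
saver_renamed = tf.train.Saver(var_list={'weights': w, 'bias': b})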

The saver.save function is defined as:

def save(self,
         sess,                      # the current session environment 
         save_path,                 # model save path 
         global_step=None,            # training step; if given, the step number is appended to the checkpoint file name
         latest_filename=None,        # name of the checkpoint state file, defaults to 'checkpoint'
         meta_graph_suffix="meta",    # suffix of the saved graph-structure (meta graph) file
         write_meta_graph=True,       # whether to save the graph structure
         write_state=True,
         strip_default_attrs=False,
         save_debug_info=False):
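For example (a sketch; the training loop is only indicated), passing global_step appends the step number to the checkpoint file name, so checkpoints can be written periodically during training:

import tensorflow as tf

w = tf.Variable(0.0, name='w')        # illustrative variable
saver = tf.train.Saver(max_to_keep=5)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(1, 1001):
        # ... run one training step here ...
        if step % 200 == 0:
            # Produces files such as model_test.ckpt-200.meta / .index / .data-*
            saver.save(sess, 'model_test.ckpt', global_step=step)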

Load Model 

When loading a model, you can first load the graph structure from the .meta file and then restore the parameters into that graph (running the restore inside a Session):

saver=tf.train.import_meta_graph('./model_saved/model_test.meta')
saver.restore(sess, tf.train.latest_checkpoint('./model_saved'))
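After importing the graph this way, tensors and operations can be looked up by name in the default graph (the names below are hypothetical and depend on how the original graph was defined):

graph = tf.get_default_graph()

# Look up tensors by the names given when the graph was built (hypothetical names)
x = graph.get_tensor_by_name('input_x:0')
logits = graph.get_tensor_by_name('logits:0')

# Run inference with the restored parameters
# pred = sess.run(logits, feed_dict={x: batch_x})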

Or load everything in one step:

saver = tf.train.Saver()
saver.restore(sess, './model_saved/model_test.ckpt')
# or
saver.restore(sess, tf.train.latest_checkpoint('./model_saved'))
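Note that this second approach requires the graph to be built in code beforehand, with the same variable names as when the model was saved; the ckpt data files alone do not contain the graph structure unless the .meta file is imported.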

 

Source: www.cnblogs.com/xuanyuyt/p/11624078.html