The difference between tensor names and node names
ckpt -> pb
A TensorFlow model can be saved in two forms:
1. ckpt: stores the graph and the variable values, so the model can be restored and training can continue.
2. pb: a serialized graph in which the variables have been turned into constants; it can only be used for inference and cannot be trained further.
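The two forms can be produced as follows. This is a minimal sketch with a toy one-variable graph, not the author's code: the `tf.compat.v1` API is used so it also runs under TensorFlow 2, and the variable, op, and path names are made up for illustration.

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A toy graph: one trainable variable and one named output op.
w = tf.Variable(3.0, name="w")
y = tf.multiply(w, 2.0, name="y")

save_dir = tempfile.mkdtemp()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Form 1: ckpt -- graph structure (.meta) plus variable values;
    # can be restored later to continue training.
    saver = tf.train.Saver()
    ckpt_path = saver.save(sess, os.path.join(save_dir, "model.ckpt"))

    # Form 2: pb -- the variables are frozen into constants;
    # the result can only be used for inference.
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ["y"])
    with tf.io.gfile.GFile(os.path.join(save_dir, "model.pb"), "wb") as f:
        f.write(frozen.SerializeToString())
```

Note that the frozen GraphDef contains no `Variable` nodes at all; that is why it cannot be trained further.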
Demo
import tensorflow as tf
from tensorflow.python.framework import graph_util

def freeze_graph(input_checkpoint, output_graph):
    '''
    :param input_checkpoint: path to the ckpt model
    :param output_graph: path where the pb model is saved
    :return: None
    '''
    # checkpoint = tf.train.get_checkpoint_state(model_folder)  # check whether the directory holds a usable ckpt
    # input_checkpoint = checkpoint.model_checkpoint_path       # get the ckpt file path

    # Specify the output node name; the node must exist in the original model
    output_node_names = "InceptionV3/Logits/SpatialSqueeze"  # separate multiple output nodes with ','

    # Step 1: restore the graph from the ckpt
    saver = tf.train.import_meta_graph(input_checkpoint + '.meta', clear_devices=True)
    graph = tf.get_default_graph()          # get the default graph (may be omitted)
    input_graph_def = graph.as_graph_def()  # serialized GraphDef of the current graph (may be omitted)

    with tf.Session() as sess:  # use the default graph as the current graph
        saver.restore(sess, input_checkpoint)  # restore the graph and its variable values

        # Step 2: freeze -- given the session, the serialized graph, and the
        # output node names, turn the variables into constants
        output_graph_def = graph_util.convert_variables_to_constants(
            sess=sess,
            input_graph_def=input_graph_def,  # equals sess.graph_def
            output_node_names=output_node_names.split(","))  # multiple output nodes are comma-separated

        # Step 3: serialize the frozen model to disk
        with tf.gfile.GFile(output_graph, "wb") as f:  # save the model
            f.write(output_graph_def.SerializeToString())  # serialized output
        print("%d ops in the final graph." % len(output_graph_def.node))  # number of nodes in the final graph

        # for op in graph.get_operations():
        #     print(op.name, op.values())

# Call
input_checkpoint = 'models/model.ckpt-10000'  # path to the ckpt model
out_pb_path = "models/pb/frozen_model.pb"     # output path for the pb model
freeze_graph(input_checkpoint, out_pb_path)   # convert the ckpt into a pb
Explanation
In the freeze_graph function, the most important step is determining the name of the output node. The node must already exist in the original model; what we have to do is decide which node serves as the output and pass its name in.
When freezing, only the subgraph needed to compute the output node is kept; everything unrelated is discarded, because the whole point of freezing the model is to run predictions with it later. So output_node_names is usually the output of the network's last layer, i.e. the prediction target.
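If you are unsure what the output node is called, you can enumerate the graph's operations and inspect the names; candidates for output_node_names sit near the end of the list. A sketch with a made-up two-op graph (`tf.compat.v1` API, illustrative names):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

with tf.Graph().as_default() as g:
    x = tf.placeholder(tf.float32, shape=[None, 4], name="input")
    w = tf.Variable(tf.ones([4, 2]), name="w")
    logits = tf.matmul(x, w, name="logits")
    pred = tf.nn.softmax(logits, name="predictions")

    # Node names in definition order; the last ones are the usual
    # candidates for output_node_names.
    names = [op.name for op in g.get_operations()]
    for name in names:
        print(name)
```

Here `"predictions"` (the node name, with no `:0` suffix) is what would be passed to convert_variables_to_constants.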
When saving the pb, the names of the nodes to be frozen must be passed to the convert_variables_to_constants function;
The difference between tensor names and node names
A node name refers to a node in the graph; a node contains an operation together with its output tensors.
A tensor is one of the outputs of a node;
Take input as an example: "input:0" is the tensor name, while "input" is the node name.
PS: note that a tensor name is the node name + ":" + output index, e.g. "input:0".
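The rule tensor name = node name + ":" + output index can be captured in a small helper. This is plain Python with no TensorFlow dependency; the function name split_tensor_name is made up for illustration.

```python
def split_tensor_name(tensor_name):
    """Split a tensor name like "input:0" into (node name, output index)."""
    node, sep, index = tensor_name.rpartition(":")
    if not sep:
        # A bare node name implicitly refers to its first output, ":0".
        return tensor_name, 0
    return node, int(index)

print(split_tensor_name("input:0"))
print(split_tensor_name("InceptionV3/Logits/SpatialSqueeze:0"))
```

This mirrors why convert_variables_to_constants takes node names ("predictions") while sess.run and get_tensor_by_name take tensor names ("predictions:0").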