TensorFlow selective fine-tuning (restoring only part of a checkpoint)

'''
Several ways to selectively load (restore) weights for fine-tuning.

Note: the slim-based fine-tuning below works best when the network is built from
slim modules (conv, bn, ...) whose variable names match those in the checkpoint.

I have not tested the first method's initializer step. My understanding: you can
also skip selecting the variables that do not need fine-tuning and simply run
tf.global_variables_initializer() (that is how I experimented with the slim modules).
'''
import tensorflow as tf
from tensorflow.contrib import slim

model_path = r'**/model.ckpt'  # path to the pretrained checkpoint

# Method 1: restore selected variables with a tf.train.Saver
sess = tf.Session()
var = tf.global_variables()  # list of all global variables
var_to_restore = [val for val in var if 'conv1' in val.name or 'conv2' in val.name]  # variables to restore for fine-tuning
saver = tf.train.Saver(var_to_restore)
saver.restore(sess, model_path)  # variable names must match those in the checkpoint
# Complement set to initialize: note `and`, not `or` (with `or` every variable would match)
var_to_init = [val for val in var if 'conv1' not in val.name and 'conv2' not in val.name]
sess.run(tf.variables_initializer(var_to_init))  # the initializer op must actually be run
# When saving the model later, create a new Saver covering all variables
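The variable selection above is plain name filtering, and the complement set is where the original code is easy to get wrong: by De Morgan's law, the opposite of `'conv1' in name or 'conv2' in name` needs `and`, not `or`. A minimal pure-Python sketch (hypothetical variable names, no TensorFlow required):

```python
# Hypothetical variable names standing in for tf.global_variables().
names = ['conv1/weights', 'conv2/weights', 'conv3/weights', 'fc/weights']

# Variables to restore: name contains conv1 OR conv2.
to_restore = [n for n in names if 'conv1' in n or 'conv2' in n]

# The complement (variables to initialize) needs AND (De Morgan's law);
# with `or` here, every variable would match, since no name contains both.
to_init = [n for n in names if 'conv1' not in n and 'conv2' not in n]

print(to_restore)  # ['conv1/weights', 'conv2/weights']
print(to_init)     # ['conv3/weights', 'fc/weights']
```

Together the two lists partition all variables, so nothing is left uninitialized.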

# Method 2: let slim compute the variables to restore
sess = tf.Session()
exclude = ['conv1', 'conv2']  # use the actual variable-scope names to exclude
variables_to_restore = slim.get_variables_to_restore(exclude=exclude)
saver = tf.train.Saver(variables_to_restore)
saver.restore(sess, model_path)
# When saving the model later, create a new Saver covering all variables
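`slim.get_variables_to_restore(exclude=...)` keeps the variables whose names do not start with any of the excluded scopes. A rough stand-in for that behavior (illustrative names only, not slim's actual implementation):

```python
# Illustrative stand-in for slim.get_variables_to_restore(exclude=...):
# keep variables whose name does not start with any excluded scope prefix.
def variables_to_restore(names, exclude):
    return [n for n in names
            if not any(n.startswith(scope) for scope in exclude)]

names = ['conv1/weights', 'conv1/biases', 'fc8/weights', 'global_step']
print(variables_to_restore(names, exclude=['conv1', 'global_step']))
# ['fc8/weights']
```

Because matching is by scope prefix, excluding `'conv1'` drops every variable under that scope (weights, biases, batch-norm statistics) in one go.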

# Method 3: slim.assign_from_checkpoint_fn
sess = tf.Session()
exclude = ['weight1', 'weight2']
variables_to_restore = slim.get_variables_to_restore(exclude=exclude)
init_fn = slim.assign_from_checkpoint_fn(model_path, variables_to_restore)

init_fn(sess)
Full implementation: https://blog.csdn.net/leilei18a/article/details/80189917
'''
Personal take: to fine-tune only the leading (contiguous) layers, build the network
first, then right after the last layer to be fine-tuned add
init_fn = slim.assign_from_checkpoint_fn(model_path, slim.get_model_variables('model_name'))  # model_name: e.g. resnet_v2_152
Then, after running tf.global_variables_initializer, call init_fn(sess); only the
weights of those leading layers are restored from the checkpoint. (You can also
place it at the very end and just select what to load; everything else is
initialized automatically.)
Someone pointed out to me: wherever init_fn is created, the variables defined up to
that point are matched by name against get_model_variables, so keep the variable
names consistent with the checkpoint.
'''
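The ordering matters: run the global initializer first, then `init_fn`, so the restore overwrites the freshly initialized values only for the selected layers. A toy simulation of that init-then-restore sequence (plain dicts, not TensorFlow):

```python
# Toy simulation of init-then-restore ordering (plain dicts, not TF).
checkpoint = {'conv1/w': 0.5, 'conv2/w': 0.7}                  # pretrained values
variables  = {'conv1/w': None, 'conv2/w': None, 'fc/w': None}  # graph variables

# Step 1: the global initializer gives every variable a fresh value.
for name in variables:
    variables[name] = 0.0

# Step 2: init_fn restores only the selected layers by name,
# overwriting their initialized values with the checkpoint values.
for name, value in checkpoint.items():
    if name in variables:
        variables[name] = value

print(variables)  # {'conv1/w': 0.5, 'conv2/w': 0.7, 'fc/w': 0.0}
```

Running the two steps in the opposite order would wipe out the restored weights, which is why `init_fn(sess)` must come after `tf.global_variables_initializer()` has run.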


Reposted from blog.csdn.net/leilei18a/article/details/80215319