PyTorch: loading some parameters from a previously trained model and freezing some parameters (tested in real project code)

My situation: because I keep trying out different models, the model's building blocks keep changing, and restarting training from scratch every time costs a lot of time.

The model I ran before (the three ResNet branches shared their parameters):

                              ResNet ->
                              ResNet -> Intermediate Module -> Result
                              ResNet ->


Now I want to switch to three ResNets without parameter sharing and retrain. I want to import the intermediate module's parameters from the previous model,

                              ResNet 1 ->
                              ResNet 2 -> Intermediate Module -> Result
                              ResNet 3 ->

and freeze the intermediate module's parameters so that training runs faster.
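
For concreteness, here is a minimal sketch of what the new network might look like. The names MY_Net, ResNet1/ResNet2/ResNet3, and intermediate are assumptions chosen to match the code below, not the actual project code:

import torch
import torch.nn as nn
import torchvision.models as models

class MY_Net(nn.Module):
    # Hypothetical sketch: three ResNet branches with separate (non-shared)
    # parameters feeding one intermediate module whose pretrained parameters
    # will be reused and frozen.
    def __init__(self, num_outputs=10):
        super().__init__()
        self.ResNet1 = models.resnet18(num_classes=512)
        self.ResNet2 = models.resnet18(num_classes=512)
        self.ResNet3 = models.resnet18(num_classes=512)
        self.intermediate = nn.Sequential(
            nn.Linear(512 * 3, 256),
            nn.ReLU(),
            nn.Linear(256, num_outputs),
        )

    def forward(self, x1, x2, x3):
        # concatenate the three branch features, then apply the shared intermediate module
        feats = torch.cat([self.ResNet1(x1), self.ResNet2(x2), self.ResNet3(x3)], dim=1)
        return self.intermediate(feats)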

Two blog posts by two experts were used as reference: loading partial parameters, https://blog.csdn.net/weixin_41519463/article/details/101604662 , and freezing partial parameters, https://blog.csdn.net/jdzwanghao/article/details/83239111 .

The specific code is as follows:

import torch
import torch.optim as optim

net = MY_Net()

##### Load partial parameters
model_dict = net.state_dict()
for k, v in model_dict.items():
    print(k)  # inspect the parameter names of the new model

# model_file1 is the save path of the previous model; here we only load its parameters
pretrained_dict = torch.load(model_file1)
for k, v in pretrained_dict.items():
    print(k)  # inspect the parameter names of the previous model
# keep only the entries whose names also appear in the new model
pretrained_dict = {k: v for k, v in pretrained_dict.items() if k in model_dict}

model_dict.update(pretrained_dict)  # overwrite the matching parameters of the new model with the pretrained values

net.load_state_dict(model_dict)  # load the updated state dict back into the new model
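
As an aside, load_state_dict also accepts strict=False, which skips keys that are missing from or unexpected in the target model, so the name-filtering above can often be collapsed into a one-liner:

net.load_state_dict(torch.load(model_file1), strict=False)  # ignore keys that do not match the new model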


##### Freeze partial parameters
for param in net.parameters():
    param.requires_grad = False  # first make every parameter non-trainable; selectively re-enable below
for param in net.ResNet1.parameters():
    param.requires_grad = True
for param in net.ResNet2.parameters():
    param.requires_grad = True
for param in net.ResNet3.parameters():
    param.requires_grad = True

# the key point: filter keeps only the trainable parameters, so the optimizer never updates the frozen ones
optimizer = optim.SGD(filter(lambda p: p.requires_grad, net.parameters()), lr=0.0001, momentum=0.90, weight_decay=0.0005)
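
To confirm the freeze worked before training starts, it can help to list the parameters that remain trainable, for example:

# sanity check: only the three ResNet branches should show up here
trainable = [name for name, p in net.named_parameters() if p.requires_grad]
print(len(trainable), 'trainable tensors')
for name in trainable:
    print(name)

One caveat: requires_grad = False stops gradient updates, but it does not stop any BatchNorm layers inside the frozen intermediate module from updating their running statistics during training; if that module contains BatchNorm, also put it in eval() mode while training to keep those statistics fixed.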

