Winter Vacation PyTorch Tools: Day 8

Course Record

From the concept of an optimizer to the various optimizers in torch.optim


Course Code

Omitted.
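Since the course code is omitted in the original post, here is a rough sketch of what the lesson covers: the basic zero_grad / backward / step cycle, and how the various optimizers in torch.optim are constructed interchangeably. The model, data, and hyperparameters below are made-up placeholders, not the course's actual code.

import torch
import torch.optim as optim

# placeholder model and random data, only to drive the training loop
model = torch.nn.Linear(4, 1)
x, y = torch.randn(8, 4), torch.randn(8, 1)

# SGD is used below; the other torch.optim optimizers are built the same way
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
# optimizer = optim.Adam(model.parameters(), lr=0.001)
# optimizer = optim.RMSprop(model.parameters(), lr=0.01)

for _ in range(5):
    loss = torch.nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()  # clear the gradients accumulated by the previous step
    loss.backward()        # populate .grad on every parameter the optimizer manages
    optimizer.step()       # update the parameters from their .grad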


Homework

1. The function of an optimizer is to manage and update parameter groups. Build an SGD optimizer and add three groups of parameters through the add_param_group method, with learning rates of 0.01, 0.02, and 0.03 and momentums of 0.9, 0.8, and 0.7. After construction, print the key and value of each element of the optimizer's param_groups attribute (hint: param_groups is a list in which each element is a dictionary).

1. An SGD optimizer with a separate lr and momentum per parameter group

import torch
import torch.optim as optim


torch.manual_seed(1234)


# three independent weight tensors, one per parameter group
w1 = torch.randn((2, 2), requires_grad=True)
w2 = torch.randn((2, 2), requires_grad=True)
w3 = torch.randn((2, 2), requires_grad=True)
w1.grad = torch.ones((2, 2))  # set w1's gradient by hand; w2 and w3 keep grad None
print(w1.grad, w2.grad, w3.grad)

# build the optimizer with one group, then add two more via add_param_group,
# each with its own lr and momentum as the assignment specifies
optimizer = optim.SGD([w1], lr=0.01, momentum=0.9)
optimizer.add_param_group({"params": [w2], 'lr': 0.02, 'momentum': 0.8})
optimizer.add_param_group({"params": [w3], 'lr': 0.03, 'momentum': 0.7})

# param_groups is a list and each element is a dict, so print it key by key
for i, group in enumerate(optimizer.param_groups):
    print("param_group {}:".format(i))
    for key, value in group.items():
        print("  {}: {}".format(key, value))

optimizer.step()
print(w1, w2, w3)
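After step(), only w1 actually changes: w2 and w3 still have grad equal to None, and SGD skips parameters without gradients. The printed groups also show the keys that were not passed to add_param_group (dampening, weight_decay, nesterov) filled in from the defaults established by the constructor call.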


Origin blog.csdn.net/u013625492/article/details/114238792