PyTorch Gradient Clipping

Copyright notice: reproduction permitted. Source: https://blog.csdn.net/Z609834342/article/details/84035179
Gradient clipping goes after loss.backward(), which computes the gradients, and before optimizer.step(), which applies them:

optimizer.zero_grad()                        # clear gradients left over from the previous step
loss, hidden = model(data, hidden, targets)  # forward pass
loss.backward()                              # backpropagate to compute gradients

# Rescale the gradients in place so their global L2 norm is at most args.clip.
# clip_grad_norm (no trailing underscore) is deprecated; use clip_grad_norm_.
torch.nn.utils.clip_grad_norm_(model.parameters(), args.clip)
optimizer.step()                             # apply the update with the clipped gradients
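Below is a minimal, self-contained sketch of the same pattern for anyone who wants to run it end to end. The tiny linear model, random data, learning rate, and the max_norm threshold of 1.0 are all made up for illustration; only the clip_grad_norm_ call itself comes from the snippet above.

import torch
import torch.nn as nn

# Hypothetical setup for illustration: a tiny linear regression on random data.
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(32, 10)  # fake input batch
y = torch.randn(32, 1)   # fake targets

for step in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    # Clip the global gradient norm to 1.0 (an assumed threshold).
    # The call returns the total norm measured before clipping,
    # which is handy for checking whether clipping actually fires.
    total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    print(f"step {step}: loss={float(loss):.4f}, grad norm={float(total_norm):.4f}")

PyTorch also provides torch.nn.utils.clip_grad_value_(parameters, clip_value), which clamps each gradient element to the range [-clip_value, clip_value] instead of rescaling the whole gradient vector by its norm.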
