The function F.normalize in PyTorch rescales a vector to unit length. For example, given the vector:
x = [1, 2, 3, 4]
the normalization procedure first computes the L2 norm of the vector, then divides each component by that norm.
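The steps above can be checked by hand in plain Python (a quick sketch; sqrt(1 + 4 + 9 + 16) = sqrt(30)):

```python
import math

x = [1, 2, 3, 4]
# L2 norm: sqrt(1^2 + 2^2 + 3^2 + 4^2) = sqrt(30) ≈ 5.4772
norm = math.sqrt(sum(v * v for v in x))
# Divide every component by the norm
unit = [v / norm for v in x]
# unit ≈ [0.1826, 0.3651, 0.5477, 0.7303], and its own L2 norm is 1
```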
import torch.nn.functional as F
import torch
x = torch.randn(1,3)
print(x)
z = F.normalize(x)
print(z)
The running result is (only the first print is shown; the input is random, so your numbers will differ):
tensor([[-1.0407, -1.1139, -0.9541]])
F.normalize defaults to p=2 and dim=1, so the second print shows this row divided by its own L2 norm, giving z unit norm along dim=1.
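Since randn gives different numbers on every run, here is a deterministic sketch with an input chosen as an assumption (a scaled 3-4-5 triangle, so the row norm is exactly 5):

```python
import torch
import torch.nn.functional as F

# Row [3, 4] has L2 norm sqrt(9 + 16) = 5
x = torch.tensor([[3.0, 4.0]])
z = F.normalize(x)   # defaults: p=2, dim=1 (normalize each row)
print(z)             # tensor([[0.6000, 0.8000]])
```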
Given a vector x, x.norm() returns its L2 norm:
import torch
x = torch.tensor([1.0,2.0,3.0])
print(x.norm())
The output is:
tensor(3.7417)
which is sqrt(1² + 2² + 3²) = sqrt(14) ≈ 3.7417.
So to rescale a vector to unit length, you can either call F.normalize(x) directly or compute x = x / x.norm() yourself.
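A minimal sketch checking that the two approaches agree. Note that for a 2-D tensor the manual division needs dim=1 and keepdim=True so each row is divided by its own norm, while for a 1-D vector F.normalize needs dim=0:

```python
import torch
import torch.nn.functional as F

# 2-D case: normalize each row
x = torch.randn(4, 3)
z1 = F.normalize(x)                     # unit L2 norm along dim=1
z2 = x / x.norm(dim=1, keepdim=True)    # manual equivalent
print(torch.allclose(z1, z2))           # True

# 1-D case: x / x.norm() works directly, but F.normalize needs dim=0
v = torch.tensor([1.0, 2.0, 3.0, 4.0])
print(torch.allclose(F.normalize(v, dim=0), v / v.norm()))  # True
```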