Deep learning (23): SmoothL1Loss loss function

0. Basic introduction

SmoothL1Loss is a commonly used loss function, typically applied in regression tasks. Its advantage over the mean squared error (MSE) loss is that it penalizes outliers (predictions that are far too large or too small) less heavily, which makes the model more robust.

The formula for SmoothL1Loss is:

$$loss(x,y) = \begin{cases} 0.5(x-y)^2 & \text{if } |x-y| < 1 \\ |x-y| - 0.5 & \text{otherwise} \end{cases}$$

where x and y are the model output and the label, respectively, and |x − y| is the absolute difference between them. When |x − y| is less than 1, a squared error is used; otherwise a linear error is used. This makes SmoothL1Loss more robust than MSE, in the sense that it responds more gently to outliers.

In PyTorch, you can use the nn.SmoothL1Loss() module to build this loss function.
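As a quick sanity check, the built-in module can be compared against the piecewise formula above (a minimal sketch; `reduction='none'` keeps the element-wise losses):

```python
import torch
import torch.nn as nn

# Element-wise SmoothL1Loss (no reduction); default critical point at |x - y| = 1
loss_fn = nn.SmoothL1Loss(reduction='none')

x = torch.tensor([0.2, -0.5, 1.5, -3.0])  # model outputs
y = torch.zeros(4)                         # labels

# Manual piecewise formula: 0.5*d^2 if |d| < 1, else |d| - 0.5
d = x - y
manual = torch.where(d.abs() < 1, 0.5 * d ** 2, d.abs() - 0.5)

print(loss_fn(x, y))  # tensor([0.0200, 0.1250, 1.0000, 2.5000])
print(manual)         # identical values
```

Note how the two outliers (1.5 and −3.0) fall on the linear branch, so their losses grow only linearly rather than quadratically as they would under MSE.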

1. Plotting the SmoothL1Loss function

By changing the arguments to torch.linspace, you can change the x-axis range of the plot; by changing the arguments to torch.zeros, you can change the height and shape of the curve.

import torch.nn as nn
import matplotlib.pyplot as plt
import torch

# Define the loss function and inputs
smooth_l1_loss = nn.SmoothL1Loss(reduction='none')
x = torch.linspace(-1, 1, 10000)
y = smooth_l1_loss(torch.zeros(10000), x)

# x2 = 1e3*x
# y2 = 1e-3*smooth_l1_loss(torch.zeros(10000), x2)

# Plot the curve
plt.plot(x, y)
# plt.plot(x, y2)
plt.xlabel('x')
plt.ylabel('SmoothL1Loss')
plt.title('SmoothL1Loss Function')
plt.show()

(Figure: the SmoothL1Loss curve on x ∈ [−1, 1], quadratic near zero.)

2. Moving the critical point of the SmoothL1Loss formula

Moving the critical point amplifies the loss for small errors without resorting to any other changes. The trick: scale the inputs by a factor k and scale the resulting loss by 1/k, which moves the critical point from |x − y| = 1 to |x − y| = 1/k.

Move the critical point to 0.1

import torch.nn as nn
import matplotlib.pyplot as plt
import torch

# Define the loss function and inputs
smooth_l1_loss = nn.SmoothL1Loss(reduction='none')
x = torch.linspace(-1, 1, 10000)
y = smooth_l1_loss(torch.zeros(10000), x)

x2 = 1e1*x
y2 = 1e-1*smooth_l1_loss(torch.zeros(10000), x2)

# Plot both curves
plt.plot(x, y)
plt.plot(x, y2)
plt.xlabel('x')
plt.ylabel('SmoothL1Loss')
plt.title('SmoothL1Loss Function')
plt.show()
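The effect of this rescaling can be verified numerically: with k = 10, the scaled curve should equal a piecewise function whose critical point is at |x| = 0.1 (a minimal sketch of the check):

```python
import torch
import torch.nn as nn

# With k = 10, (1/k) * SmoothL1Loss(0, k*x) has its critical point at |x| = 0.1:
#   |x| < 0.1:  (1/10) * 0.5 * (10x)^2 = 5 * x^2
#   otherwise:  (1/10) * (|10x| - 0.5) = |x| - 0.05
loss_none = nn.SmoothL1Loss(reduction='none')
x = torch.linspace(-1, 1, 1001)
scaled = 1e-1 * loss_none(torch.zeros_like(x), 1e1 * x)

# Manual piecewise form with the moved critical point
manual = torch.where(x.abs() < 0.1, 5 * x ** 2, x.abs() - 0.05)
print(torch.allclose(scaled, manual))  # True
```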

As shown below, the bend at the arrow marks the new critical point.
(Figure: both curves; the rescaled curve bends at |x| = 0.1.)

Move the critical point to 0.01

As shown below, the bend can again be seen at the arrow.

import torch.nn as nn
import matplotlib.pyplot as plt
import torch

# Define the loss function and inputs
smooth_l1_loss = nn.SmoothL1Loss(reduction='none')
x = torch.linspace(-1, 1, 10000)
y = smooth_l1_loss(torch.zeros(10000), x)

x2 = 1e2*x
y2 = 1e-2*smooth_l1_loss(torch.zeros(10000), x2)

# Plot both curves
plt.plot(x, y)
plt.plot(x, y2)
plt.xlabel('x')
plt.ylabel('SmoothL1Loss')
plt.title('SmoothL1Loss Function')
plt.show()
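Instead of rescaling by hand, recent PyTorch versions (1.7 and later) expose a beta argument on nn.SmoothL1Loss that sets the critical point directly. A sketch comparing the two approaches for a critical point of 0.01:

```python
import torch
import torch.nn as nn

x = torch.linspace(-1, 1, 1001)
zeros = torch.zeros_like(x)

# Rescaling trick from above: critical point moved to |x| = 0.01
scaled = 1e-2 * nn.SmoothL1Loss(reduction='none')(zeros, 1e2 * x)

# beta sets the critical point directly (PyTorch 1.7+):
# loss = 0.5*d^2/beta if |d| < beta, else |d| - 0.5*beta
direct = nn.SmoothL1Loss(reduction='none', beta=0.01)(zeros, x)

print(torch.allclose(scaled, direct))  # the two curves coincide
```

Using beta keeps the code self-documenting and avoids having to remember to rescale both the inputs and the loss consistently.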

(Figure: both curves; the rescaled curve bends at |x| = 0.01.)

Origin blog.csdn.net/BIT_HXZ/article/details/130458821