Fitting a Straight Line by Least Squares
Generate sample points
First, we generate random points that are normally distributed around the line y = 3 + 5x, and use them as sample points for fitting.
```python
import numpy as np
import matplotlib.pyplot as plt

# Generate random points near the line y = 3 + 5x
X = np.arange(0, 5, 0.1)
Z = [3 + 5 * x for x in X]
Y = [np.random.normal(z, 0.5) for z in Z]

plt.plot(X, Y, 'ro')
plt.show()
```
The sample points are shown in the figure:
Fit a straight line
Suppose the line is y = a0 + a1*x. We use the normal equations of the least squares method to solve for the unknown coefficients a0 and a1.
NumPy's linalg module provides a solve function, which computes the unknowns given the coefficient matrix of the equation system and the vector formed from the right-hand sides of the equations.
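As a quick illustration (a made-up 2x2 system, not from the article), np.linalg.solve takes the coefficient matrix and the right-hand-side vector and returns the vector of unknowns:

```python
import numpy as np

# Solve the system: 3x + y = 9, x + 2y = 8 (solution: x = 2, y = 3)
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # coefficient matrix
b = np.array([9.0, 8.0])                # right-hand side
solution = np.linalg.solve(A, b)
print(solution)  # → [2. 3.]
```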
```python
def linear_regression(x, y):
    # Normal equations for y = a0 + a1*x:
    #   [ N       sum(x)   ] [a0]   [ sum(y)   ]
    #   [ sum(x)  sum(x^2) ] [a1] = [ sum(x*y) ]
    N = len(x)
    sumx = sum(x)
    sumy = sum(y)
    sumx2 = sum(x ** 2)
    sumxy = sum(x * y)
    A = np.array([[N, sumx], [sumx, sumx2]])
    b = np.array([sumy, sumxy])
    return np.linalg.solve(A, b)

a0, a1 = linear_regression(X, Y)
```
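As a sanity check (not part of the original article), the same line can be fitted with NumPy's built-in np.polyfit, which performs a least-squares polynomial fit of a given degree; both approaches minimize the same residual and should agree up to floating-point error. The data generation is repeated here with a fixed seed so the block is self-contained:

```python
import numpy as np

# Synthetic data around y = 3 + 5x, mirroring the sample generation above
rng = np.random.default_rng(0)
X = np.arange(0, 5, 0.1)
Y = 3 + 5 * X + rng.normal(0, 0.5, size=X.shape)

# Hand-rolled normal equations, same as linear_regression above
N = len(X)
A = np.array([[N, X.sum()], [X.sum(), (X ** 2).sum()]])
b = np.array([Y.sum(), (X * Y).sum()])
a0, a1 = np.linalg.solve(A, b)

# np.polyfit returns coefficients with the highest degree first
p1, p0 = np.polyfit(X, Y, 1)
print(np.allclose([a0, a1], [p0, p1]))  # → True
```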
Draw the straight line
At this point we have the coefficients a0 and a1 of the fitted line. Next, we draw the line and compare it with the sample points.
```python
# Generate points for drawing the fitted line
_X = [0, 5]
_Y = [a0 + a1 * x for x in _X]

plt.plot(X, Y, 'ro', _X, _Y, 'b', linewidth=2)
plt.title("y = {:.4f} + {:.4f}x".format(a0, a1))
plt.show()
```
The fitting effect is as follows:
Fitting a Curve by Least Squares
Generate sample points
Similar to the straight-line case, we generate random points that are normally distributed around the curve y = 2 + 3x + 4x^2, and use them as sample points for fitting the curve.
```python
import numpy as np
import matplotlib.pyplot as plt

# Generate random points near the curve y = 2 + 3x + 4x^2
X = np.arange(0, 5, 0.1)
Z = [2 + 3 * x + 4 * x ** 2 for x in X]
Y = np.array([np.random.normal(z, 3) for z in Z])

plt.plot(X, Y, 'ro')
plt.show()
```
The sample points are shown in the figure:
Curve fitting
Let the equation of the curve be y = a0 + a1*x + a2*x^2. Again, we solve for the unknowns a0, a1, and a2 via the normal equations.
```python
# Generate the coefficient matrix A
def gen_coefficient_matrix(X, Y):
    m = 3
    A = []
    # Compute the coefficients of each equation
    for i in range(m):
        a = []
        # Compute each coefficient of the current equation
        for j in range(m):
            a.append(sum(X ** (i + j)))
        A.append(a)
    return A

# Compute the right-hand-side vector b
def gen_right_vector(X, Y):
    m = 3
    b = []
    for i in range(m):
        b.append(sum(X ** i * Y))
    return b

A = gen_coefficient_matrix(X, Y)
b = gen_right_vector(X, Y)

a0, a1, a2 = np.linalg.solve(A, b)
```
Draw the curve
Using the coefficients we obtained, we draw the fitted curve and compare it with the sample points.
```python
# Generate points for drawing the fitted curve
_X = np.arange(0, 5, 0.1)
_Y = np.array([a0 + a1 * x + a2 * x ** 2 for x in _X])

plt.plot(X, Y, 'ro', _X, _Y, 'b', linewidth=2)
plt.title("y = {:.4f} + {:.4f}x + {:.4f}$x^2$".format(a0, a1, a2))
plt.show()
```
The fitting effect is as follows:
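The hand-built coefficient matrix above generalizes to any polynomial degree. As a sketch (the function name is my own), the design matrix can be built with np.vander and solved with np.linalg.lstsq, which avoids forming the normal equations explicitly and is numerically more stable:

```python
import numpy as np

def poly_least_squares(x, y, degree):
    """Fit y ≈ a0 + a1*x + ... + a_degree*x^degree by least squares."""
    # Columns of V are 1, x, x^2, ..., x^degree (increasing powers)
    V = np.vander(x, degree + 1, increasing=True)
    coeffs, *_ = np.linalg.lstsq(V, y, rcond=None)
    return coeffs

# On noiseless data the quadratic is recovered exactly
x = np.arange(0, 5, 0.1)
y = 2 + 3 * x + 4 * x ** 2
a0, a1, a2 = poly_least_squares(x, y, 2)
print(round(a0, 6), round(a1, 6), round(a2, 6))  # → 2.0 3.0 4.0
```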
Original: http://www.codebelief.com/article/2017/04/matplotlib-demonstrate-least-square-regression-process/