Linear Algebra - the least squares method and the projection matrix

The least squares method is an important concept in statistics. This article explains how it is used to fit a curve to data and how it connects to the projection matrix.

Take, for example, a least squares fit to a straight line.

Taking the sum of squared errors as the total error, the best-fit line is the one that minimizes this total error.

Assume y = ax + b and take the partial derivatives of the total error with respect to a and b. Since the total error is a convex function of the coefficients, setting both partial derivatives to zero gives the minimum, as sketched below.
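Written out (using E for the total squared error over n data points, a notation introduced here), the two conditions are:

E(a, b) = (a*x1 + b - y1)^2 + ... + (a*xn + b - yn)^2

dE/da = 2*[ x1*(a*x1 + b - y1) + ... + xn*(a*xn + b - yn) ] = 0
dE/db = 2*[ (a*x1 + b - y1) + ... + (a*xn + b - yn) ] = 0

These are two linear equations in a and b, so solving them gives the best-fit coefficients.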

That approach minimizes the norm of the error vector from the one-dimensional, calculus point of view, but we can also look at the method from a higher-dimensional perspective. First, write out yi - ei for each data point:

ax1+b=y1-e1

ax2+b=y2-e2

ax3+b=y3-e3

ax4+b=y4-e4

We can then write these equations in matrix form:

[ x1  1 ]           [ y1 ]   [ e1 ]
[ x2  1 ]   [ a ]   [ y2 ]   [ e2 ]
[ x3  1 ] * [ b ] = [ y3 ] - [ e3 ]
[ x4  1 ]           [ y4 ]   [ e4 ]

Isn't that just Ax = b? More precisely, y1, y2, y3, y4 are the values we are fitting and e1, e2, e3, e4 are the errors, so the equation is Ax = b - e.
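In code terms (a minimal NumPy sketch with made-up x values), building A just means placing each xi next to a 1:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])    # made-up x1..x4
A = np.column_stack([x, np.ones(4)])  # each row is [xi, 1]
print(A)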

If e = 0, then Ax = b is solvable exactly and that x is the answer we want. In practice, though, the number of data points is usually far larger than the number of unknowns, so b lies outside the column space of A, and no vector inside the column space can reach it exactly. The best we can do is fit the vector in the column space that is closest to b. Taking a two-dimensional plane (the column space) and a three-dimensional vector b as an example, the closest vector is the projection of b onto the plane; that is where ||e||^2 is smallest, since the projection is the point of the plane at the shortest distance from b.
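A tiny numerical illustration (made-up vectors, using NumPy): take the xy-plane as the column space and a vector b that sticks out of it. The closest point in the plane is the projection of b, and the leftover error e is perpendicular to the plane:

import numpy as np

# Made-up example: the "column space" is the xy-plane in R^3.
b = np.array([1.0, 2.0, 4.0])

p = np.array([b[0], b[1], 0.0])   # projection of b onto the plane
e = b - p                         # error vector, perpendicular to the plane

# Any other point v of the plane is farther from b than the projection p is.
v = np.array([3.0, -1.0, 0.0])
print(np.dot(e, p))                        # 0.0: e is perpendicular to the plane
print(np.sum(e**2), np.sum((b - v)**2))    # 16.0 versus a larger value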

Now use the fact that Ax - b must be perpendicular to the column space of A; in other words, Ax - b lies in the null space of A^T (the left null space of A), so A^T(Ax - b) = 0.

A^T A x = A^T b  =>  x = (A^T A)^-1 A^T b

The x obtained this way contains the best-fit coefficients, and this closed-form expression gives computer algorithms a fast way to compute them.
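As a concrete sketch (NumPy, with made-up data points), the formula can be applied directly, and it also yields the projection matrix P = A (A^T A)^-1 A^T that maps b to its projection Ax:

import numpy as np

# Four made-up data points (xi, yi) for illustration.
xs = np.array([1.0, 2.0, 3.0, 4.0])
ys = np.array([1.1, 1.9, 3.2, 3.8])

# Each row of A is [xi, 1], so A @ [a, b] stacks the values a*xi + b.
A = np.column_stack([xs, np.ones_like(xs)])

# Normal equations: A^T A x = A^T b, solved for the coefficients x = [a, b].
a, b = np.linalg.solve(A.T @ A, A.T @ ys)
print(a, b)

# Projection matrix P = A (A^T A)^-1 A^T sends the target ys to its projection A @ [a, b].
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(P @ ys, A @ np.array([a, b])))   # True

# NumPy's built-in least squares solver returns the same coefficients.
print(np.linalg.lstsq(A, ys, rcond=None)[0])

In floating-point practice, routines such as np.linalg.lstsq or a QR factorization are preferred to explicitly inverting A^T A, but they compute the same best-fit coefficients.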

 
