Cost Function

Basic concept

This cost function is also known as the squared error function, or the squared error cost function. We use the sum of squared errors because it is a reasonable choice for most problems, especially regression problems. Other cost functions also work well, but the squared error cost function is probably the most common choice for regression.

Related information

Next we'll introduce some terminology. All we have to do now is choose appropriate parameters for our model $h_\theta(x) = \theta_0 + \theta_1 x$: the slope of the line $\theta_1$ and the intercept on the y-axis $\theta_0$.
The parameters we choose determine how accurately the resulting line fits our training set. The gap between the value predicted by the model and the actual value in the training set (indicated by the blue vertical segments in the figure below) is the modeling error, that is, the absolute value of $y - \hat{y}$ for each example.

(Figure: training data with a fitted line; the blue vertical segments between the points and the line are the modeling errors.)
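To make this concrete, here is a minimal Python sketch with made-up toy data and a hand-picked candidate line (all values are hypothetical, not from the course) that computes each modeling error:

```python
# Toy training set (hypothetical values, for illustration only).
x = [1.0, 2.0, 3.0, 4.0]
y = [1.5, 3.1, 4.4, 6.2]

# A candidate line: intercept (theta0) and slope (theta1), chosen by hand.
theta0, theta1 = 0.0, 1.5

# Modeling error for each example: actual value y minus prediction y-hat.
errors = [yi - (theta0 + theta1 * xi) for xi, yi in zip(x, y)]
print(errors)  # approximately [0.0, 0.1, -0.1, 0.2]
```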

Cost function formula

Our goal is to choose the model parameters that minimize the sum of squared modeling errors, that is, to minimize the cost function

$$J(\theta_0, \theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta\big(x^{(i)}\big) - y^{(i)}\right)^2,$$

where $m$ is the number of training examples and $h_\theta(x^{(i)})$ is the prediction for the $i$-th example.
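A minimal sketch of this formula in Python, reusing the toy data above (the name `compute_cost` is just illustrative):

```python
def compute_cost(x, y, theta0, theta1):
    """Squared error cost J(theta0, theta1) over m training examples."""
    m = len(x)
    squared_errors = sum((theta0 + theta1 * xi - yi) ** 2 for xi, yi in zip(x, y))
    return squared_errors / (2 * m)

print(compute_cost(x, y, 0.0, 1.5))  # cost of the candidate line above, 0.0075
```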

Minimizing the cost function

By repeatedly changing the values of the slope and intercept and computing the resulting cost, we can draw the cost surface and its contour map; the three axes are $\theta_0$, $\theta_1$, and $J(\theta_0, \theta_1)$:
(Figure: 3-D surface of $J(\theta_0, \theta_1)$ plotted over the $(\theta_0, \theta_1)$ plane.)
We can see that this surface has a minimum point in three-dimensional space. With the slope and intercept at that point, the resulting hypothesis $h$ fits the training set best.
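One brute-force way to find this minimum is exactly what the plot suggests: evaluate the cost over a grid of slope and intercept values and keep the best pair. A minimal sketch, reusing the hypothetical `compute_cost` and toy data from above:

```python
# Scan a grid of (theta0, theta1) values in steps of 0.1 and keep the
# pair with the lowest cost -- the minimum point of the surface above.
candidates = (
    (t0 / 10, t1 / 10) for t0 in range(-20, 21) for t1 in range(0, 31)
)
best_theta0, best_theta1 = min(
    candidates, key=lambda p: compute_cost(x, y, p[0], p[1])
)
print("approximate minimum: intercept", best_theta0, "slope", best_theta1)
```

On the toy data this lands at an intercept of about 0.0 and a slope of about 1.5. In practice a grid scan becomes infeasible as the number of parameters grows, which is why gradient descent is used instead to find the minimum.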

Reference:
Andrew Ng, Machine Learning

Origin: blog.csdn.net/qq_45833373/article/details/131704962