Linear Regression (Two-Variable Linear Regression, Solved with Custom Functions) -- 04

Multiple Features

The previous posts explained linear regression for the case of a single independent variable and a single dependent variable, i.e. one feature.

More often, though, we need to handle the case of multiple features, as follows:

As the image above shows, when predicting a house's price there are several factors that may influence it.

From this description, it is not difficult to summarize the general form.

Next, we can write down the multiple regression equation, together with the corresponding cost function and the gradient descent update.
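The formula images from the original post did not survive extraction; for reference, the standard equations for multiple linear regression with m training examples and n features (which the post appears to follow) are:

```latex
h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2

\theta_j := \theta_j - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}, \quad j = 0, 1, \ldots, n
```

Here alpha is the learning rate, and x_0 is conventionally fixed at 1 so that theta_0 acts as the intercept.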

The gradient descent method was given above; the following applies it to multiple linear regression.

Following this description, the example below constructs a multiple linear regression model, starting by loading the data (see the sketch below).
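The original code screenshot is not available, so here is a minimal sketch of the loading step; the file name house_prices.csv and its columns are hypothetical stand-ins for the post's own table:

```python
import numpy as np
import pandas as pd

# Hypothetical file name; the original post loads its own house-price table.
data = pd.read_csv("house_prices.csv")
print(data.head())  # inspect the first few rows of the loaded table
```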

From the code above you can see that the contents of the table have been loaded.

After acquiring the data, it is split into features and labels.
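A sketch of the split, under the assumption that the last column of the table is the price (the label) and every other column is a feature; a column of ones is prepended so the intercept theta_0 is learned like any other parameter:

```python
# Features: all columns except the last; label: the last column.
X = data.iloc[:, :-1].values.astype(float)
y = data.iloc[:, -1].values.astype(float)

# Prepend a column of ones so theta[0] acts as the intercept term (x_0 = 1).
X = np.hstack([np.ones((X.shape[0], 1)), X])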

Once that is done, the cost function and the gradient descent algorithm can be defined, as follows:
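A minimal vectorized sketch of both pieces; the names cost, gradient_descent, alpha, and iters are mine, not necessarily the original post's:

```python
def cost(X, y, theta):
    """Mean squared error cost J(theta) = 1/(2m) * sum((X @ theta - y)^2)."""
    m = len(y)
    residual = X @ theta - y
    return residual @ residual / (2 * m)

def gradient_descent(X, y, theta, alpha=0.01, iters=1000):
    """Batch gradient descent; returns the final theta and the cost history."""
    m = len(y)
    history = []
    for _ in range(iters):
        # Simultaneous update of all parameters in one vectorized step.
        theta = theta - alpha / m * (X.T @ (X @ theta - y))
        history.append(cost(X, y, theta))
    return theta, history
```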

With the algorithm defined, the computation can begin:
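Running the algorithm, again under the assumed names above; 1000 iterations matches the count mentioned below, but the learning rate 0.01 is my assumption, since the post's exact hyperparameters are unknown:

```python
theta0 = np.zeros(X.shape[1])            # start from all-zero parameters
print("initial cost:", cost(X, y, theta0))

theta, history = gradient_descent(X, y, theta0, alpha=0.01, iters=1000)
print("final cost:", history[-1])
print("learned parameters:", theta)
```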

You can see that the initial value of the cost function is as large as about 47 (a large error); after 1000 iterations the error has dropped to about 0.7 (a much smaller error).

Of course, since this is two-variable linear regression, the fitted model can also be visualized: the hypothesis is a plane in 3D.
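A sketch of how such a plane can be drawn with matplotlib; it assumes exactly two features sitting in columns 1 and 2 of X (after the intercept column) and uses np.meshgrid, which is explained next:

```python
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # needed on older matplotlib versions

# Build a grid covering the range of the two features.
x0 = np.linspace(X[:, 1].min(), X[:, 1].max(), 20)
x1 = np.linspace(X[:, 2].min(), X[:, 2].max(), 20)
x0, x1 = np.meshgrid(x0, x1)

# Evaluate the fitted plane z = theta0 + theta1*x0 + theta2*x1 on the grid.
z = theta[0] + theta[1] * x0 + theta[2] * x1

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(x0, x1, z, alpha=0.4)           # the fitted plane
ax.scatter(X[:, 1], X[:, 2], y, color="red")    # the training points
plt.show()
```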

You may not be very familiar with np.meshgrid(x0, x1) in the plotting code; the following explains it:
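A tiny self-contained demonstration of what np.meshgrid returns:

```python
import numpy as np

x0 = np.array([1, 2, 3])
x1 = np.array([10, 20])
g0, g1 = np.meshgrid(x0, x1)

# g0 repeats x0 along the rows, g1 repeats x1 along the columns,
# so (g0[i, j], g1[i, j]) enumerates every coordinate pair of the grid.
print(g0)
# [[1 2 3]
#  [1 2 3]]
print(g1)
# [[10 10 10]
#  [20 20 20]]
```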

The plot shows the data points scattered around the fitted plane, giving the figure below:

Source: blog.csdn.net/Escid/article/details/90228547