Machine Learning: Derivation of the Mathematical Principles of Multiple Linear Regression

//2019.08.05
# Solutions of the multiple linear regression algorithm and the normal equation
1. The multiple linear regression algorithm gives the data set good interpretability: by comparing the magnitudes of the output coefficients we can judge how much weight each feature carries in the result, which in turn suggests which features are worth expanding or collecting in order to improve the accuracy of training on the whole data set.
2. Compared with the KNN algorithm, multiple linear regression is a parametric learning algorithm, while KNN is a non-parametric learning algorithm: KNN makes no additional assumptions about the data, whereas multiple linear regression assumes that the data have a linear relationship.
3. Multiple linear regression is the basis of many classification algorithms, but by itself it can only solve regression problems, not classification problems; the KNN algorithm can solve both classification problems and regression problems.
4. When the multiple linear regression algorithm is solved with the normal equation, the solution has high time complexity (roughly cubic in the number of features, because of the matrix inversion involved), which is not favorable for computation on large data sets.

5. The detailed mathematical derivation behind the multiple linear regression algorithm is as follows:
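The following is a standard least-squares sketch of how the normal equation is obtained; the notation ($X_b$ for the data matrix with a leading column of ones, $\theta$ for the intercept plus coefficients) is introduced here for illustration rather than taken from the text above.

```latex
% Standard least-squares derivation of the normal equation.
% X_b   : m x (n+1) matrix (a leading column of ones plus the n features)
% theta : (n+1)-vector (intercept theta_0 and coefficients theta_1..theta_n)
% y     : m-vector of target values
\begin{aligned}
\hat{y} &= X_b \theta \\
J(\theta) &= (y - X_b\theta)^{\mathsf T}(y - X_b\theta)
  && \text{(sum of squared errors to minimize)} \\
\nabla_\theta J(\theta) &= -2\,X_b^{\mathsf T}(y - X_b\theta) = 0
  && \text{(set the gradient to zero)} \\
\Rightarrow\quad X_b^{\mathsf T}X_b\,\theta &= X_b^{\mathsf T}y \\
\Rightarrow\quad \theta &= \left(X_b^{\mathsf T}X_b\right)^{-1} X_b^{\mathsf T} y
  && \text{(the normal equation)}
\end{aligned}
```

The inverse $(X_b^{\mathsf T}X_b)^{-1}$ is what makes this closed-form solution expensive on large data sets, which is exactly the time-complexity drawback noted in point 4.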

After training a multiple linear regression model on the base data set with the sklearn library, you can obtain all of the coefficients and the intercept by reading the `.coef_` and `.intercept_` attributes; their values show how strongly each feature influences the final result.
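A minimal sketch of that workflow (the synthetic data, the variable names `X`, `y`, `reg`, and the train/test split are illustrative assumptions; only `LinearRegression`, `.coef_`, and `.intercept_` come from the text above):

```python
# Minimal sketch: fit a multiple linear regression with scikit-learn and
# inspect the learned coefficients and intercept. The synthetic data below
# is only a placeholder for whatever base data set is actually used.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                                   # 200 samples, 3 features
y = 4.0 + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

reg = LinearRegression()
reg.fit(X_train, y_train)

print(reg.coef_)                   # one coefficient per feature -> relative influence
print(reg.intercept_)              # the fitted intercept
print(reg.score(X_test, y_test))   # R^2 on held-out data
```

Because the features in this sketch are on comparable scales, comparing the magnitudes of `reg.coef_` directly is a reasonable way to judge each feature's weight, as point 1 describes; with features on very different scales you would standardize them first.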
