2. The Linear Regression Model: a Supervised Regression Algorithm

1. The Linear Regression Model

Linear regression uses regression analysis from mathematical statistics to determine the quantitative relationship between two or more interdependent variables, and it is very widely used. When the analysis involves only one independent variable and one dependent variable, and their relationship can be approximated by a straight line, it is called simple linear regression analysis. When it involves two or more independent variables, with a linear relationship between the dependent variable and the independent variables, it is called multiple linear regression analysis. By mapping the inputs into a higher-dimensional space, a linear model can also serve as the basis for many more powerful nonlinear models.

Applicable data types: numeric and nominal values. Conditions to check: (1) whether the independent and dependent variables have a linear relationship; (2) whether the dependent variable follows a normal distribution; (3) whether the values of the independent variables are mutually independent; (4) whether the variances are homogeneous.

2. The Model

Consider the general setting of a linear regression problem. We have $m$ samples, and each sample consists of an $n$-dimensional feature vector and one output value.

The training data takes the form:

$(x_1^{(1)}, x_2^{(1)}, \ldots, x_n^{(1)}, y_1),\; (x_1^{(2)}, x_2^{(2)}, \ldots, x_n^{(2)}, y_2),\; \ldots,\; (x_1^{(m)}, x_2^{(m)}, \ldots, x_n^{(m)}, y_m)$

Our main task is to find the parameters $\left( {b,{w_1}, \cdots ,{w_n}} \right)$ of the following linear regression model:

$y= \sum\limits_{j = 1}^n {{w_j}} {x_j} + b$

where the $w_j$ are the regression coefficients and $b$ is the bias. Setting ${x_0} = 1$ (so that $w_0 = b$), the equation above can be expressed as:

$y = \sum\limits_{j = 0}^n {{w_j}} {x_j}$

In matrix form:

$Y = XW$
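As a concrete illustration of the matrix form $Y = XW$, here is a small NumPy sketch with hypothetical toy numbers: the bias $b$ is absorbed into $W$ by prepending a constant column $x_0 = 1$ to the data, exactly as the text describes.

```python
import numpy as np

# Hypothetical toy data: m = 4 samples, n = 2 features.
X_raw = np.array([[1.0, 2.0],
                  [2.0, 0.5],
                  [3.0, 1.5],
                  [4.0, 3.0]])
b = 0.5                       # bias
w = np.array([2.0, -1.0])     # regression coefficients w_1, w_2

# Prepend x_0 = 1 to each sample so the bias b becomes w_0.
X = np.hstack([np.ones((X_raw.shape[0], 1)), X_raw])
W = np.concatenate([[b], w])

# Matrix form: Y = XW gives all m predictions at once.
Y = X @ W
```

The single matrix product replaces the per-sample sum $\sum_j w_j x_j + b$, which is why the matrix notation is convenient in everything that follows.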

3. Constructing the Loss Function (Squared Loss / Mean Squared Error)

For a linear regression model, the goal is to find the regression equation, and evaluating a linear regression means measuring how close the predicted values are to the labels. The most common loss function for linear regression is the squared loss / mean squared error (Squared Loss), chosen mainly because it is differentiable everywhere. For the model above, the squared loss is:
$L_w = \frac{1}{2}\sum\limits_{i = 1}^m {\left( {y^{\left( i \right)} - \sum\limits_{j = 0}^n {w_j x_j^{\left( i \right)}} } \right)^2}$

This leads to the following minimization problem:

$\mathop {\min }\limits_w {L_w}$
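To make the minimization concrete, here is a minimal sketch on made-up data using plain gradient descent (the choice of solver and all numbers are ours, not the text's): the gradient of $L_w$ with respect to $W$ is $-X^T(y - XW)$, and repeated steps drive the loss toward its minimum.

```python
import numpy as np

# Hypothetical toy data; X already includes the x_0 = 1 column.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([2.0, 3.0, 4.0])   # generated by W = (1, 1), so the fit is exact

def squared_loss(W):
    """L_w = 1/2 * sum_i (y_i - sum_j w_j x_j^(i))^2, as in the text."""
    r = y - X @ W
    return 0.5 * r @ r

# Minimize L_w by gradient descent.
W = np.zeros(2)
lr = 0.05
for _ in range(2000):
    grad = -X.T @ (y - X @ W)   # dL_w/dW for the loss above
    W -= lr * grad
```

On this tiny dataset the iterates converge to $W = (1, 1)$ and the loss to zero; the closed-form solution covered next reaches the same point without iteration.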

4. Solving the Linear Regression Model

1) The least squares method

For the linear regression model, the prediction function is:

$Y = XW$

Its loss function is:

$L_w = \frac{1}{2}{\left( {Y - XW} \right)^T}\left( {Y - XW} \right)$
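The excerpt ends here, but the standard minimizer of this quadratic loss is the least-squares solution $W = (X^T X)^{-1} X^T Y$ (the normal equations). A sketch on made-up data, using `np.linalg.lstsq`, which computes the same solution in a numerically stable way:

```python
import numpy as np

# Hypothetical data; as before, X carries x_0 = 1 in its first column.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
Y = np.array([3.1, 4.9, 7.2, 8.8])

# lstsq minimizes ||Y - XW||^2, i.e. the loss (Y - XW)^T (Y - XW).
W, residuals, rank, sv = np.linalg.lstsq(X, Y, rcond=None)
```

At the minimum the residual $Y - XW$ is orthogonal to the columns of $X$, i.e. $X^T(Y - XW) = 0$, which is exactly the normal-equation condition.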

 


Origin www.cnblogs.com/ai-learning-blogs/p/11344696.html