ML: Linear Regression

What is linear regression?

A linear regression model assumes that the relationship between the dependent variable and the independent variables is linear, i.e. it has the form of a linear function: y = wx + b.

Given a data set D = \{(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)\},

the model predicts each target as a linear combination of the features: \hat{y}_i = f(x_i) = w_1 x_{i1} + w_2 x_{i2} + \ldots + w_n x_{in} + b
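The prediction formula above can be sketched in a few lines of numpy. The weights, bias, and input values below are made-up example numbers, not from the article:

```python
import numpy as np

def predict(x, w, b):
    """Linear model: y_hat = w_1*x_1 + ... + w_n*x_n + b = w . x + b."""
    return np.dot(w, x) + b

# Hypothetical parameters and one feature vector, for illustration only
w = np.array([2.0, -1.0, 0.5])   # weights w_1, w_2, w_3
b = 3.0                          # bias term
x = np.array([1.0, 2.0, 4.0])    # one sample's features

y_hat = predict(x, w, b)         # 2*1 + (-1)*2 + 0.5*4 + 3 = 5.0
```

Each weight w_j scales its feature x_{ij}, and the bias b shifts the whole prediction.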

Reference links: linear regression prediction method

What is a loss function?

A loss function measures the error between the predicted value and the true value. The mean squared error is generally used.

For a single sample: loss^{(i)}(w,b) = l^{(i)}(w,b) = \frac{1}{2}(\hat{y}_i - y_i)^2, where \hat{y}_i is the value predicted by the linear regression and y_i is the actual value.

The overall mean squared error is L(w,b) = \frac{1}{n}\sum_{i=1}^{n} l^{(i)}(w,b) = \frac{1}{n}\sum_{i=1}^{n} \frac{1}{2}(\hat{y}_i - y_i)^2 = \frac{1}{n}\sum_{i=1}^{n} \frac{1}{2}(w^{T}x_i + b - y_i)^2

where w = (w_1, w_2, \ldots, w_n) and x_i = (x_{i1}, x_{i2}, \ldots, x_{in})^{T}.

What is an optimization algorithm?

Because the linear model and loss function above are relatively simple, the minimizer can be written out in closed form; such a solution is called an analytical solution. For more complex models, such as deep learning, no closed form exists, so we instead use an optimization algorithm to iteratively update the model parameters and reduce the value of the loss function as much as possible. Such a solution is called a numerical solution.
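A minimal sketch of the numerical approach, using minibatch stochastic gradient descent on synthetic data. The true parameters, learning rate, and batch size below are illustrative choices, not values from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data generated from a known linear model (illustrative values)
true_w, true_b = np.array([2.0, -3.4]), 4.2
X = rng.normal(size=(1000, 2))
y = X @ true_w + true_b + rng.normal(scale=0.01, size=1000)

# Start from zero parameters and refine them iteratively
w, b = np.zeros(2), 0.0
lr, batch_size = 0.03, 10

for epoch in range(3):
    idx = rng.permutation(len(X))           # shuffle samples each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch], y[batch]
        err = xb @ w + b - yb               # (w^T x_i + b) - y_i per sample
        # Gradients of (1/m) * sum 0.5 * err^2 over the minibatch
        w -= lr * (xb.T @ err) / batch_size
        b -= lr * err.mean()
```

After a few epochs, w and b should be close to true_w and true_b, showing the iterative numerical solution converging toward the parameters an analytical solution would find.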

Recommended link: how to understand stochastic gradient descent


Origin: blog.csdn.net/fan3652/article/details/104307806