Introduction to machine learning models

Supervised learning: regression models

- Linear Regression Model

- Linear regression is a linear model that assumes a linear relationship between the input variables x and the single output variable y.

- Specifically, a linear regression model computes the output variable y as a linear combination of the input variables x.

 

 

 

Solving linear equations

Suppose we have the following linear equation:

            y = ax + b

We know two data points: when x = 1, y = 3, i.e., (1, 3)

              when x = 2, y = 5, i.e., (2, 5)

Substituting these data into the equation, we obtain:

            a + b = 3

            2a + b = 5

Solving these gives: a = 2, b = 1

That is, the equation is: y = 2x + 1

Given any x, we can plug it into the equation to obtain the corresponding y.

For example, when x = 5, y = 11.
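
To make the arithmetic above concrete, here is a minimal sketch (assuming NumPy is available; the variable names are illustrative) that solves the two equations for a and b and then predicts y at x = 5:

```python
import numpy as np

# Coefficient matrix and right-hand side of the system:
#   1*a + 1*b = 3
#   2*a + 1*b = 5
A = np.array([[1.0, 1.0],
              [2.0, 1.0]])
rhs = np.array([3.0, 5.0])

a, b = np.linalg.solve(A, rhs)   # a = 2.0, b = 1.0

x_new = 5
print(a * x_new + b)             # 11.0, matching the example above
```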

 

Linear regression models

 

• An example is described by d attributes (features), x = (x1; x2; ...; xd), where xi is the value of x on the i-th attribute (feature).

• A linear model tries to learn a function that makes predictions from a linear combination of the attributes (features), namely: f(x) = w1x1 + w2x2 + ... + wdxd + b

• Written in vector form: f(x) = wᵀx + b, where w = (w1; w2; ...; wd).

• It is assumed that the features and the result satisfy a linear relationship, i.e., each feature appears only to the first power.

• Once w and b are learned, the model is determined; a minimal prediction sketch follows this list.

• Many more powerful nonlinear models can be built on top of linear models by introducing hierarchical structures or high-dimensional mappings.
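
As a small sketch of the vector form above (the parameter values and the d = 3 feature vector are made up for illustration; NumPy is assumed):

```python
import numpy as np

def predict(w, b, x):
    """Linear model prediction: f(x) = w^T x + b."""
    return np.dot(w, x) + b

# Hypothetical learned parameters and one example with d = 3 features.
w = np.array([0.5, -1.2, 2.0])
b = 0.3
x = np.array([1.0, 0.0, 2.0])

print(predict(w, b, x))   # 0.5*1.0 + (-1.2)*0.0 + 2.0*2.0 + 0.3 = 4.8
```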

 

 

Least squares method

• Solving the model by minimizing the mean squared error is known as the "least squares method" (least square method).

• Its main idea is to choose the unknown parameters so that the sum of squared differences between the theoretical (predicted) values and the observed values is minimized, as illustrated in the sketch below.
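
As an illustration of this idea, the sketch below (with made-up sample points; NumPy assumed) computes the mean squared error of two candidate parameter settings; the better line gives the smaller error:

```python
import numpy as np

# Made-up observed samples (x_i, y_i) that lie roughly on y = 2x + 1.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 4.9, 7.2, 8.8])

def mse(w, b):
    """Mean squared error between predictions w*x + b and observations y."""
    return np.mean((y - (w * x + b)) ** 2)

print(mse(2.0, 1.0))   # small error: this line fits the samples well
print(mse(1.0, 0.0))   # much larger error: a poor fit
```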

 

 

 

- We assume that there is only one input attribute (feature), so the model is f(x_i) = w x_i + b.

 

- In linear regression, the least squares method tries to find a straight line such that the sum of the Euclidean distances from all samples to the line is minimized.

 

• The process of solving for w and b so that E(w, b) = Σ_i (y_i - w x_i - b)² is minimized is called the "least squares parameter estimation" of the linear regression model.

• Taking the partial derivatives of E(w, b) with respect to w and b gives:

              ∂E/∂w = 2 ( w Σ_i x_i² - Σ_i (y_i - b) x_i )

              ∂E/∂b = 2 ( m b - Σ_i (y_i - w x_i) )

• Setting both partial derivatives to zero yields the closed-form solution:

              w = Σ_i y_i (x_i - x̄) / ( Σ_i x_i² - (1/m) (Σ_i x_i)² )

              b = (1/m) Σ_i (y_i - w x_i)

 

 --where x̄ = (1/m) Σ_i x_i is the mean of the x values. A code sketch of these closed-form formulas follows.
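
Putting the closed-form expressions above into code (a sketch reusing the made-up samples from the earlier error example; np.polyfit is used only as a cross-check):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 4.9, 7.2, 8.8])
m = len(x)

x_mean = x.mean()   # x_bar = (1/m) * sum(x_i)
w = np.sum(y * (x - x_mean)) / (np.sum(x ** 2) - np.sum(x) ** 2 / m)
b = np.mean(y - w * x)   # b = (1/m) * sum(y_i - w * x_i)

print(w, b)                  # closed-form least squares estimates
print(np.polyfit(x, y, 1))   # cross-check: NumPy returns [w, b] for degree 1
```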

 
