"Wu Enda Machine Learning" content summary Week2

Course URL: Machine Learning | Coursera

Week 2 has two main parts: multivariate linear regression and an Octave tutorial.

1. Multivariate Linear Regression

1. Basic model
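      The hypothesis with n features, using the convention x_0 = 1, and the cost function over m training examples:

      h_{\theta}(x) = \theta_{0} + \theta_{1}x_{1} + \cdots + \theta_{n}x_{n} = \theta^{T}x

      J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left( h_{\theta}(x^{(i)}) - y^{(i)} \right)^{2}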

2. Multivariate Gradient Descent
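      Update rule, applied simultaneously to every \theta_{j} (j = 0, 1, ..., n):

      \theta_{j} := \theta_{j} - \alpha\frac{1}{m}\sum_{i=1}^{m}\left( h_{\theta}(x^{(i)}) - y^{(i)} \right)x_{j}^{(i)}

      A minimal vectorized Octave sketch of this update (the function and variable names are illustrative, not the course's assignment code; X is the m-by-(n+1) design matrix whose first column is all ones):

      function theta = gradient_descent(X, y, theta, alpha, num_iters)
        m = length(y);
        for iter = 1:num_iters
          % One simultaneous update of all theta_j, vectorized.
          theta = theta - (alpha / m) * X' * (X * theta - y);
        end
      end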

3. Feature Scaling

      Scale each feature so that it lies roughly in the range [-1, 1].

      Mean normalization: x := \frac{x - \mu}{\sigma}, where \mu is the feature's mean over the training set and \sigma is its standard deviation (the range \max - \min also works).
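      A minimal Octave sketch of mean normalization (feature_normalize is an illustrative name; X holds one example per row and one feature per column, and the code relies on Octave's automatic broadcasting):

      function [X_norm, mu, sigma] = feature_normalize(X)
        mu = mean(X);                  % per-column means
        sigma = std(X);                % per-column standard deviations
        X_norm = (X - mu) ./ sigma;    % broadcast across rows
      end

      The returned mu and sigma should be kept so that new inputs can be normalized the same way before prediction.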

4. Learning rate \alpha

      If \alpha is too small, gradient descent converges slowly.

      If \alpha is too large, the cost function may fail to decrease on every iteration and may never converge.

      Try candidate values spaced by roughly a factor of three: ..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, ...
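      To choose among these candidates, the course's advice is to plot J(\theta) against the iteration number and keep the largest \alpha for which J still drops on every iteration. A sketch of that loop (assumes a design matrix X and target vector y are already in scope, with m = length(y); the 50-iteration budget is arbitrary):

      alphas = [0.001 0.003 0.01 0.03 0.1 0.3 1];
      for k = 1:length(alphas)
        theta = zeros(size(X, 2), 1);
        J_history = zeros(50, 1);
        for iter = 1:50
          theta = theta - (alphas(k) / m) * X' * (X * theta - y);
          J_history(iter) = (1 / (2 * m)) * sum((X * theta - y) .^ 2);
        end
        plot(1:50, J_history); hold on;   % good alphas give a steadily falling curve
      end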

5. The normal equation (a closed-form solution for \theta)

      \theta = (X^{T}X)^{-1}X^{T}y

Octave: pinv(X' * X) * X' * y
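      A worked end-to-end sketch (the housing-style numbers are purely illustrative):

      data = [2104 5 460; 1416 3 232; 1534 3 315; 852 2 178];   % size, bedrooms, price
      X = [ones(size(data, 1), 1), data(:, 1:2)];               % prepend the x_0 = 1 column
      y = data(:, 3);
      theta = pinv(X' * X) * X' * y;

      Using pinv (the pseudoinverse) rather than inv means the computation still succeeds when X^{T}X is non-invertible, e.g. with redundant features or too few examples; the course also notes that feature scaling is unnecessary with the normal equation.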

6. Comparison of gradient descent and the normal equation

 Gradient descent requires choosing \alpha and running many iterations, but scales well with the number of features n. The normal equation needs neither, but computing (X^{T}X)^{-1} costs roughly O(n^{3}), so when n is large (the course's rough threshold is around 10^{4}) gradient descent is preferred.

2. Octave tutorial (omitted)

 
