Andrew Ng's Machine Learning Notes (Week 2)

Reference notes: http://scuel.gitee.io/ml-andrewng-notes/week2.html

The following figures and notes have been added for ease of understanding.

4.1 Multiple Features:
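
No figure survived here, so as a minimal restatement of the standard form: with n features x_1, ..., x_n (and the convention x_0 = 1), the hypothesis becomes

    h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n = \theta^T x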

4.2 Multivariate Gradient Descent

Univariate vs. multivariate:
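
The comparison figures for this section did not survive. For reference, the multivariate update rule (applied simultaneously to every j = 0, 1, ..., n, just like the single-variable version) is

    \theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_j^{(i)}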


4.3 Gradient Descent - Feature Scaling
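
The figure for this section is missing. As a minimal sketch of mean normalization, x := (x - \mu) / s, in Python (the function and variable names below are my own, not from the notes):

    import numpy as np

    def feature_scale(X):
        """Mean-normalize each column of X: (x - mean) / std."""
        mu = X.mean(axis=0)       # per-feature mean
        sigma = X.std(axis=0)     # per-feature spread (std here; the range max-min also works)
        return (X - mu) / sigma, mu, sigma

    # Example: house size in square feet vs. number of bedrooms
    X = np.array([[2104.0, 3.0],
                  [1416.0, 2.0],
                  [852.0, 1.0]])
    X_scaled, mu, sigma = feature_scale(X)

After scaling, each feature lands roughly in the range -1 to 1, which helps gradient descent converge faster.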


4.4 Gradient Descent - Learning Rate
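
The figure is missing here as well. The practical check from the lecture is to compute J(\theta) after every iteration and confirm it decreases; a minimal sketch (names are illustrative):

    import numpy as np

    def gradient_descent(X, y, alpha=0.01, num_iters=400):
        """Batch gradient descent; returns theta and the cost J(theta) at each iteration."""
        m, n = X.shape
        theta = np.zeros(n)
        J_history = []
        for _ in range(num_iters):
            error = X @ theta - y                        # h_theta(x) - y for all m examples
            J_history.append((error @ error) / (2 * m))  # cost before this update
            theta = theta - (alpha / m) * (X.T @ error)  # simultaneous update of all theta_j
        return theta, J_history

If J_history increases or oscillates, alpha is too large; if it decreases very slowly, alpha is too small (try values spaced roughly by factors of 3, e.g. 0.001, 0.003, 0.01, ...).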


4.5 Features and Polynomial Regression

    We can change the behavior or curve of our hypothesis function by making it a quadratic, cubic, or square-root function (or any other form). Plain linear regression can only fit the data with a straight line; sometimes a curve fits the data better, which is what polynomial regression provides. When using polynomial regression, keep in mind that feature scaling is very important: for example, if x_1 has range 1-1000, then x_1^2 has range 1-1,000,000, so without feature scaling the feature ranges become even more mismatched and gradient descent converges more slowly.
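
As a small sketch of the combination described above (array names are illustrative): build x, x^2, x^3 as new features, treat the problem as ordinary multivariate linear regression, and scale the columns before running gradient descent.

    import numpy as np

    size = np.array([1.0, 10.0, 100.0, 1000.0])         # original feature, range 1-1000
    X_poly = np.column_stack([size, size**2, size**3])  # column ranges: 1-1e3, 1-1e6, 1-1e9

    # Feature scaling brings the columns back to comparable ranges before gradient descent
    X_scaled = (X_poly - X_poly.mean(axis=0)) / X_poly.std(axis=0)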

4.6 Normal Equation:

The matrix derivation is the key step; the figure is reproduced from https://blog.csdn.net/perfect_accepted/article/details/78383434
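
The derivation arrives at the closed-form solution

    \theta = (X^T X)^{-1} X^T y

which, as a minimal sketch in numpy (the data values are only an illustration):

    import numpy as np

    X = np.array([[1.0, 2104.0],
                  [1.0, 1416.0],
                  [1.0, 852.0]])                 # first column is x_0 = 1
    y = np.array([400.0, 232.0, 178.0])
    theta = np.linalg.solve(X.T @ X, X.T @ y)    # solves (X^T X) theta = X^T y

No feature scaling or choice of alpha is needed here, but solving with X^T X costs roughly O(n^3), so the normal equation suits problems with a modest number of features.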



4.7 Normal Equation and Non-invertibility
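
X^T X is usually non-invertible when features are redundant (linearly dependent) or when there are more features than training examples (m <= n); the remedies discussed in the course are removing redundant features, regularization, or simply using a pseudo-inverse. A minimal sketch of the pseudo-inverse route (the data below is a made-up singular example):

    import numpy as np

    # The third column is 3 * the second, so the columns are linearly dependent
    # and X^T X is singular (np.linalg.inv would raise LinAlgError).
    X = np.array([[1.0, 1.0, 3.0],
                  [1.0, 2.0, 6.0],
                  [1.0, 3.0, 9.0]])
    y = np.array([1.0, 2.0, 3.0])

    theta = np.linalg.pinv(X.T @ X) @ X.T @ y    # pinv still returns a (minimum-norm) solution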
