Introduction to machine learning (9): regression and clustering algorithms-linear regression, overfitting, ridge regression

Learning Directory:

Linear regression:

Linear regression models the target as a weighted linear combination of the features, h(w) = w1·x1 + w2·x2 + … + b, and fits the weights by minimizing a loss function (typically the squared error between predictions and true values).

Case: Boston housing price prediction (comparing the normal equation and gradient descent optimization methods)

Use normal equation optimization:
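A minimal sketch of the normal-equation approach using scikit-learn's `LinearRegression`, which solves the least-squares problem in closed form rather than iteratively. Note one assumption: the Boston dataset was removed from scikit-learn (version 1.2 and later), so synthetic data with known true coefficients stands in for it here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the Boston data: 3 features, known true
# coefficients [1.5, -2.0, 0.5], intercept 4.0, small Gaussian noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 4.0 + rng.normal(scale=0.1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# LinearRegression solves for the weights directly (no learning rate,
# no iterations) -- the "normal equation" style of optimization.
estimator = LinearRegression()
estimator.fit(X_train, y_train)
print("coefficients:", estimator.coef_)
print("intercept:", estimator.intercept_)
```

Because the solution is exact, the recovered coefficients land very close to the true ones used to generate the data.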
Use gradient descent optimization:
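The same linear model can instead be fit iteratively with gradient descent. A sketch using scikit-learn's `SGDRegressor` on the same kind of synthetic data as above (again an assumption standing in for the removed Boston dataset); features are standardized first, since gradient descent is sensitive to feature scale.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.preprocessing import StandardScaler

# Same synthetic setup: known coefficients [1.5, -2.0, 0.5], intercept 4.0.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 4.0 + rng.normal(scale=0.1, size=500)

# Standardize features so one learning rate works for all of them.
X_scaled = StandardScaler().fit_transform(X)

# SGDRegressor updates the weights step by step along the gradient of
# the squared-error loss; eta0 is the (constant) learning rate.
estimator = SGDRegressor(learning_rate="constant", eta0=0.01,
                         max_iter=1000, random_state=0)
estimator.fit(X_scaled, y)
print("coefficients:", estimator.coef_)
print("intercept:", estimator.intercept_)
```

Unlike the normal equation, the result depends on the learning rate and number of iterations, but the method scales to datasets far too large for a closed-form solve.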
Use mean squared error (MSE) to evaluate the quality of the model:
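MSE is the mean of the squared differences between predictions and true targets; lower is better. A sketch with scikit-learn's `mean_squared_error` on held-out test data (the synthetic stand-in data is an assumption, as above):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 4.0 + rng.normal(scale=0.1, size=500)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)
y_pred = model.predict(X_test)

# MSE on the *test* split measures generalization, not training fit.
mse = mean_squared_error(y_test, y_pred)
print("MSE:", mse)
```

Here the MSE should be close to the variance of the injected noise (0.1² = 0.01), since the model can recover the true linear relationship.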

To sum up: the normal equation computes the weights directly in closed form, which works well on small datasets but becomes expensive as the number of features grows; gradient descent finds the weights iteratively and scales to large datasets, at the cost of tuning a learning rate and iterating to convergence.

Overfitting and underfitting

Overfitting: the model fits the training data well but generalizes poorly to new data, typically because it has learned noise or too many features. Underfitting: the model performs poorly even on the training data because it is too simple. Regularization is the standard remedy for overfitting.

Regularization categories:
**L2 regularization (commonly used):** adds a penalty term to the loss function that grows with the squared weights, so minimizing the loss simultaneously shrinks the feature weights.
**L1 regularization:** adds the sum of the absolute values of the weights as the penalty instead; it can drive some weights exactly to zero, effectively removing those features.
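The two penalties can be made concrete with a few lines of NumPy. This is an illustrative sketch (the helper names are my own, not from the article): each regularized loss is just the plain squared-error loss plus a penalty scaled by a strength parameter `alpha`.

```python
import numpy as np

def mse_loss(w, b, X, y):
    # Plain squared-error loss, no penalty.
    return np.mean((X @ w + b - y) ** 2)

def l2_regularized_loss(w, b, X, y, alpha):
    # L2 (ridge-style) penalty: alpha times the sum of squared weights.
    # The bias b is conventionally left unpenalized.
    return mse_loss(w, b, X, y) + alpha * np.sum(w ** 2)

def l1_regularized_loss(w, b, X, y, alpha):
    # L1 (lasso-style) penalty: alpha times the sum of absolute weights;
    # this tends to drive some weights exactly to zero.
    return mse_loss(w, b, X, y) + alpha * np.sum(np.abs(w))

# Tiny worked example.
X = np.array([[1.0, 2.0], [3.0, 4.0]])
y = np.array([1.0, 2.0])
w = np.array([0.5, -0.5])
plain = mse_loss(w, 0.0, X, y)                          # 4.25
with_l2 = l2_regularized_loss(w, 0.0, X, y, alpha=0.1)  # 4.25 + 0.05
with_l1 = l1_regularized_loss(w, 0.0, X, y, alpha=0.1)  # 4.25 + 0.10
print(plain, with_l2, with_l1)
```

Larger weights cost more under either penalty, which is exactly how the optimizer is pushed toward smaller (L2) or sparser (L1) weight vectors.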

Insert picture description here

Ridge regression

Ridge regression is linear regression with an L2 regularization penalty; the strength of the penalty is controlled by a regularization parameter (alpha).

Case: using ridge regression to predict Boston housing prices

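A sketch of the ridge-regression case with scikit-learn's `Ridge`. As in the earlier examples, synthetic data stands in for the Boston dataset (removed from scikit-learn 1.2+); the workflow — split, fit, evaluate with MSE — is the same as for plain linear regression, with `alpha` controlling the L2 penalty.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 4.0 + rng.normal(scale=0.1, size=500)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Ridge = linear regression + L2 penalty; larger alpha shrinks the
# weights more aggressively (alpha=0 recovers plain least squares).
estimator = Ridge(alpha=1.0)
estimator.fit(X_train, y_train)
print("coefficients:", estimator.coef_)
print("MSE:", mean_squared_error(y_test, estimator.predict(X_test)))
```

With only three informative features and little noise, ridge barely changes the solution here; its benefit shows up when there are many correlated features and overfitting is a real risk.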


Origin blog.csdn.net/qq_45234219/article/details/115048132