Linear Regression
Table of Contents

Learning objectives
- Master the implementation process of linear regression
- Apply LinearRegression or SGDRegressor to make regression predictions
- Know the evaluation criteria and formulas for regression algorithms
- Know the causes of overfitting and underfitting, and their solutions
- Know the principle of ridge regression and how it differs from linear regression
- Apply Ridge to make regression predictions
- Apply joblib to save and load models
2.6 Linear regression API revisited
- sklearn.linear_model.LinearRegression(fit_intercept=True)
- Optimizes via the normal equation
- fit_intercept: whether to calculate the bias (intercept)
- LinearRegression.coef_: regression coefficients
- LinearRegression.intercept_: bias
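A minimal usage sketch of the normal-equation implementation described above (the toy data here is invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data following y = 3x + 2 exactly
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([5.0, 8.0, 11.0, 14.0])

# fit_intercept=True (the default) tells the model to learn the bias
model = LinearRegression(fit_intercept=True)
model.fit(X, y)

print(model.coef_)       # regression coefficients, here close to [3.0]
print(model.intercept_)  # bias (intercept), here close to 2.0
```

Because the data is exactly linear, the normal-equation solution recovers the slope and intercept directly.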
- sklearn.linear_model.SGDRegressor(loss="squared_loss", fit_intercept=True, learning_rate='invscaling', eta0=0.01)
- SGDRegressor fits a linear regression model by stochastic gradient descent; it supports different loss functions and regularization penalty terms.
- loss: loss type
- loss="squared_loss": ordinary least squares
- fit_intercept: whether to calculate the bias (intercept)
- learning_rate: string, optional
- learning rate schedule
- 'constant': eta = eta0
- 'optimal': eta = 1.0 / (alpha * (t + t0))
- 'invscaling': eta = eta0 / pow(t, power_t) [default]
- power_t=0.25: defined in the parent class
- For a constant learning rate, use learning_rate='constant' and specify the rate with eta0.
- SGDRegressor.coef_: regression coefficients
- SGDRegressor.intercept_: bias
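The constant-learning-rate usage described above can be sketched as follows. The toy data is invented for illustration, and the loss parameter is left at its default (the squared loss), since its string name varies across sklearn versions ("squared_loss" in older releases, "squared_error" in newer ones):

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.preprocessing import StandardScaler

# Toy data (invented for illustration): y = 3x + 2
rng = np.random.RandomState(0)
X = rng.uniform(0.0, 10.0, size=(200, 1))
y = 3.0 * X.ravel() + 2.0

# SGD is sensitive to feature scale, so standardize first
X_scaled = StandardScaler().fit_transform(X)

# Constant learning rate: learning_rate='constant', rate given by eta0
sgd = SGDRegressor(fit_intercept=True, learning_rate="constant",
                   eta0=0.01, max_iter=1000, random_state=0)
sgd.fit(X_scaled, y)

print(sgd.coef_, sgd.intercept_)  # learned weight and bias
```

Unlike the normal equation, SGD gives an approximate solution that improves with more iterations, which is why a sensible eta0 and feature scaling matter here.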
sklearn thus provides two implementations of the same API; choose whichever suits your use case.