Logistic regression: key knowledge points
- Know the loss function of logistic regression
- Know how logistic regression is optimized
- Know the sigmoid function
- Know the application scenarios of logistic regression
- Apply LogisticRegression to make predictions with logistic regression
- Know the difference between the precision and recall metrics
- Know how to evaluate models on imbalanced samples
- Understand the ROC curve and the meaning of the AUC metric
- Apply classification_report to compute precision and recall
- Apply roc_auc_score to compute the AUC metric
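The sigmoid function and the log loss listed above can be sketched in a few lines of NumPy (a minimal sketch; the helper names `sigmoid` and `log_loss` are my own, not part of any library API):

```python
import numpy as np

def sigmoid(z):
    """Squash any real value into (0, 1): sigma(z) = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(y_true, y_prob, eps=1e-15):
    """Binary cross-entropy: the loss that logistic regression minimizes."""
    y_prob = np.clip(y_prob, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_prob) + (1.0 - y_true) * np.log(1.0 - y_prob))

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))   # sigmoid(0) is exactly 0.5
```

The loss is low when predicted probabilities agree with the true labels, and grows quickly as confident predictions turn out wrong.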
Logistic regression API
- sklearn.linear_model.LogisticRegression(solver='liblinear', penalty='l2', C=1.0)
- solver: optional values {'liblinear', 'sag', 'saga', 'newton-cg', 'lbfgs'}
- Default 'liblinear'; this is the algorithm used to solve the optimization problem.
- For small datasets, 'liblinear' is a good choice; 'sag' and 'saga' are faster for large datasets.
- For multiclass problems, only 'newton-cg', 'sag', 'saga', and 'lbfgs' can handle the multinomial loss; 'liblinear' is limited to one-versus-rest schemes.
- penalty: the type of regularization term (also called the penalty term), used to penalize overly complex models and prevent overfitting
- C: controls the regularization strength, in the same spirit as alpha in regularized linear regression; note that C is the inverse of the regularization strength, so a smaller C means stronger regularization
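A minimal end-to-end sketch of the API above, also covering the classification_report and roc_auc_score objectives. The synthetic data from make_classification is an assumption; any binary classification dataset works the same way:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (a stand-in for a real dataset)
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 'liblinear' suits small datasets; C=1.0 is the inverse regularization strength
clf = LogisticRegression(solver="liblinear", penalty="l2", C=1.0)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
y_score = clf.predict_proba(X_test)[:, 1]   # probability of the positive class

print(classification_report(y_test, y_pred))   # precision / recall per class
print("AUC:", roc_auc_score(y_test, y_score))
```

roc_auc_score expects the positive-class probability (or decision score), not the hard predicted labels.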
By default, the class with fewer samples is treated as the positive class.
LogisticRegression is equivalent to SGDClassifier(loss="log", penalty="l2"): SGDClassifier implements ordinary stochastic gradient descent learning, while LogisticRegression (with solver='sag') implements stochastic average gradient descent (SAG).
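This equivalence can be sketched directly. One caveat: the log-loss name in SGDClassifier changed from "log" to "log_loss" in scikit-learn 1.1, so the sketch below tries both (the synthetic data is again an assumption):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=300, n_features=5, random_state=0)

# With a logistic loss, SGDClassifier is logistic regression trained by
# plain stochastic gradient descent.
try:
    sgd = SGDClassifier(loss="log_loss", penalty="l2", max_iter=1000, random_state=0)
    sgd.fit(X, y)
except ValueError:
    # Older scikit-learn (< 1.1) uses the name "log" for the same loss
    sgd = SGDClassifier(loss="log", penalty="l2", max_iter=1000, random_state=0)
    sgd.fit(X, y)

print(sgd.score(X, y))
```

Because the loss is logistic, the fitted model supports predict_proba just like LogisticRegression; the two differ mainly in the optimization algorithm used.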