GBDT study notes


GBDT (Gradient Boosting Decision Tree, Friedman, 1999) has been widely used in many fields since it was proposed. As the name suggests, the algorithm mainly combines three kinds of ideas: gradients (Gradient), ensembling by boosting (Boosting), and decision trees (Decision Tree).

The algorithm was proposed in the paper "Greedy Function Approximation: A Gradient Boosting Machine". It can be viewed as an ensemble model, and also as a model based on Gradient Boosting. The underlying base learner is CART (GBDT mainly uses regression trees), and the algorithm performs gradient descent in function space. Besides the strengths of tree models, such as strong interpretability, efficient handling of mixed-type features, scale invariance (no need to standardize the data), and robustness to missing values, it also offers strong predictive power and good stability. Compared with its derived algorithms XGBoost / LightGBM, GBDT only requires the loss function to be first-order differentiable, and both convex and non-convex losses are applicable; XGBoost / LightGBM place stricter requirements on the loss function, which must be second-order differentiable and strictly convex.
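To make the idea of gradient descent in function space concrete, below is a minimal sketch of a GBDT regressor: each round fits a CART regression tree to the negative gradient of the loss (for squared loss this is simply the residual) and adds it to the current additive model. It assumes scikit-learn's DecisionTreeRegressor as the base learner; the learning rate, tree depth, and toy data are illustrative choices, not values from the article.

```python
# Minimal gradient boosting sketch with CART regression trees as base learners.
# Squared loss is used, so the negative gradient equals the ordinary residual;
# only a first-order derivative of the loss is ever needed.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gbdt_fit(X, y, n_trees=100, learning_rate=0.1, max_depth=3):
    """Fit an additive model F_M(x) = F_0 + lr * sum_m h_m(x)."""
    f0 = y.mean()                          # initial constant model F_0
    pred = np.full_like(y, f0, dtype=float)
    trees = []
    for _ in range(n_trees):
        residual = y - pred                # negative gradient of 1/2*(y - F)^2 w.r.t. F
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residual)              # each tree fits the current negative gradient
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return f0, trees

def gbdt_predict(X, f0, trees, learning_rate=0.1):
    pred = np.full(X.shape[0], f0, dtype=float)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred

# Toy usage: fit y = sin(x) + noise and report training MSE.
rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
f0, trees = gbdt_fit(X, y)
print(np.mean((gbdt_predict(X, f0, trees) - y) ** 2))
```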

https://www.sohu.com/a/227107019_505779

https://www.cnblogs.com/pinard/p/6140514.html
