Machine learning | Python implements XGBoost extreme gradient boosting tree model Q&A

Questions

I have a few questions about XGBoost:

  1. In what ways can the XGBoost API be called?
  2. How should its hyperparameters be tuned?

Answers

  1. The XGBoost API can be called in two ways: the native API (xgb.train on a DMatrix), and a Scikit-learn-compatible API (XGBClassifier / XGBRegressor) that integrates seamlessly with the Sklearn ecosystem, including pipelines, cross-validation, and grid search. Both styles are sketched in the code after the parameter list below.
  2. XGBoost's default hyperparameters usually work acceptably, but to get the best results you need to tune them to your data. The following parameters matter most for XGBoost; a tuning sketch follows the API example below.

eta: learning rate (step-size shrinkage applied to each tree's contribution)
num_boost_round: number of boosting rounds, i.e. the number of trees
max_depth: maximum depth of each tree
subsample: fraction of training rows sampled for each tree
colsample_bytree: fraction of features sampled for each tree
gamma: minimum loss reduction required to make a further split
min_child_weight: minimum sum of instance (hessian) weight required in a child node
lambda: L2 regularization term on leaf weights
alpha: L1 regularization term on leaf weights
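
For question 1, here is a minimal sketch of the two calling styles. The breast cancer dataset and all parameter values are purely illustrative, not recommendations:

```python
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative data; any binary classification dataset works the same way.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# --- Native API: wrap data in a DMatrix and train with xgb.train ---
dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)
params = {"objective": "binary:logistic", "eta": 0.1, "max_depth": 4}
booster = xgb.train(params, dtrain, num_boost_round=100)
native_pred = (booster.predict(dtest) > 0.5).astype(int)  # predict returns probabilities

# --- Scikit-learn-compatible API: the familiar fit/predict interface ---
clf = xgb.XGBClassifier(n_estimators=100, learning_rate=0.1, max_depth=4)
clf.fit(X_train, y_train)
sk_pred = clf.predict(X_test)

print("native API accuracy: ", accuracy_score(y_test, native_pred))
print("sklearn API accuracy:", accuracy_score(y_test, sk_pred))
```

Note that the Scikit-learn wrapper renames some native parameters: eta becomes learning_rate and num_boost_round becomes n_estimators.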

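For question 2, here is a minimal tuning sketch over a few of the parameters above, using Sklearn's GridSearchCV together with the Scikit-learn-compatible API. The grid values are assumptions for illustration:

```python
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# Illustrative grid over a few of the parameters listed above. In the
# Scikit-learn wrapper, lambda and alpha are exposed as reg_lambda and reg_alpha.
param_grid = {
    "learning_rate": [0.05, 0.1, 0.3],  # eta in the native API
    "max_depth": [3, 5, 7],
    "subsample": [0.8, 1.0],
    "colsample_bytree": [0.8, 1.0],
}

search = GridSearchCV(
    estimator=xgb.XGBClassifier(n_estimators=200),
    param_grid=param_grid,
    scoring="accuracy",
    cv=5,
)
search.fit(X, y)

print("best params:     ", search.best_params_)
print("best CV accuracy:", search.best_score_)
```

If you prefer to stay with the native API, xgb.cv performs cross-validation directly on a DMatrix and can serve the same purpose.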

Source: blog.csdn.net/kjm13182345320/article/details/132592381