XGBoost: getting the best iteration's model from a single training run, rather than the model from the last iteration

[240]   train-logloss:0.263565  valid-logloss:0.392514
[250]   train-logloss:0.261231  valid-logloss:0.392377
[260]   train-logloss:0.257999  valid-logloss:0.392149
[270]   train-logloss:0.254814  valid-logloss:0.39179
[280]   train-logloss:0.251346  valid-logloss:0.39179
[290]   train-logloss:0.248382  valid-logloss:0.391635
[300]   train-logloss:0.245682  valid-logloss:0.392021
[310]   train-logloss:0.243229  valid-logloss:0.392104
[320]   train-logloss:0.241036  valid-logloss:0.392591
Stopping. Best iteration:
[292]   train-logloss:0.247746  valid-logloss:0.391567
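For context, here is a minimal sketch of a training call that produces a log like the one above. The data names (X_train, y_train, X_valid, y_valid) and all parameter values are placeholder assumptions for illustration, not the original author's setup.

import xgboost as xgb

# Assumed training/validation data as NumPy arrays or pandas DataFrames.
dtrain = xgb.DMatrix(X_train, label=y_train)
dvalid = xgb.DMatrix(X_valid, label=y_valid)

params = {
    "objective": "binary:logistic",
    "eval_metric": "logloss",
    "eta": 0.1,
    "max_depth": 6,
}

# With early_stopping_rounds set, training stops once valid-logloss has not
# improved for 30 consecutive rounds; verbose_eval=10 prints every 10 rounds,
# which matches the spacing of the log above.
bst = xgb.train(
    params,
    dtrain,
    num_boost_round=1000,
    evals=[(dtrain, "train"), (dvalid, "valid")],
    early_stopping_rounds=30,
    verbose_eval=10,
)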

If the early_stopping_rounds parameter was set during training, you can predict using only the trees up to the best iteration:

ypred = bst.predict(dtest, ntree_limit=bst.best_ntree_limit)

Here bst is the trained model (an xgboost.Booster).
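When early stopping actually triggers, the returned Booster also records which round was best, per the linked docs. A quick way to inspect it (example values taken from the log above):

# These attributes are set on bst only when early stopping triggered:
print(bst.best_iteration)    # index of the best round, e.g. 292
print(bst.best_score)        # metric at that round, e.g. valid-logloss 0.391567
print(bst.best_ntree_limit)  # value to pass as ntree_limit in the predict call above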
Reference:
https://xgboost.readthedocs.io/en/latest/python/python_intro.html


Reposted from blog.csdn.net/guotong1988/article/details/80611864