Data Mining: Boosting Algorithms

Copyright notice: this is the author's original article; do not reproduce without permission. https://blog.csdn.net/qq_20095389/article/details/88119254

Regression trees
https://blog.csdn.net/weixin_36586536/article/details/80468426
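The linked post covers regression trees in detail. As a rough illustration of the core idea only (this sketch is mine, not from the post), a minimal CART-style regression tree on one feature greedily picks the split that most reduces squared error, and each leaf predicts the mean of its targets:

```python
def fit_tree(X, y, depth=2, min_leaf=1):
    """Minimal CART-style regression tree sketch (single 1D feature).
    Splits greedily by squared-error reduction; leaves predict the mean."""
    mean = sum(y) / len(y)
    if depth == 0 or len(set(X)) == 1:
        return ('leaf', mean)
    best = None
    for t in sorted(set(X))[1:]:              # candidate thresholds
        L = [(x, yi) for x, yi in zip(X, y) if x < t]
        R = [(x, yi) for x, yi in zip(X, y) if x >= t]
        if len(L) < min_leaf or len(R) < min_leaf:
            continue
        lm = sum(yi for _, yi in L) / len(L)
        rm = sum(yi for _, yi in R) / len(R)
        sse = (sum((yi - lm) ** 2 for _, yi in L)
               + sum((yi - rm) ** 2 for _, yi in R))
        if best is None or sse < best[0]:
            best = (sse, t, L, R)
    if best is None:
        return ('leaf', mean)
    _, t, L, R = best
    left = fit_tree([x for x, _ in L], [yi for _, yi in L], depth - 1, min_leaf)
    right = fit_tree([x for x, _ in R], [yi for _, yi in R], depth - 1, min_leaf)
    return ('split', t, left, right)

def tree_predict(node, x):
    """Walk the tree down to a leaf and return its mean."""
    if node[0] == 'leaf':
        return node[1]
    _, t, left, right = node
    return tree_predict(left if x < t else right, x)
```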

1. AdaBoost

Weak classifier training process: https://www.cnblogs.com/chenpi/p/5128235.html
AdaBoost principles:
https://www.cnblogs.com/pinard/p/6133937.html?utm_source=tuicool&utm_medium=referral
https://blog.csdn.net/weixin_38629654/article/details/80516045
https://blog.csdn.net/starter_____/article/details/79328749
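The posts above walk through the full derivation. As a toy sketch of the mechanics (my own illustration, assuming 1D data with labels in {-1, +1} and threshold stumps as weak classifiers): each round picks the stump with the lowest weighted error, gives it a weight alpha, and re-weights samples so that misclassified points count more next round:

```python
import math

def train_adaboost(X, y, n_rounds=5):
    """Minimal AdaBoost sketch: 1D features, labels in {-1,+1},
    threshold stumps as weak classifiers."""
    n = len(X)
    w = [1.0 / n] * n                          # uniform initial sample weights
    ensemble = []
    for _ in range(n_rounds):
        best = None
        # exhaustively pick the stump (threshold, polarity) with lowest weighted error
        for t in sorted(set(X)):
            for pol in (1, -1):
                preds = [pol if x < t else -pol for x in X]
                err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol, preds)
        err, t, pol, preds = best
        err = max(err, 1e-10)                  # avoid division by zero
        alpha = 0.5 * math.log((1 - err) / err)  # weight of this weak classifier
        ensemble.append((alpha, t, pol))
        # re-weight samples: mistakes get heavier, then normalize
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the alpha-weighted vote of all stumps."""
    s = sum(a * (pol if x < t else -pol) for a, t, pol in ensemble)
    return 1 if s >= 0 else -1
```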

2. GBDT

The key idea is to fit the residuals under a squared-error loss (the residual is exactly the negative gradient of that loss), so the weak learners are effectively regression trees.
https://blog.csdn.net/blank_tj/article/details/82262431
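That residual-fitting loop can be sketched in a few lines (my own illustration, using depth-1 regression stumps as the weak learners and a made-up learning rate `lr`): start from the mean, then repeatedly fit a stump to the current residuals and add it, shrunk by `lr`, to the ensemble:

```python
def fit_stump(X, y):
    """Depth-1 regression tree (stump) minimizing squared error on 1D data."""
    best = None
    for t in sorted(set(X))[1:]:
        left = [yi for x, yi in zip(X, y) if x < t]
        right = [yi for x, yi in zip(X, y) if x >= t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((yi - lm) ** 2 for x, yi in zip(X, y) if x < t)
               + sum((yi - rm) ** 2 for x, yi in zip(X, y) if x >= t))
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x < t else rm

def gbdt_fit(X, y, n_rounds=10, lr=0.5):
    """Gradient boosting with squared loss: each stump fits the residuals,
    which are the negative gradients of 0.5*(pred - y)**2."""
    f0 = sum(y) / len(y)                       # initial constant prediction
    stumps = []
    preds = [f0] * len(X)
    for _ in range(n_rounds):
        resid = [yi - p for yi, p in zip(y, preds)]   # current residuals
        stump = fit_stump(X, resid)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, X)]
    return lambda x: f0 + lr * sum(s(x) for s in stumps)
```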

3. XGBoost

An improved version of GBDT: when fitting each weak learner, the loss function = the squared error plus a regularization term (an L2 penalty on the squared values of the leaf nodes).
https://www.cnblogs.com/jiangxinyang/p/9248154.html
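One concrete consequence of that regularized objective (standard in the XGBoost derivation, though the code below is my own sketch): for a leaf with gradient sum G, Hessian sum H, and L2 penalty lambda on leaf values, the optimal leaf weight is w* = -G / (H + lambda). With squared loss and lambda = 0 this reduces to the leaf mean; a positive lambda shrinks it toward zero:

```python
def optimal_leaf_weight(grads, hess, lam):
    """XGBoost-style optimal leaf value: w* = -G / (H + lambda),
    where G and H are the sums of first and second derivatives of the loss."""
    G, H = sum(grads), sum(hess)
    return -G / (H + lam)

# For squared loss 0.5*(pred - y)**2 at pred = 0: g_i = -y_i, h_i = 1.
y = [1.0, 2.0, 3.0]
grads = [-yi for yi in y]
hess = [1.0] * len(y)
w_plain = optimal_leaf_weight(grads, hess, lam=0.0)   # leaf mean: 2.0
w_shrunk = optimal_leaf_weight(grads, hess, lam=1.0)  # shrunk toward 0: 1.5
```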

4. LightGBM

Uses histogram-based binning of feature values to speed up split finding.
https://blog.csdn.net/qq_24519677/article/details/82811215
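The binning idea, in rough outline (my own sketch; real LightGBM builds quantile-based histograms with up to 255 bins, not the equal-width bins shown here): bucket each feature's values into a small number of bins and accumulate per-bin statistics, so splits are searched over bin boundaries instead of over every distinct value:

```python
def build_histogram(values, grads, n_bins=4):
    """Histogram sketch: bucket feature values into equal-width bins and
    accumulate [count, gradient sum] per bin; candidate splits are then
    just the bin boundaries."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0          # guard against hi == lo
    hist = [[0, 0.0] for _ in range(n_bins)]
    for v, g in zip(values, grads):
        b = min(int((v - lo) / width), n_bins - 1)
        hist[b][0] += 1                        # sample count in this bin
        hist[b][1] += g                        # gradient sum in this bin
    return hist
```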
