[Repost] AdaBoost algorithm
Original: https://blog.csdn.net/v_july_v/article/details/40718799
The full post is not reproduced here; see the original. A few points worth noting:
- The weights used to train a basic classifier are the sample weights as updated after the previous basic classifier. And after each round, the combined prediction is the weighted sum of the current classifier and all preceding ones; for example, the original gives
f3(x) = 0.4236·G1(x) + 0.6496·G2(x) + 0.7514·G3(x)
- Each basic classifier's coefficient (its weight in the final combination) is computed only once and is never revised in later rounds
- Construction of new basic classifiers stops once the required accuracy is reached
- The upper bound on AdaBoost's training error shows that the error of the combined classifier decreases exponentially as basic classifiers are added
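The points above can be illustrated with a minimal AdaBoost sketch: a NumPy implementation with decision-stump base classifiers, assuming the ten-point example data (x = 0..9) that the original post walks through (the textbook example behind the f3(x) coefficients quoted above). Run on that data for three rounds, the learned coefficients match f3(x) up to rounding: the third comes out 0.7520 rather than the rounded 0.7514.

```python
import numpy as np

def adaboost(X, y, n_rounds=10):
    """AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)               # initial sample weights w_1i = 1/N
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # Pick the stump (feature j, threshold thr, polarity sign) with the
        # minimum weighted error under the *current* weights w.
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thr, sign)
        j, thr, sign = best
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        # The classifier's coefficient alpha is computed once and never revised.
        alpha = 0.5 * np.log((1.0 - best_err) / max(best_err, 1e-12))
        stumps.append(best)
        alphas.append(alpha)
        # Sample weights are updated once, for the next base classifier only.
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()                      # divide by the normalizer Z_m
        if best_err == 0:                 # stop once the required accuracy is met
            break
    return stumps, alphas

def predict(stumps, alphas, X):
    # f(x) = sum_m alpha_m * G_m(x); the final label is its sign.
    f = np.zeros(X.shape[0])
    for (j, thr, sign), alpha in zip(stumps, alphas):
        f += alpha * sign * np.where(X[:, j] <= thr, 1, -1)
    return np.sign(f)
```

Thresholds are taken at the sample values themselves with `<=`, which on this data is equivalent to the midpoint thresholds (2.5, 8.5, 5.5) used in the worked example.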
Note that one formula in the original is wrong: in the derivation of AdaBoost's error upper bound, the w term should be preceded by a summation sign (a sum over the training samples); compare with the standard form of the derivation.
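For reference, assuming the post follows the standard derivation of the training-error bound (with w_mi the sample weights and Z_m the normalizers), the step in question with the summation in place reads:

```latex
\frac{1}{N}\sum_{i=1}^{N}\exp\bigl(-y_i f(x_i)\bigr)
  = \sum_{i=1}^{N} w_{1i}\prod_{m=1}^{M}\exp\bigl(-\alpha_m y_i G_m(x_i)\bigr)
  = \prod_{m=1}^{M} Z_m,
\quad\text{where}\quad
Z_m = \sum_{i=1}^{N}\, w_{mi}\exp\bigl(-\alpha_m y_i G_m(x_i)\bigr).
```

Without the sum over i in front of w_mi, the expression for Z_m would be a single term rather than the normalizing constant of the weight update, and the telescoping to the product of the Z_m would not go through.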