[Repost] AdaBoost algorithm

Original: https://blog.csdn.net/v_july_v/article/details/40718799

The original article is not reproduced here; please read it at the link above. A few points are worth noting (a code sketch follows the list):

  1. Each basic classifier is trained on the sample weights produced by the update after the previous basic classifier (the first one uses the initial uniform weights). After each round, the combined prediction adds the new classifier to all the previous ones; for example, in the original article f3(x) = 0.4236 G1(x) + 0.6496 G2(x) + 0.7514 G3(x) (see the code sketch after this list).
  2. The sample weights are updated exactly once per basic classifier.
  3. Construction of basic classifiers stops once the required accuracy is reached.
  4. AdaBoost's training-error bound shows that the training error of the combined classifier drops at an exponential rate, provided each basic classifier does better than random guessing.
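Below is a minimal sketch of AdaBoost with decision stumps on the classic 10-point toy dataset (Example 8.1 in Li Hang's "Statistical Learning Methods", which appears to be the worked example the original article follows; the helper names `best_stump` and `adaboost` are illustrative, not from the original). It reproduces the coefficients 0.4236, 0.6496, 0.7514 quoted in point 1, up to rounding:

```python
import numpy as np

# Classic 10-point toy dataset (assumed here because it yields the
# coefficients 0.4236, 0.6496, 0.7514 quoted above, up to rounding).
X = np.arange(10, dtype=float)
y = np.array([1, 1, 1, -1, -1, -1, 1, 1, 1, -1])

def best_stump(X, y, w):
    """Return (weighted error, threshold, polarity) of the best decision stump."""
    best = None
    for thresh in np.arange(X.min() - 0.5, X.max() + 1.0):
        for polarity in (1, -1):
            pred = np.where(X < thresh, polarity, -polarity)
            err = w[pred != y].sum()
            if best is None or err < best[0]:
                best = (err, thresh, polarity)
    return best

def adaboost(X, y, n_rounds=3):
    n = len(X)
    w = np.full(n, 1.0 / n)   # initial weights: uniform (point 1)
    F = np.zeros(n)           # running combined score f_m(x)
    for m in range(1, n_rounds + 1):
        err, thresh, polarity = best_stump(X, y, w)
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        pred = np.where(X < thresh, polarity, -polarity)
        # Point 2: the weights are updated exactly once per basic classifier;
        # dividing by the sum is the normalization by Z_m.
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        # Point 1: the combined prediction stacks the new classifier on top
        # of all the previous ones: f_m(x) = sum_k alpha_k G_k(x).
        F += alpha * pred
        print(f"round {m}: e_m = {err:.4f}, alpha_m = {alpha:.4f}, "
              f"training errors of sign(f_m) = {int(np.sum(np.sign(F) != y))}")

adaboost(X, y)
```

Running it prints e_m = 0.3000, 0.2143, 0.1818 and alpha_m = 0.4236, 0.6496, 0.7520 (the book rounds the last to 0.7514), and the training error of sign(f3) drops to zero, which illustrates points 3 and 4.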

Note that one formula in the original article is wrong: when deriving AdaBoost's training-error bound, there should be a summation over the samples in front of the w term; compare with the standard form of the derivation, sketched below.
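As a sketch of what this correction presumably refers to (the standard proof of the bound, e.g. Theorem 8.1 in Li Hang's "Statistical Learning Methods"; the notation w_{mi}, Z_m, alpha_m, G_m is taken from that proof, not from the original article):

```latex
\begin{align*}
\frac{1}{N} \sum_{i=1}^{N} \mathbf{1}\bigl(G(x_i) \neq y_i\bigr)
  &\le \frac{1}{N} \sum_{i=1}^{N} \exp\bigl(-y_i f(x_i)\bigr)
   =   \sum_{i=1}^{N} w_{1i} \prod_{m=1}^{M} \exp\bigl(-\alpha_m y_i G_m(x_i)\bigr) \\
% Peel off one round at a time via  Z_m w_{m+1,i} = w_{mi} \exp(-\alpha_m y_i G_m(x_i));
% the \sum_i in front of the w term is exactly what the correction above is about:
  &= Z_1 \sum_{i=1}^{N} w_{2i} \prod_{m=2}^{M} \exp\bigl(-\alpha_m y_i G_m(x_i)\bigr)
   = \cdots = \prod_{m=1}^{M} Z_m
\end{align*}
```

The last step uses the fact that the normalized weights of each round sum to one, i.e. \sum_i w_{M+1,i} = 1.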

Origin: www.cnblogs.com/jiading/p/11965651.html