AdaBoost (Adaptive Boosting) is an ensemble learning method that combines multiple weak classifiers into a single strong classifier.
It works by repeatedly adjusting the weights of the training samples: samples misclassified by earlier classifiers receive more weight, so subsequent classifiers focus on them.
In each round a new weak classifier is added, and training continues until the ensemble's error falls below a predetermined threshold or a predetermined maximum number of iterations is reached.
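To make the weight-update and stopping logic concrete, below is a minimal from-scratch sketch of AdaBoost using decision stumps as the weak classifiers. The data, function names, and the `n_rounds`/`max_error` parameters are illustrative assumptions, not part of the original text; labels are assumed to be in {-1, +1}.

```python
import numpy as np

def train_stump(X, y, w):
    """Pick the single-feature threshold split with the smallest weighted error."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best  # (weighted error, feature index, threshold, polarity)

def stump_predict(X, j, thr, sign):
    return np.where(X[:, j] <= thr, sign, -sign)

def adaboost(X, y, n_rounds=20, max_error=0.0):
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform sample weights
    ensemble = []
    for _ in range(n_rounds):
        err, j, thr, sign = train_stump(X, y, w)
        err = max(err, 1e-12)            # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, j, thr, sign)
        # Misclassified samples (y * pred = -1) get exp(+alpha), i.e. more weight.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, thr, sign))
        # Stop early once the combined classifier's training error is low enough.
        agg = np.sign(sum(a * stump_predict(X, jj, t, s)
                          for a, jj, t, s in ensemble))
        if np.mean(agg != y) <= max_error:
            break
    return ensemble

def predict(ensemble, X):
    agg = sum(a * stump_predict(X, j, thr, s) for a, j, thr, s in ensemble)
    return np.sign(agg)

# Toy usage on a 1-D problem that a single stump cannot solve exactly.
X = np.array([[0.], [1.], [2.], [3.], [4.], [5.]])
y = np.array([1, 1, -1, -1, 1, 1])
model = adaboost(X, y)
print(predict(model, X))
```

In practice a library implementation such as scikit-learn's `AdaBoostClassifier` would typically be used instead; the sketch above is only meant to show how the sample weights and the iteration/stopping criteria described here interact.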