[ZJU-Machine Learning] Feature Selection - Adaptive Boosting Algorithm Adaboost

Heuristics

Incremental method:

First select a single feature from x, build a classifier on it, and measure its accuracy. Then add another feature from the remaining features of x, retrain, and measure the accuracy again; keep adding features until the accuracy starts to decrease.

Decreasing method:

The same idea in reverse: start with all features and remove them one at a time until the accuracy starts to drop. (A sketch of both procedures follows below.)
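
A minimal sketch of the two wrapper procedures, assuming scikit-learn's `LogisticRegression` and `cross_val_score` as the classifier and accuracy estimate (any classifier and scoring scheme would do; the function names here are illustrative, not from the original lecture):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def forward_selection(X, y):
    """Incremental method: add one feature at a time while accuracy improves."""
    remaining = list(range(X.shape[1]))
    selected, best_acc = [], 0.0
    while remaining:
        # Try each remaining feature and keep the one that helps most.
        scores = [(cross_val_score(LogisticRegression(max_iter=1000),
                                   X[:, selected + [f]], y, cv=5).mean(), f)
                  for f in remaining]
        acc, best_f = max(scores)
        if acc <= best_acc:          # accuracy begins to decrease -> stop
            break
        best_acc = acc
        selected.append(best_f)
        remaining.remove(best_f)
    return selected

def backward_elimination(X, y):
    """Decreasing method: start from all features, drop one at a time."""
    selected = list(range(X.shape[1]))
    best_acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
    while len(selected) > 1:
        scores = [(cross_val_score(LogisticRegression(max_iter=1000),
                                   X[:, [f for f in selected if f != drop]], y, cv=5).mean(), drop)
                  for drop in selected]
        acc, worst_f = max(scores)
        if acc < best_acc:           # removing any feature hurts -> stop
            break
        best_acc = acc
        selected.remove(worst_f)
    return selected
```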

Genetic algorithm

Simulated annealing algorithm

In fact, a neural network can replace the methods above: as training continues, the weights w associated with uninformative features keep shrinking, which implicitly performs feature selection.
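
This shrinking-weight effect can be illustrated with a small L1-penalized linear model standing in for a regularized network; the data setup below is purely illustrative, not from the original lecture:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_informative = rng.normal(size=(200, 2))   # two features that actually drive the label
X_noise = rng.normal(size=(200, 3))         # three irrelevant features
y = (X_informative[:, 0] + X_informative[:, 1] > 0).astype(int)
X = np.hstack([X_informative, X_noise])

# The L1 penalty plays the role of "training pushes unimportant weights down".
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print(clf.coef_)   # weights for the three noise features end up at or near zero
```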

Adaboost

The main idea: with as few features as possible (perhaps only one), separate the data set as well as possible, forming a weak classifier. Samples misclassified by this weak classifier are trained on more intensively in the next round (sampled more often), while correctly classified samples are sampled less. On this re-weighted sample, one or a few new features are chosen to separate the data as well as possible. Repeating this loop produces many weak classifiers and the features they use; finally, the weak classifiers are combined, each weighted according to its accuracy.
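
A minimal from-scratch sketch of this loop, assuming binary labels in {-1, +1} and one-feature threshold stumps as the weak classifiers; the stump search, the weight update, and the final weighted vote follow the standard AdaBoost recipe, and the function names are illustrative:

```python
import numpy as np

def train_stump(X, y, w):
    """Weak classifier: one feature plus a threshold, chosen to
    minimize the weighted error under the current sample weights w."""
    n, d = X.shape
    best = (None, None, 1, np.inf)            # (feature, threshold, polarity, error)
    for f in range(d):
        for thr in np.unique(X[:, f]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, f] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (f, thr, polarity, err)
    return best

def adaboost(X, y, rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                   # start from uniform weights
    stumps, alphas = [], []
    for _ in range(rounds):
        f, thr, pol, err = train_stump(X, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err) # classifier weight from its accuracy
        pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
        w = w * np.exp(-alpha * y * pred)     # emphasize misclassified samples
        w /= w.sum()
        stumps.append((f, thr, pol))
        alphas.append(alpha)
    return stumps, alphas

def predict(X, stumps, alphas):
    score = sum(a * np.where(p * (X[:, f] - t) >= 0, 1, -1)
                for (f, t, p), a in zip(stumps, alphas))
    return np.sign(score)                     # final weighted vote of the weak classifiers
```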

The probability of drawing each sample is determined by the performance of the previous weak classifiers: sampling is uniform at the start, and the probability of each sample being drawn is then adjusted round by round according to the earlier results.
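
In the standard AdaBoost formulation this adjustment is the weight-update rule, where $D_m(i)$ is the weight (sampling probability) of sample $i$ in round $m$, $G_m$ is the round-$m$ weak classifier with weighted error $\varepsilon_m$, and $\alpha_m$ is its coefficient:

```latex
D_{m+1}(i) = \frac{D_m(i)\,\exp\!\big(-\alpha_m\, y_i\, G_m(x_i)\big)}{Z_m},
\qquad
\alpha_m = \frac{1}{2}\ln\frac{1-\varepsilon_m}{\varepsilon_m},
\qquad
Z_m = \sum_{i=1}^{N} D_m(i)\,\exp\!\big(-\alpha_m\, y_i\, G_m(x_i)\big)
```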


Theorem

Here $I(\cdot)$ denotes the indicator function: it equals 1 when its two arguments are the same and 0 when they differ.
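
The images that followed are not recoverable; the theorem stated here is most likely the standard AdaBoost training-error bound, which, with $f(x)=\sum_{m=1}^{M}\alpha_m G_m(x)$ and the final classifier $G(x)=\operatorname{sign} f(x)$, reads:

```latex
\frac{1}{N}\sum_{i=1}^{N} I\big(G(x_i) \neq y_i\big)
\;\le\;
\frac{1}{N}\sum_{i=1}^{N} \exp\!\big(-y_i f(x_i)\big)
\;=\;
\prod_{m=1}^{M} Z_m
```

For binary classification $Z_m = 2\sqrt{\varepsilon_m(1-\varepsilon_m)} < 1$ whenever the weak classifier does better than random guessing, so the training error decreases exponentially with the number of rounds.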



Origin blog.csdn.net/qq_45654306/article/details/113533671