ID3 decision tree method

The ID3 decision tree mainly relies on Shannon entropy. Entropy measures the degree of disorder in a data set: the higher the entropy, the more mixed (less pure) the data.

The entropy is calculated as H = -Σ p(x_i) log2 p(x_i), where p(x_i) denotes the probability of outcome x_i occurring.
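As a minimal sketch of this formula (the function name shannon_entropy and the assumption that the class labels come as a plain Python list are mine, not from the original post):

```python
from collections import Counter
from math import log2

def shannon_entropy(labels):
    """H = -sum(p(x_i) * log2(p(x_i))) over the distinct values in labels."""
    total = len(labels)
    counts = Counter(labels)
    # Each count / total is the empirical probability p(x_i) of that value.
    return -sum((c / total) * log2(c / total) for c in counts.values())
```

For example, shannon_entropy(["yes", "yes", "no"]) is about 0.918, while a perfectly pure list such as ["yes", "yes", "yes"] has entropy 0.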

The feature-selection process works as follows. First, compute the entropy of the original data set. Then, for each candidate feature, split the data set on that feature and compute the weighted entropy that remains after the split. The information gain of the feature is the original entropy minus this remaining entropy; the feature that yields the maximum gain is chosen as the split criterion, as in the sketch below.
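A rough sketch of that selection step, assuming the data set is a list of feature rows plus a parallel list of class labels and reusing the shannon_entropy helper above (the names split_by_feature, information_gain, and best_feature are mine):

```python
def split_by_feature(rows, labels, feature_index):
    """Group the class labels by the value this feature takes in each row."""
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[feature_index], []).append(label)
    return groups

def information_gain(rows, labels, feature_index):
    """Original entropy minus the weighted entropy left after splitting on the feature."""
    base = shannon_entropy(labels)
    total = len(labels)
    remaining = sum(
        (len(subset) / total) * shannon_entropy(subset)
        for subset in split_by_feature(rows, labels, feature_index).values()
    )
    return base - remaining

def best_feature(rows, labels):
    """Return the index of the feature with the maximum information gain."""
    return max(range(len(rows[0])), key=lambda i: information_gain(rows, labels, i))
```

ID3 applies this selection recursively: split on the chosen feature, remove it from the candidates, and repeat on each subset until the labels are pure or no features remain.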


Origin www.cnblogs.com/lalalatianlalu/p/11321684.html