Python machine learning: plot_importance() to view feature importance

The plot_importance() function in the lightgbm library lets you visualize feature importance. The following example uses the lightgbm algorithm; the implementation for the xgboost algorithm is almost identical.

Prepare the model in advance:

import lightgbm as lgb

# X_train and y_train are assumed to have been prepared beforehand
model_lgb = lgb.LGBMClassifier().fit(X_train, y_train)

With the model trained, let's check the feature importance:

from lightgbm import plot_importance
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(10, 8))
plot_importance(model_lgb, max_num_features=20, ax=ax)
plt.show()

Code explanation:

from lightgbm import plot_importance imports the function used to plot feature importance in the lightgbm library;

plt.subplots(figsize=(10, 8)) creates a figure 10 inches wide and 8 inches tall;

model_lgb in plot_importance() is the model variable we fitted above, which holds the trained lightgbm model; max_num_features=20 displays only the top 20 features.


Result: a horizontal bar chart of the top 20 features, sorted by importance.

To display all features, simply omit max_num_features:

fig, ax = plt.subplots(figsize=(10, 8))
plot_importance(model_lgb, ax=ax)
plt.show()

The features in the chart are sorted from most to least important, so you can see at a glance which features have the greatest influence on the predictions.

Origin blog.csdn.net/Sukey666666/article/details/128910157