How to visualise XGBoost feature importance in Python?

This recipe shows how to visualise XGBoost feature importance in Python: it trains an XGBClassifier on scikit-learn's wine dataset, reports the fit, and then plots the model's feature importances as a vertical bar chart, a horizontal bar chart, and with xgboost's built-in plot_importance helper.
## How to visualise XGBoost feature importance in Python
## DataSet: sklearn.datasets.load_wine()
def Snippet_187():
    print()
    print(format('How to visualise XGBoost feature importance in Python','*^82'))
    import warnings
    warnings.filterwarnings("ignore")

    # load libraries
    from sklearn import datasets
    from sklearn import metrics
    from sklearn.model_selection import train_test_split
    from xgboost import XGBClassifier, plot_importance
    import matplotlib.pyplot as plt

    # load the wine dataset
    dataset = datasets.load_wine()
    X = dataset.data; y = dataset.target
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)

    # fit an XGBClassifier model to the data
    model = XGBClassifier()
    model.fit(X_train, y_train)
    print(); print(model)

    # make predictions
    expected_y  = y_test
    predicted_y = model.predict(X_test)

    # summarize the fit of the model
    print(); print('XGBClassifier: ')
    print(); print(metrics.classification_report(expected_y, predicted_y,
                   target_names=dataset.target_names))
    print(); print(metrics.confusion_matrix(expected_y, predicted_y))

    # plot the feature importances with matplotlib:
    # vertical bars, then the same values as horizontal bars
    plt.bar(range(len(model.feature_importances_)), model.feature_importances_)
    plt.show()
    plt.barh(range(len(model.feature_importances_)), model.feature_importances_)
    plt.show()

    # plot the feature importances with xgboost's built-in helper
    plot_importance(model)
    plt.show()

Snippet_187()
**************How to visualise XGBoost feature importance in Python***************

XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
       colsample_bytree=1, gamma=0, learning_rate=0.1, max_delta_step=0,
       max_depth=3, min_child_weight=1, missing=None, n_estimators=100,
       n_jobs=1, nthread=None, objective='multi:softprob', random_state=0,
       reg_alpha=0, reg_lambda=1, scale_pos_weight=1, seed=None,
       silent=True, subsample=1)

XGBClassifier:

              precision    recall  f1-score   support

     class_0       1.00      1.00      1.00        14
     class_1       1.00      1.00      1.00        16
     class_2       1.00      1.00      1.00        15

   micro avg       1.00      1.00      1.00        45
   macro avg       1.00      1.00      1.00        45
weighted avg       1.00      1.00      1.00        45


[[14  0  0]
 [ 0 16  0]
 [ 0  0 15]]
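
The bar charts above label features only by their column index. A more readable variant, shown below as a minimal self-contained sketch (it refits a default XGBClassifier on the full wine dataset purely for illustration), pairs each importance score with its name from dataset.feature_names and sorts before plotting:

import numpy as np
import matplotlib.pyplot as plt
from sklearn import datasets
from xgboost import XGBClassifier

# refit on the full wine dataset just to obtain importances to plot
dataset = datasets.load_wine()
model = XGBClassifier()
model.fit(dataset.data, dataset.target)

# pair each importance score with its feature name and sort ascending
importances = model.feature_importances_
order = np.argsort(importances)
names = [dataset.feature_names[i] for i in order]

plt.barh(names, importances[order])
plt.xlabel('importance')
plt.title('XGBoost feature importances (wine dataset)')
plt.tight_layout()
plt.show()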


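Note that plot_importance ranks features by how often they are used to split by default (importance_type='weight'). It also accepts 'gain' and 'cover', and which ranking is most informative depends on the model. A small sketch, reusing the fitted model from the block above:

from xgboost import plot_importance
import matplotlib.pyplot as plt

# rank features by the average gain of the splits that use them,
# instead of the default split count ('weight')
plot_importance(model, importance_type='gain', title='Feature importance (gain)')
plt.show()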