How to use the XGBoost Classifier and Regressor in Python?

This recipe helps you use the XGBoost Classifier and Regressor in Python.

Recipe Objective

Have you ever tried to use XGBoost models, i.e. a regressor or a classifier? In this recipe we will use both, each on a different dataset.

So this recipe is a short example of how we can use the XGBoost Classifier and Regressor in Python.


Step 1 - Import the library

from sklearn import datasets
from sklearn import metrics
from sklearn.model_selection import train_test_split
import matplotlib.pyplot as plt
import seaborn as sns
plt.style.use("ggplot")
import xgboost as xgb

Here we have imported various modules like datasets, xgb and train_test_split from different libraries. We will understand the use of each of these while using them in the code snippets below.
For now, just have a look at these imports.
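If xgboost is not already installed, it can be added with pip and the import verified before running the rest of the recipe (a minimal check, assuming a standard pip-based environment):

# install once from the command line: pip install xgboost
import xgboost as xgb
print(xgb.__version__)  # prints the installed XGBoost version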

Step 2 - Set up the Data for the Classifier

Here we have used datasets to load the inbuilt wine dataset, and we have created objects X and y to store the data and the target values respectively.

dataset = datasets.load_wine()
X = dataset.data; y = dataset.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
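If you want the split (and therefore the scores below) to be reproducible, you can fix the random seed and stratify by the class labels; a minimal sketch using the same scikit-learn call:

# random_state makes the split reproducible; stratify=y keeps the class
# proportions of the wine dataset the same in the train and test sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)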

Step 3 - Model and its Score

Here, we are using XGBClassifier as a Machine Learning model to fit the data.

model = xgb.XGBClassifier()
model.fit(X_train, y_train)
print(); print(model)

Now we have predicted the output by passing X_test and also stored the real target in expected_y.

expected_y = y_test
predicted_y = model.predict(X_test)

Here we have printed the classification report and confusion matrix for the classifier.

print(metrics.classification_report(expected_y, predicted_y))
print(metrics.confusion_matrix(expected_y, predicted_y))
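If you want a quick estimate of how the classifier generalises beyond this single split, scikit-learn's cross-validation utilities work directly with XGBClassifier; a minimal sketch (the choice of 5 folds is arbitrary):

from sklearn.model_selection import cross_val_score

# 5-fold cross-validated accuracy on the full wine dataset; each fold
# trains a fresh XGBClassifier and scores it on the held-out portion.
scores = cross_val_score(xgb.XGBClassifier(), X, y, cv=5, scoring="accuracy")
print(scores.mean(), scores.std())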

Step 4 - Set up the Data for the Regressor

Here we have used datasets to load the inbuilt boston dataset, and we have created objects X and y to store the data and the target values respectively.

dataset = datasets.load_boston()
X = dataset.data; y = dataset.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
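Note that load_boston was deprecated in scikit-learn 1.0 and removed in 1.2, so this call will fail on a recent installation. A minimal substitute, using the California housing dataset instead (its features and scores will of course differ from the Boston-based output shown at the end):

from sklearn.datasets import fetch_california_housing

# Alternative regression dataset for scikit-learn versions
# where load_boston is no longer available.
dataset = fetch_california_housing()
X = dataset.data; y = dataset.target
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)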

Step 5 - Model and its Score

Here, we are using XGBRegressor as a Machine Learning model to fit the data.

model = xgb.XGBRegressor()
model.fit(X_train, y_train)
print(); print(model)

Now we have predicted the output by passing X_test and also stored the real target in expected_y.

expected_y = y_test
predicted_y = model.predict(X_test)

Here we have printed the r2 score and mean squared log error for the regressor, and plotted the predictions against the expected values.

print(metrics.r2_score(expected_y, predicted_y))
print(metrics.mean_squared_log_error(expected_y, predicted_y))

plt.figure(figsize=(10,10))
sns.regplot(x=expected_y, y=predicted_y, fit_reg=True, scatter_kws={"s": 100})
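Beyond the error metrics, XGBoost also provides a built-in helper for inspecting which features the fitted booster relied on most; a short optional addition:

# Plot the feature importances learned by the fitted regressor.
xgb.plot_importance(model)
plt.show()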

As an output we get:

XGBClassifier(base_score=0.5, booster="gbtree", colsample_bylevel=1,
       colsample_bynode=1, colsample_bytree=1, gamma=0, learning_rate=0.1,
       max_delta_step=0, max_depth=3, min_child_weight=1, missing=None,
       n_estimators=100, n_jobs=1, nthread=None,
       objective="multi:softprob", random_state=0, reg_alpha=0,
       reg_lambda=1, scale_pos_weight=1, seed=None, silent=None,
       subsample=1, verbosity=1)

              precision    recall  f1-score   support

           0       1.00      1.00      1.00        11
           1       0.94      0.94      0.94        16
           2       0.94      0.94      0.94        18

   micro avg       0.96      0.96      0.96        45
   macro avg       0.96      0.96      0.96        45
weighted avg       0.96      0.96      0.96        45


[[11  0  0]
 [ 0 15  1]
 [ 0  1 17]]


XGBRegressor(base_score=0.5, booster="gbtree", colsample_bylevel=1,
       colsample_bynode=1, colsample_bytree=1, gamma=0,
       importance_type="gain", learning_rate=0.1, max_delta_step=0,
       max_depth=3, min_child_weight=1, missing=None, n_estimators=100,
       n_jobs=1, nthread=None, objective="reg:linear", random_state=0,
       reg_alpha=0, reg_lambda=1, scale_pos_weight=1, seed=None,
       silent=None, subsample=1, verbosity=1)

0.8359074842658845

0.02822002095090446


