Explain save_model, load_model, and automl functions in PyCaret

In this recipe, we will see what the save_model, load_model, and automl functions are in PyCaret's classification module and how to use them.

Recipe Objective - What are the save_model, load_model, and automl functions in the classification module of PyCaret?

PyCaret provides the save_model, load_model, and automl functions in its classification module.


save_model function with Example:

PyCaret provides "pycaret.classification.save_model()" function.

The save_model function saves the transformation pipeline and trained model object to the current working directory as a pickle file for later use.


# loading the iris dataset
from pycaret.datasets import get_data
iris = get_data('iris')
# importing the classification module
from pycaret.classification import *
# initializing setup with the target column
set_up = setup(data = iris, target = 'species')
# creating a logistic regression model
log_reg = create_model('lr')
# saving the trained pipeline and model as a pickle file
save_model(log_reg, 'saved_lr_model')
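
The call writes saved_lr_model.pkl to the current working directory; the preprocessing pipeline created by setup is saved along with the trained model, so the file can later be used for predictions without repeating the setup step.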

load_model function with Example:

PyCaret provides "pycaret.classification.load_model()" function.

The load_model function loads a previously saved pipeline.

# importing the load_model function
from pycaret.classification import load_model
# loading the previously saved pipeline
saved_lr = load_model('saved_lr_model')
saved_lr
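
Once loaded, the pipeline can be used directly for scoring. Below is a minimal sketch, assuming a few unseen rows with the same feature columns as the training data (new_data is a hypothetical frame built here from the iris features):

# a minimal sketch: scoring new data with the loaded pipeline
from pycaret.datasets import get_data
from pycaret.classification import load_model, predict_model
# building a small sample of unseen rows from the iris features (hypothetical new data)
iris = get_data('iris')
new_data = iris.drop('species', axis=1).head()
# loading the saved pipeline and generating predictions
saved_lr = load_model('saved_lr_model')
predictions = predict_model(saved_lr, data=new_data)
print(predictions)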

automl function with Example:

PyCaret provides "pycaret.classification.automl()" function.

The automl function returns the best model out of all the models trained in the current session, based on the metric specified in the optimize parameter.

# loading the iris dataset
from pycaret.datasets import get_data
iris = get_data('iris')
# importing the classification module
from pycaret.classification import *
# initializing setup with the target column
set_up = setup(data = iris, target = 'species')
# selecting the top 3 models by cross-validated performance
top_3 = compare_models(n_select = 3)
# tuning the hyperparameters of each of the top 3 models
tuned_top3 = [tune_model(i) for i in top_3]
# blending and stacking the tuned models
blend = blend_models(tuned_top3)
stack = stack_models(tuned_top3)
# returning the best model of the session by AUC
best_auc_model = automl(optimize = 'AUC')
best_auc_model
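
The object returned by automl behaves like any other trained PyCaret model, so it can be finalized and saved just like the logistic regression model above. Below is a minimal sketch continuing the same session (the file name best_auc_model is only an example):

# fitting the best model on the full dataset, including the hold-out set
final_best = finalize_model(best_auc_model)
# saving the finalized pipeline as best_auc_model.pkl for later use
save_model(final_best, 'best_auc_model')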

