How to train a network using trainers in PyBrain

This recipe helps you train a network using trainers in PyBrain

Recipe Objective - How to train a network using trainers in PyBrain?

PyBrain provides two main ways to train a network:

1. BackpropTrainer:
BackpropTrainer is a trainer that adjusts the parameters of a module according to a supervised dataset or ClassificationDataSet (potentially sequential), backpropagating the errors (through time).

2. trainUntilConvergence:
trainUntilConvergence is a trainer method that trains the network on the dataset until it converges, i.e., until the error on an internal validation split stops improving.
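
The example in this recipe uses trainEpochs; for reference, here is a minimal sketch of how trainUntilConvergence is typically called. The toy XOR dataset and network below are illustrative, not part of the recipe's script:

# Minimal sketch: trainUntilConvergence on a toy XOR dataset
from pybrain.datasets import SupervisedDataSet
from pybrain.tools.shortcuts import buildNetwork
from pybrain.supervised.trainers import BackpropTrainer

# Toy dataset: 2 inputs, 1 target
ds = SupervisedDataSet(2, 1)
for inp, target in [((0, 0), (0,)), ((0, 1), (1,)), ((1, 0), (1,)), ((1, 1), (0,))]:
    ds.addSample(inp, target)

net = buildNetwork(2, 3, 1)
trainer = BackpropTrainer(net, dataset=ds)

# Trains until the error on an internal validation split (25% of the data
# by default) stops improving; returns the per-epoch error lists
train_errors, validation_errors = trainer.trainUntilConvergence(maxEpochs=50)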
When we create a neural network, it learns from the training data provided to it. Whether the network is adequately trained is judged by how well it predicts on test data it has never seen.
Let's take a step-by-step look at a working example that builds a neural network, trains it, and reports the training and test errors; a sketch at the end shows how to obtain validation errors as well.

Let's train a network using trainers:

# Importing all the necessary libraries
from sklearn import datasets
import matplotlib.pyplot as plt
from pybrain.datasets import ClassificationDataSet
from pybrain.utilities import percentError
from pybrain.tools.shortcuts import buildNetwork
from pybrain.supervised.trainers import BackpropTrainer
from pybrain.structure.modules import SoftmaxLayer
from numpy import ravel

# Loading iris dataset from sklearn datasets
iris = datasets.load_iris()

# Defining feature variables and target variable
X_data = iris.data
y_data = iris.target

# Defining a classification dataset with 4 input features, 1 target, and 3 classes
classification_dataset = ClassificationDataSet(4, 1, nb_classes=3)

# Adding each iris sample (flattened features, class label) to the dataset
for i in range(len(X_data)):
    classification_dataset.addSample(ravel(X_data[i]), y_data[i])

# Splitting data into testing (30%) and training (70%) sets
testing_data, training_data = classification_dataset.splitWithProportion(0.3)

# Classification dataset for test data
test_data = ClassificationDataSet(4, 1, nb_classes=3)

# Copying each test sample into the new classification dataset
for n in range(testing_data.getLength()):
    test_data.addSample(testing_data.getSample(n)[0], testing_data.getSample(n)[1])

# Classification dataset for train data
train_data = ClassificationDataSet(4, 1, nb_classes=3)

# Copying each training sample into the new classification dataset
for n in range(training_data.getLength()):
    train_data.addSample(training_data.getSample(n)[0], training_data.getSample(n)[1])

# Converting targets to one-of-many (one-hot) encoding, as required by the softmax output layer
test_data._convertToOneOfMany()
train_data._convertToOneOfMany()

# Building a network with 4 hidden units and a SoftmaxLayer output on the training data
build_network = buildNetwork(train_data.indim, 4, train_data.outdim, outclass=SoftmaxLayer)

# Creating a BackpropTrainer on the training data
trainer = BackpropTrainer(build_network, dataset=train_data, learningrate=0.01, verbose=True)

# Training for 20 epochs on the training data
trainer.trainEpochs(20)

# Evaluating the error percentage on the test data
print('Error percentage on testing data=>', percentError(trainer.testOnClassData(dataset=test_data), test_data['class']))

Output -
Total error:  0.136254200252
Total error:  0.119727684498
Total error:  0.114085052609
Total error:  0.113058640822
Total error:  0.112989040344
Total error:  0.112474362493
Total error:  0.112762618901
Total error:  0.112578593311
Total error:  0.112490481321
Total error:  0.112521126322
Total error:  0.112344921174
Total error:  0.112221230382
Total error:  0.112308115747
Total error:  0.112182370479
Total error:  0.112142111063
Total error:  0.112060371407
Total error:  0.112182410105
Total error:  0.112189876709
Total error:  0.112308227872
Total error:  0.112268829249
Error percentage on testing data=> 73.33333333333333

In this way, we can train a network using trainers in PyBrain.
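
As noted earlier, trainUntilConvergence can also report per-epoch training and validation errors. A minimal sketch that continues the example above, reusing build_network, train_data, and the matplotlib import (the hyperparameter values are illustrative):

# Minimal sketch: train until convergence and plot the error curves
trainer = BackpropTrainer(build_network, dataset=train_data, learningrate=0.01)

# trainUntilConvergence holds out part of train_data for validation
# (25% by default) and stops once the validation error stops improving
training_errors, validation_errors = trainer.trainUntilConvergence(maxEpochs=100)

plt.plot(training_errors, label='training error')
plt.plot(validation_errors, label='validation error')
plt.legend()
plt.show()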
