How does automatic differentiation work in MXNet?

In this recipe, we will see how automatic differentiation works in MXNet with an example.

Recipe Objective: How does automatic differentiation work in MXNet? Explain with an example

This recipe explains how automatic differentiation works in MXNet.


Step 1: Importing library

Let us first import the necessary libraries. We'll import mxnet along with its autograd and ndarray (imported as nd for short) modules.

import mxnet as mx
from mxnet import nd
from mxnet import autograd

Step 2: Basic functionality

For example, let's assume we want to differentiate the function f(a) = (4*a^2) + (2*a) with respect to a. First, we allocate storage for the gradient by invoking a's attach_grad method. Then we compute b = (4*a^2) + (2*a) inside an autograd.record() scope so that MXNet records the computation graph for b. Invoking b.backward() runs backpropagation. Finally, we can cross-check the computed gradient against the expected one.

a = nd.array([[1,2,3], [3,4,5], [6,7,8]])
a.attach_grad()                  # allocate space for the gradient of a
with autograd.record():          # record the computation graph
    b = (4*a*a) + (2*a)
b.backward()                     # backpropagate through the recorded graph
a.grad                           # gradient of b with respect to a
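
As a quick sanity check (this verification snippet is our own addition, not part of the original recipe), the analytic derivative of (4*a^2) + (2*a) is 8*a + 2, so a.grad should match that expression element-wise:

expected = (8*a) + 2    # analytic gradient of (4*a^2) + (2*a)
print(a.grad)
print(expected)         # the two arrays should be identical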

Step 3: Dynamic Usage

Even for dynamic programs whose control flow depends on run-time values, MXNet can record the computation and compute the gradient. In the example below, we square the input and keep squaring the result until its norm reaches 2000; we then pick either the first or the second row of the result, depending on the sign of its sum, and print the gradient of that choice with respect to the input.

x = nd.array([[1,2,3], [3,4,5], [6,7,8]])
x.attach_grad()                        # allocate space for the gradient of x
with autograd.record():                # record the (dynamic) computation graph
    y = x * x
    while y.norm().asscalar() < 2000:  # keep squaring until the norm reaches 2000
        y = y * y
    if y.sum().asscalar() >= 0:        # pick a row based on a run-time value
        z = y[0]
    else:
        z = y[1]
z.backward()                           # backpropagate through the recorded graph
x.grad                                 # gradient of z with respect to x
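
Because z is taken from a single row of y, and y is an element-wise function of x, only that row of x contributes to z; the other rows of x.grad should stay zero. A quick check (our own addition for illustration):

print(x.grad)         # only the row selected above has non-zero entries
print(x.grad != 0)    # 1 where a gradient flowed back, 0 elsewhere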
