# What is the use of activation functions in Keras?

This recipe explains the use of activation functions in Keras.

## Activation Functions in Keras

An activation function is a mathematical **gate** in between the input feeding the current neuron and its output going to the next layer. It can be as simple as a step function that turns the neuron output on and off, depending on a rule or threshold. Or it can be a transformation that maps the input signals into output signals that are needed for the neural network to function.

There are 3 types of activation functions:

1. Binary Step Function
2. Linear Activation Function
3. Non-Linear Activation Functions
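The first two categories are simple enough to sketch in a few lines of plain Python (the function names here are our own, not Keras APIs):

```
def binary_step(x, threshold=0.0):
    # Fires 1 if the input reaches the threshold, else 0
    return 1.0 if x >= threshold else 0.0

def linear_activation(x, slope=1.0):
    # Passes the input through, scaled by a constant slope
    return slope * x

print(binary_step(-2.0))       # 0.0
print(binary_step(3.0))        # 1.0
print(linear_activation(3.0))  # 3.0
```

A binary step cannot express "how strongly" a neuron fires, and a purely linear activation collapses a stack of layers into one linear map, which is why the non-linear functions below are the ones used in practice.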

Non-linear activation functions are further divided into the familiar sub-types:

1. Sigmoid / Logistic
2. TanH / Hyperbolic Tangent
3. ReLU (Rectified Linear Unit)
4. Leaky ReLU
5. Parametric ReLU
6. Softmax
7. Swish
8. Softplus

```
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras import activations
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
```

First, define the model, then add a layer, specifying its kernel initializer and input shape.

```
# Model
model = Sequential()
model.add(layers.Dense(64, kernel_initializer='uniform', input_shape=(10,)))
```
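Activation functions can also be attached directly to layers, either by string name or by passing a function from `tf.keras.activations`. A minimal sketch (the layer sizes here are illustrative):

```python
import tensorflow as tf
from tensorflow.keras import layers, activations

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    layers.Dense(64, activation='relu'),            # activation by string name
    layers.Dense(64, activation=activations.tanh),  # activation by function object
    layers.Dense(3, activation='softmax'),          # normalizes outputs to probabilities
])

out = model(tf.zeros((2, 10)))
print(out.shape)  # (2, 3)
```

Both spellings are equivalent; the string form is shorter, while the function form lets you pass configurable activations.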

Next, we apply several activation functions to the same input tensor to see how their outputs differ.

```
a = tf.constant([-200, -10, 0.0, 10, 200], dtype = tf.float32)
b = tf.keras.activations.relu(a).numpy()
print(b)
```

[ 0. 0. 0. 10. 200.]

```
a = tf.constant([-200, -10.0, 0.0, 10.0, 200], dtype = tf.float32)
b = tf.keras.activations.sigmoid(a).numpy()
b
```

array([0.000000e+00, 4.539993e-05, 5.000000e-01, 9.999546e-01, 1.000000e+00], dtype=float32)

```
a = tf.constant([-200, -10.0, 0.0, 10.0, 200], dtype = tf.float32)
b = tf.keras.activations.softplus(a)
b.numpy()
```

array([0.0000000e+00, 4.5398901e-05, 6.9314718e-01, 1.0000046e+01, 2.0000000e+02], dtype=float32)

```
a = tf.constant([-200.0,-10.0, 0.0,10.0,200.0], dtype = tf.float32)
b = tf.keras.activations.tanh(a)
b.numpy()
```

array([-1., -1., 0., 1., 1.], dtype=float32)

We passed the same input to each activation function, which makes the differences easy to observe: ReLU zeroes out negative values, sigmoid and tanh saturate at their bounds, and softplus is a smooth approximation of ReLU.
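Softmax appears in the list above but was not demonstrated. A short sketch on the same input values (note that `tf.keras.activations.softmax` expects at least a 2-D tensor, so we add a batch dimension):

```python
import tensorflow as tf

a = tf.constant([[-200.0, -10.0, 0.0, 10.0, 200.0]], dtype=tf.float32)
b = tf.keras.activations.softmax(a).numpy()
print(b)        # probability mass concentrates on the largest input, 200
print(b.sum())  # softmax outputs always sum to 1
```

Unlike the element-wise functions above, softmax couples all the values in a row, which is why it is typically reserved for the output layer of a classifier.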
