Series: IBM AI Engineering

Apr 1, 2026 · 2 min read · By Mohammed Vasim

AI, Machine Learning, LLM, PyTorch, TensorFlow, Generative AI, LangChain, AI Agents

Activation Functions

Objective

  • How to apply different activation functions in a neural network.

In this lab, you will cover activation functions by using PyTorch.

Estimated Time Needed: 15 min


We'll need the following libraries

python
# Import the libraries we need for this lab

import torch.nn as nn
import torch

import matplotlib.pyplot as plt
torch.manual_seed(2)

Logistic Function

Create a tensor ranging from -10 to 10:

python
# Create a tensor

z = torch.arange(-10, 10, 0.1).view(-1, 1)
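As a quick sanity check (not part of the original lab), `view(-1, 1)` reshapes the 1-D range into a column tensor, one sample per row:

```python
import torch

# view(-1, 1) turns the 1-D range into a column vector:
# one sample per row, a single column
z = torch.arange(-10, 10, 0.1).view(-1, 1)

print(z.shape)       # torch.Size([200, 1])
print(z[0].item())   # -10.0
```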

When you use Sequential, you can create a sigmoid object:

python
# Create a sigmoid object

sig = nn.Sigmoid()

Apply the element-wise sigmoid function with the object:

python
# Make a prediction of sigmoid function

yhat = sig(z)
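To make the "use it with Sequential" remark concrete, here is a minimal, hypothetical sketch (not from the lab) of `nn.Sigmoid` acting as the output layer of a one-feature model:

```python
import torch
import torch.nn as nn

# Hypothetical example: nn.Sigmoid as a layer inside nn.Sequential,
# serving as the output activation of a one-feature logistic model
torch.manual_seed(2)
model = nn.Sequential(nn.Linear(1, 1), nn.Sigmoid())

x = torch.tensor([[0.0], [1.0]])
yhat = model(x)
print(yhat)  # every value is squashed into (0, 1)
```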

Plot the results:

python
# Plot the result

plt.plot(z.detach().numpy(),yhat.detach().numpy())
plt.xlabel('z')
plt.ylabel('yhat')

For custom modules, call the sigmoid function from torch (or torch.nn.functional in older versions), which applies the element-wise sigmoid, and plot the results:

python
# Use the built-in function to predict the result

yhat = torch.sigmoid(z)
plt.plot(z.numpy(), yhat.numpy())

plt.show()
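The module form and the function form compute identical values, which you can verify directly; note also that the sigmoid of 0 is exactly 0.5:

```python
import torch
import torch.nn as nn

# nn.Sigmoid() (module) and torch.sigmoid (function) agree element-wise
z = torch.arange(-10, 10, 0.1).view(-1, 1)
sig = nn.Sigmoid()

print(torch.allclose(sig(z), torch.sigmoid(z)))  # True
print(torch.sigmoid(torch.tensor(0.0)).item())   # 0.5
```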

Tanh

When you use Sequential, you can create a tanh object:

python
# Create a tanh object

TANH = nn.Tanh()

Call the object and plot it:

python
# Make the prediction using tanh object

yhat = TANH(z)
plt.plot(z.numpy(), yhat.numpy())
plt.show()

For custom modules, call the tanh function from torch (or torch.nn.functional in older versions), which applies the element-wise tanh, and plot the results:

python
# Make the prediction using the built-in tanh function

yhat = torch.tanh(z)
plt.plot(z.numpy(), yhat.numpy())
plt.show()
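A useful check (beyond the lab itself): tanh is an odd function with range (-1, 1), and it is a rescaled sigmoid via the identity tanh(z) = 2·sigmoid(2z) − 1:

```python
import torch

# tanh relates to the sigmoid by tanh(z) = 2*sigmoid(2z) - 1,
# and it is zero-centered: tanh(0) = 0
z = torch.arange(-10, 10, 0.1).view(-1, 1)

print(torch.allclose(torch.tanh(z), 2 * torch.sigmoid(2 * z) - 1, atol=1e-6))  # True
print(torch.tanh(torch.tensor(0.0)).item())  # 0.0
```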

ReLU

When you use Sequential, you can create a ReLU object:

python
# Create a relu object and make the prediction

RELU = nn.ReLU()
yhat = RELU(z)
plt.plot(z.numpy(), yhat.numpy())

For custom modules, call the relu function from torch (or torch.nn.functional in older versions), which applies the element-wise ReLU, and plot the results:

python
# Use the built-in function to make the prediction

yhat = torch.relu(z)
plt.plot(z.numpy(), yhat.numpy())
plt.show()
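ReLU simply zeroes out negative inputs and passes positive ones through unchanged, i.e. relu(z) = max(z, 0), which you can confirm against `torch.clamp`:

```python
import torch

# relu(z) = max(z, 0): negatives become 0, positives pass through
z = torch.arange(-10, 10, 0.1).view(-1, 1)

print(torch.allclose(torch.relu(z), torch.clamp(z, min=0)))  # True
print(torch.relu(torch.tensor(-3.0)).item())  # 0.0
```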

Compare Activation Functions

python
# Plot the results to compare the activation functions

x = torch.arange(-2, 2, 0.1).view(-1, 1)
plt.plot(x.numpy(), torch.relu(x).numpy(), label='relu')
plt.plot(x.numpy(), torch.sigmoid(x).numpy(), label='sigmoid')
plt.plot(x.numpy(), torch.tanh(x).numpy(), label='tanh')
plt.legend()
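Beyond the plots, a short sketch (not part of the original lab) of why these shapes matter in training: sigmoid and tanh saturate for large |z|, so their gradients there are nearly zero, while ReLU keeps a gradient of 1 for any positive input:

```python
import torch

# Compare gradients at z = 6: sigmoid is saturated there, ReLU is not
z = torch.tensor([6.0], requires_grad=True)

torch.sigmoid(z).backward()
sigmoid_grad = z.grad.clone()
print(sigmoid_grad)  # tiny: sigmoid'(6) is about 0.0025

z.grad = None
torch.relu(z).backward()
relu_grad = z.grad.clone()
print(relu_grad)  # tensor([1.])
```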

Practice

Compare the activation functions with a tensor in the range (-1, 1).

python
# Practice: Compare the activation functions again using a tensor in the range (-1, 1)

# Type your code here

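One possible solution mirrors the earlier comparison, only with the narrower range:

```python
import torch
import matplotlib.pyplot as plt

# Compare the activation functions on a tensor in the range (-1, 1)
x = torch.arange(-1, 1, 0.1).view(-1, 1)
plt.plot(x.numpy(), torch.relu(x).numpy(), label='relu')
plt.plot(x.numpy(), torch.sigmoid(x).numpy(), label='sigmoid')
plt.plot(x.numpy(), torch.tanh(x).numpy(), label='tanh')
plt.legend()
```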

About the Authors:

Joseph Santarcangelo has a PhD in Electrical Engineering; his research focused on using machine learning, signal processing, and computer vision to determine how videos impact human cognition. Joseph has been working for IBM since he completed his PhD.

Other contributors: Michelle Carey, Mavis Zhou


© IBM Corporation. All rights reserved.
