Activation-function

The activation function decides whether a neuron should be activated or not by calculating the weighted sum of its inputs and adding a bias to it. The purpose of the activation function is to introduce non-linearity into the output of a neuron.

A neural network has neurons that work together with their weights, biases, and respective activation functions. During training, the weights and biases of the neurons are updated on the basis of the error at the output.

This process is known as back-propagation. Activation functions make back-propagation possible, since their gradients are supplied along with the error to update the weights and biases. The activation function applies a non-linear transformation to the input, making the network capable of learning and performing more complex tasks.
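As a rough illustration of the weighted-sum-plus-bias computation described above, here is a minimal NumPy sketch of a single neuron. The input, weight, and bias values, and the choice of sigmoid, are illustrative assumptions rather than code from this repository:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b, activation):
    # weighted sum of the inputs plus a bias, passed through the activation
    return activation(np.dot(w, x) + b)

x = np.array([0.5, -1.2, 3.0])   # example inputs
w = np.array([0.4, 0.7, -0.2])   # example weights
b = 0.1                          # example bias
print(neuron(x, w, b, sigmoid))  # a single activated output in (0, 1)
```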

Variants of Activation Function

1. Linear Function

Equation : The linear function has the equation of a straight line, i.e. y = x

Range : -inf to +inf

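A minimal NumPy sketch of the linear (identity) activation; the sample values are illustrative:

```python
import numpy as np

def linear(x):
    # identity activation: the output equals the input, so the range is (-inf, +inf)
    return x

z = np.array([-2.0, 0.0, 3.5])
print(linear(z))  # [-2.   0.   3.5]
```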

2. Sigmoid Function

It is a function that is plotted as an ‘S’-shaped graph. Equation :

A = 1/(1 + e^(-x))

Value Range :- 0 to 1

Nature :- non-linear

Uses :- Usually used in the output layer of binary classification, where the result is either 0 or 1.
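A minimal NumPy sketch of the sigmoid defined by the equation above; the sample inputs are illustrative:

```python
import numpy as np

def sigmoid(x):
    # squashes any real-valued input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([-4.0, 0.0, 4.0])
print(sigmoid(z))  # approx [0.018 0.5   0.982]
```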

3. Tanh Function

Equation :

A = (e^x - e^(-x)) / (e^x + e^(-x))

Value Range :- -1 to +1

Nature :- non-linear

Uses :- Usually used in hidden layers of a neural network, as its values lie between -1 and 1.

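A minimal NumPy sketch of tanh; np.tanh implements the formula given above, and the sample inputs are illustrative:

```python
import numpy as np

def tanh(x):
    # equivalent to (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))
    return np.tanh(x)

z = np.array([-2.0, 0.0, 2.0])
print(tanh(z))  # approx [-0.964  0.     0.964]
```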

4. RELU Function

It stands for Rectified Linear Unit. It is the most widely used activation function and is implemented mainly in the hidden layers of a neural network. Equation :

A(x) = max(0,x). 

It gives an output of x if x is positive and 0 otherwise.

Value Range :- [0, inf)

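A minimal NumPy sketch of ReLU as defined by the equation above; the sample inputs are illustrative:

```python
import numpy as np

def relu(x):
    # returns x where x is positive and 0 otherwise
    return np.maximum(0, x)

z = np.array([-3.0, 0.0, 2.5])
print(relu(z))  # [0.  0.  2.5]
```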

5. Softmax Function

The softmax function is also a type of sigmoid function, but it is handy when we are trying to handle multi-class classification problems.

Nature :- non-linear

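A minimal NumPy sketch of softmax, which turns a vector of scores into values between 0 and 1 that sum to 1. Subtracting the maximum before exponentiating is a common numerical-stability trick and is an assumption here, not something stated in the original text:

```python
import numpy as np

def softmax(x):
    # exponentiate (shifted by the max for numerical stability) and normalise
    e = np.exp(x - np.max(x))
    return e / e.sum()

z = np.array([2.0, 1.0, 0.1])
out = softmax(z)
print(out)        # approx [0.659 0.242 0.099]
print(out.sum())  # 1.0
```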