
Neural Network Implementation

Description

A neural network implementation with a choice of several activation functions, optimizers, and loss functions.

Activation Functions

  • Sigmoid
  • ReLU
  • Leaky ReLU
  • Softmax
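
The repository's source is not shown on this page, so the snippet below is only a minimal NumPy sketch of the four activations listed above; the function names and signatures are illustrative and may differ from the actual implementation.

```python
import numpy as np

def sigmoid(x):
    # Logistic function: squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified Linear Unit: zeroes out negative inputs
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope (alpha) for negative inputs
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Row-wise softmax; subtracting the max keeps the exponentials numerically stable
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=-1, keepdims=True)
```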

Optimizers

  • Gradient Descent
  • AdaGrad
  • RMSProp
  • Adam
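
A similar sketch of the four update rules, written here as standalone NumPy functions; how the state (caches, moment estimates, step count) is organized is an assumption for illustration, not the repository's actual API.

```python
import numpy as np

def gradient_descent_update(w, grad, lr=0.01):
    # Plain gradient descent: step against the gradient
    return w - lr * grad

def adagrad_update(w, grad, cache, lr=0.01, eps=1e-8):
    # Accumulate squared gradients and scale each parameter's step down over time
    cache = cache + grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

def rmsprop_update(w, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
    # Exponential moving average of squared gradients instead of a full sum
    cache = decay * cache + (1.0 - decay) * grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

def adam_update(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # First/second moment estimates with bias correction; t is the 1-based step count
    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad ** 2
    m_hat = m / (1.0 - beta1 ** t)
    v_hat = v / (1.0 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```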

Loss Functions

  • Cross-Entropy Loss
  • Hinge Loss
  • Mean Squared Error (MSE)
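
And a minimal sketch of the three losses, assuming one-hot labels for cross-entropy and labels in {-1, +1} for the hinge loss; names and conventions are illustrative rather than taken from the repository.

```python
import numpy as np

def cross_entropy_loss(probs, y_onehot, eps=1e-12):
    # probs: softmax outputs of shape (n_samples, n_classes); y_onehot: one-hot labels
    return -np.mean(np.sum(y_onehot * np.log(probs + eps), axis=1))

def hinge_loss(scores, y_signed):
    # Binary hinge loss; y_signed holds labels in {-1, +1}
    return np.mean(np.maximum(0.0, 1.0 - y_signed * scores))

def mse_loss(y_pred, y_true):
    # Mean squared error averaged over all elements
    return np.mean((y_pred - y_true) ** 2)
```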

Contributors
