Machine Learning Library

A Machine Learning library for Neural Networks, written entirely in Python. It supports multiple layers of neurons and offers a variety of activation functions, loss functions, optimization algorithms, and utility functions.

Usage Notes:

Python 3 is required to use this library.

git clone https://github.com/Belhoussine/NeuralNet
cd NeuralNet
pip3 install -r requirements.txt

Neural Network Specifications

1. Artificial Neural Network (a minimal sketch of the training loop follows this list):

  • Supports multiple layers
  • Supports multiple neurons per layer
  • Train:
    • Forward Propagation
    • Back Propagation
    • Runs in epochs
    • Supports mini-batches
  • Predict
  • Verbose training phase
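
A minimal NumPy sketch of the training loop described above (forward propagation and back propagation, run in epochs over mini-batches) on toy data. It illustrates the ideas in this list under simple assumptions (one hidden layer, sigmoid activations, MSE loss) and is not the library's actual implementation:

import numpy as np

# Illustrative sketch only: one hidden layer, sigmoid activations, MSE loss.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = rng.random((100, 4))                               # toy inputs
y = (X.sum(axis=1, keepdims=True) > 2).astype(float)   # toy targets

W1, b1 = 0.1 * rng.standard_normal((4, 8)), np.zeros((1, 8))
W2, b2 = 0.1 * rng.standard_normal((8, 1)), np.zeros((1, 1))
lr, epochs, batch_size = 0.5, 50, 10

for epoch in range(epochs):
    perm = rng.permutation(len(X))                     # shuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]           # one mini-batch
        xb, yb = X[idx], y[idx]

        # Forward propagation
        a1 = sigmoid(xb @ W1 + b1)
        a2 = sigmoid(a1 @ W2 + b2)

        # Back propagation (gradients of MSE through the sigmoids)
        d2 = (a2 - yb) * a2 * (1 - a2)
        d1 = (d2 @ W2.T) * a1 * (1 - a1)

        # Mini-batch gradient descent update
        W2 -= lr * a1.T @ d2 / len(xb)
        b2 -= lr * d2.mean(axis=0)
        W1 -= lr * xb.T @ d1 / len(xb)
        b1 -= lr * d1.mean(axis=0)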

2. Activation Functions (sketched in the code after this list):

  • Sigmoid (Non-linear mapping between 0 and 1)
  • Softmax (Non-linear probability distribution)
  • ReLU (Rectified Linear Unit)
  • Leaky ReLU (ReLU with a small non-zero slope on negative values)
  • TanH (Hyperbolic Tangent)
  • ELU (Exponential Linear Unit)
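
For reference, each activation above can be written in a few lines. The NumPy definitions below are an illustrative sketch, not the library's internal implementation:

import numpy as np

def sigmoid(z):
    # Squashes any real value into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Shift by the max for numerical stability, then normalize to a distribution
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def relu(z):
    # Zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # Small non-zero slope alpha on negative inputs
    return np.where(z > 0, z, alpha * z)

def tanh(z):
    # Hyperbolic tangent, maps to (-1, 1)
    return np.tanh(z)

def elu(z, alpha=1.0):
    # Smooth exponential curve on negative inputs
    return np.where(z > 0, z, alpha * np.expm1(np.minimum(z, 0.0)))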

3. Loss Functions (sketched in the code after this list):

  • RMSE (Root Mean Squared Error)
  • MSE (Mean Squared Error)
  • SSE (Sum Squared Error)
  • MAE (Mean Absolute Error)
  • LogCosH (Log of Hyperbolic Cosine)
  • Huber (Quadratic for small errors, linear for large errors)
  • Cross Entropy (Logistic Loss)
  • Least Squares
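
Likewise, illustrative NumPy sketches of these losses; they are reference formulas, not the library's code (the least-squares objective coincides with SSE up to a constant factor, so it is not repeated):

import numpy as np

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def rmse(y_true, y_pred):
    return np.sqrt(mse(y_true, y_pred))

def sse(y_true, y_pred):
    return np.sum((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def log_cosh(y_true, y_pred):
    return np.mean(np.log(np.cosh(y_pred - y_true)))

def huber(y_true, y_pred, delta=1.0):
    # Quadratic within +/- delta, linear beyond it
    err = y_true - y_pred
    small = np.abs(err) <= delta
    return np.mean(np.where(small,
                            0.5 * err ** 2,
                            delta * (np.abs(err) - 0.5 * delta)))

def cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true is one-hot, y_pred holds predicted probabilities
    p = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(np.sum(y_true * np.log(p), axis=-1))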

4. Optimization Algorithms (update rules sketched after this list):

  • Batch Gradient Descent
  • SGD (Stochastic Gradient Descent)
  • Mini-Batch Gradient Descent
  • General-Purpose Gradient Descent
  • ADAM (Adaptive Moment Estimation)
  • RMSProp
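
The update rules behind these optimizers, sketched for a single NumPy parameter array. Hyperparameter defaults are common conventions rather than the library's; batch, stochastic, and mini-batch gradient descent share the same update and differ only in how many samples produce the gradient:

import numpy as np

# Each step takes a parameter array, its gradient, and (for the stateful
# optimizers) a dict that persists running statistics between calls.

def gradient_descent_step(param, grad, lr=0.01):
    # Same rule for batch, stochastic, and mini-batch gradient descent;
    # only the number of samples behind `grad` differs.
    return param - lr * grad

def rmsprop_step(param, grad, state, lr=0.001, beta=0.9, eps=1e-8):
    # Scale the step by a running average of squared gradients
    state["v"] = beta * state.get("v", 0.0) + (1 - beta) * grad ** 2
    return param - lr * grad / (np.sqrt(state["v"]) + eps)

def adam_step(param, grad, state, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Adaptive Moment Estimation: running mean and variance of the
    # gradients, with bias correction on both
    state["t"] = state.get("t", 0) + 1
    state["m"] = b1 * state.get("m", 0.0) + (1 - b1) * grad
    state["v"] = b2 * state.get("v", 0.0) + (1 - b2) * grad ** 2
    m_hat = state["m"] / (1 - b1 ** state["t"])
    v_hat = state["v"] / (1 - b2 ** state["t"])
    return param - lr * m_hat / (np.sqrt(v_hat) + eps)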

5. Utility Functions (sketched in the code after this list):

  • Download the MNIST dataset from a remote server
  • Flatten (Convert a 2D matrix to a vector)
  • One Hot Encoding (Convert numerical labels to one-hot vectors)
  • One Hot Decoding (Convert one-hot vectors back to numerical labels)
  • Normalization Function (Linear mapping between 0 and 1)
  • Accuracy Function (Compute model accuracy)
  • Activate (Apply a given activation function)
  • Compute Loss (Compute the loss with a chosen loss function)
  • Optimize (Apply a given optimizer to the model)
  • Shuffle (Shuffle the training data)
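
Illustrative NumPy sketches of what such helpers typically do. Function names and signatures here are assumptions for illustration, not the library's actual API (the MNIST download step is omitted since it depends on the remote server used):

import numpy as np

def flatten(images):
    # e.g. MNIST images of shape (n, 28, 28) -> vectors of shape (n, 784)
    return images.reshape(len(images), -1)

def one_hot_encode(labels, num_classes):
    # numerical label 3 with 5 classes -> [0, 0, 0, 1, 0]
    return np.eye(num_classes)[labels]

def one_hot_decode(one_hot):
    # inverse of the above: pick the index of the largest entry
    return np.argmax(one_hot, axis=-1)

def normalize(x):
    # linear mapping of the data range onto [0, 1]
    return (x - x.min()) / (x.max() - x.min())

def accuracy(y_true, y_pred):
    # fraction of samples whose predicted class matches the true class
    # (both arguments given as one-hot vectors or class probabilities)
    return np.mean(one_hot_decode(y_true) == one_hot_decode(y_pred))

def shuffle(X, y, rng=None):
    # shuffle inputs and targets with the same permutation
    rng = rng or np.random.default_rng()
    idx = rng.permutation(len(X))
    return X[idx], y[idx]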
