
Neural Network Lib

Member: 🌜 Ymanzi 🌛

Challenge

Implement a Multilayer Perceptron Library from scratch

Structure

(figure: library structure diagram)

Perceptron (FeedForward)

(figure: perceptron feed-forward diagram)
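
The library's exact API isn't visible here, so the following is a minimal NumPy sketch of a feed-forward pass through fully connected sigmoid layers; the names `Layer` and `forward` are hypothetical, not the repo's actual identifiers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Layer:
    """One fully connected layer: a = sigmoid(W @ x + b)."""
    def __init__(self, n_in, n_out):
        self.W = rng.standard_normal((n_out, n_in)) * 0.1
        self.b = np.zeros(n_out)

    def forward(self, x):
        return sigmoid(self.W @ x + self.b)

# Feed-forward pass: each layer's output is the next layer's input.
network = [Layer(4, 8), Layer(8, 3)]
x = np.ones(4)
for layer in network:
    x = layer.forward(x)
print(x)  # final network output
```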

Backpropagation Equations

(figures: backpropagation equations 1–3)
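
The equation images did not survive extraction. For reference, the standard backpropagation equations (presumably what the figures showed) are, with δ the layer error, C the cost, σ the activation, and z^l = W^l a^{l-1} + b^l:

```latex
\delta^L = \nabla_a C \odot \sigma'(z^L)
    % error at the output layer L
\delta^l = \left( (W^{l+1})^\top \delta^{l+1} \right) \odot \sigma'(z^l)
    % error propagated back to layer l
\frac{\partial C}{\partial b^l_j} = \delta^l_j , \qquad
\frac{\partial C}{\partial w^l_{jk}} = a^{l-1}_k \, \delta^l_j
    % gradients of the cost w.r.t. biases and weights
```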

Loss Functions Implemented

  • Cross Entropy
  • Mean Squared Error
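
As a quick illustration of these two losses (a minimal NumPy sketch, not the library's actual code; `y` is the target and `p` the prediction):

```python
import numpy as np

def cross_entropy(y, p, eps=1e-12):
    """Binary cross-entropy loss; eps keeps log() away from zero."""
    p = np.clip(p, eps, 1.0 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def mean_squared_error(y, p):
    """Mean squared error loss."""
    return np.mean((y - p) ** 2)
```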

Activation Functions Implemented

  • Sigmoid
  • Tanh
  • ReLU
  • SoftMax

(figure: activation functions table)
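
A minimal NumPy sketch of the four activations (illustrative only, not the repo's exact implementation):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    return np.tanh(z)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - np.max(z))  # shift by max(z) for numerical stability
    return e / e.sum()
```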

Regularization Implemented

Regularization is a family of techniques that slightly modify the learning algorithm so that the model generalizes better. This improves performance on unseen data and helps avoid overfitting.

  • L1/L2 Regularization: reduces the possibility of overfitting by adding a penalty on the magnitude of the weights to the loss, keeping the values of the weights and biases small.
  • Dropout: to apply dropout, we randomly select a subset of the units and clamp their output to zero, regardless of the input; this effectively removes those units from the model.

(figure: dropout illustration)

  • Dropout Connect (DropConnect): we apply dropout to the weights instead of the nodes, so individual connections are zeroed out rather than whole units; see the sketch after this section.

(figure: DropConnect illustration)
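
A combined NumPy sketch of the three techniques (illustrative only; the dropout/DropConnect masks are resampled each training step, and the inverted-dropout scaling keeps expected activations unchanged at test time):

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_penalty(weights, lam):
    """L2 term added to the loss: lam * sum of squared weights."""
    return lam * sum(np.sum(W ** 2) for W in weights)

def dropout(a, p):
    """Zero each unit's activation with probability p (inverted dropout)."""
    mask = rng.random(a.shape) >= p
    return a * mask / (1.0 - p)

def dropconnect(W, p):
    """Zero each individual weight with probability p."""
    mask = rng.random(W.shape) >= p
    return W * mask
```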

Resources
