
aysebilgegunduz/FeedFwBackProp

Feed Forward Back Propagation MLP Application

A simple multi-layer perceptron (MLP) application using the feed-forward backpropagation algorithm.

Parametric Variables:

  • Number of hidden layers
  • Neuron count per layer, for the input, output, and hidden layers
  • Activation function: sigmoid (1), tanh (2), or ReLU (3)
  • Learning rate (default: 0.5)
  • Weight-update procedure: delta bar (1), adaptive learning (2), or momentum (3)
  • Epoch count

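A minimal sketch of the feed-forward/backpropagation training loop with the momentum weight update (option 3), assuming NumPy and a sigmoid activation. The class, parameter names, and the XOR sanity check are illustrative assumptions, not code from the repository.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv_from_output(a):
    # Derivative written in terms of the activation output a = sigmoid(x).
    return a * (1.0 - a)

class MLP:
    """Feed-forward MLP trained by backpropagation with momentum."""

    def __init__(self, layer_sizes, learning_rate=0.5, momentum=0.8, seed=0):
        rng = np.random.default_rng(seed)
        self.lr, self.mu = learning_rate, momentum
        # One weight matrix and bias vector per consecutive layer pair.
        self.W = [rng.normal(0.0, 1.0, (m, n))
                  for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.b = [np.zeros(n) for n in layer_sizes[1:]]
        # Momentum "velocity" buffers, one per parameter.
        self.vW = [np.zeros_like(w) for w in self.W]
        self.vb = [np.zeros_like(v) for v in self.b]

    def forward(self, x):
        acts = [x]
        for W, b in zip(self.W, self.b):
            acts.append(sigmoid(acts[-1] @ W + b))
        return acts

    def train(self, X, y, epochs=5000):
        n = len(X)
        for _ in range(epochs):
            acts = self.forward(X)
            # Output-layer error term for squared-error loss.
            delta = (acts[-1] - y) * sigmoid_deriv_from_output(acts[-1])
            for i in reversed(range(len(self.W))):
                grad_W = acts[i].T @ delta / n
                grad_b = delta.mean(axis=0)
                if i > 0:  # propagate the error before the weights change
                    delta = (delta @ self.W[i].T) * sigmoid_deriv_from_output(acts[i])
                # Momentum update: the velocity accumulates past gradients.
                self.vW[i] = self.mu * self.vW[i] - self.lr * grad_W
                self.vb[i] = self.mu * self.vb[i] - self.lr * grad_b
                self.W[i] += self.vW[i]
                self.b[i] += self.vb[i]

    def predict(self, X):
        return self.forward(X)[-1]

# XOR sanity check: two inputs, four hidden neurons, one output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
net = MLP([2, 4, 1], learning_rate=0.5, momentum=0.8)
net.train(X, y, epochs=5000)
print(np.round(net.predict(X), 3))
```

The velocity buffers are what distinguish momentum from plain gradient descent: each step is a decayed sum of past gradients, which smooths the updates across epochs.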