Back Propagation, Python
Updated Jun 28, 2011 - Python
Tanh hidden activations, softmax outputs and cross-entropy error.
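That combination (tanh hidden layer, softmax outputs, cross-entropy error) can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the repository above; the shapes and toy inputs are my own assumptions.

```python
import numpy as np

def forward(X, W1, b1, W2, b2):
    """Forward pass: tanh hidden layer, softmax output layer."""
    h = np.tanh(X @ W1 + b1)                       # hidden activations
    logits = h @ W2 + b2
    z = logits - logits.max(axis=1, keepdims=True) # numerically stable softmax
    e = np.exp(z)
    probs = e / e.sum(axis=1, keepdims=True)
    return h, probs

def cross_entropy(probs, y):
    """Mean cross-entropy for integer class labels y."""
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), y] + 1e-12).mean()

# toy example: 4 samples, 3 features, 5 hidden units, 3 classes (all invented)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
y = np.array([0, 1, 2, 1])
W1 = rng.normal(scale=0.1, size=(3, 5)); b1 = np.zeros(5)
W2 = rng.normal(scale=0.1, size=(5, 3)); b2 = np.zeros(3)
_, probs = forward(X, W1, b1, W2, b2)
loss = cross_entropy(probs, y)
```

A practical note on this pairing: when softmax outputs are trained under cross-entropy, the gradient at the output simplifies to `probs - one_hot(y)`, which is one reason the combination is so common.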
A basic implementation of a neural network in Java, with back-propagation. Created for learning purposes.
MATLAB implementations of a variety of machine learning/signal processing algorithms.
Efficiently performs automatic differentiation on arbitrary functions. Basically a rudimentary version of Tensorflow.
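Reverse-mode automatic differentiation of that kind can be demonstrated with a small scalar graph class: each operation records its inputs and local derivatives, and `backward()` applies the chain rule in reverse topological order. A minimal sketch, not the repository's actual implementation; the `Var` class and its API are invented for illustration.

```python
import math

class Var:
    """Scalar node in a computation graph; backward() fills .grad."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # pairs of (parent node, local derivative)

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def tanh(self):
        t = math.tanh(self.value)
        return Var(t, ((self, 1.0 - t * t),))

    def backward(self):
        # topological sort, then propagate gradients from output to inputs
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p, _ in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, local in v._parents:
                p.grad += v.grad * local

x = Var(0.5); y = Var(0.25)
z = (x * y + x).tanh()   # dz/dx = (1 - tanh^2(xy + x)) * (y + 1)
z.backward()
```

The same recorded-graph idea, generalized to tensors and a larger operator set, is what frameworks like TensorFlow and PyTorch build on.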
Implementation of neural networks from scratch using Python
Java implementation of a backpropagation feedforward neural network with more than one hidden layer
This is my first Backpropagation Neural Network program in Processing (Java).
Implementation of a multilayer, feed-forward, fully-connected neural network trained using the gradient-descent based backpropagation algorithm
A.I. Backpropagation
Neural network and back-propagation implemented from scratch to recognize MNIST.
A simple two-layer neural network using only NumPy as a dependency
Implementing the backpropagation algorithm for Neural Networks
MNIST classification using a neural network and back-propagation. Written in Python and depends only on NumPy