The main goal of this project is to implement the well-known backpropagation algorithm in an easy manner based on the idea of the F-adjoint propagation.
Updated Mar 27, 2024 · Jupyter Notebook
A repo for the projects I completed during my Deep Learning Udacity Nanodegree.
Neural backpropagation with examples and training (Java)
Simple Multi-Layer Perceptron(MLP) with forward and backward propagation
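Several of the projects above implement a simple MLP with explicit forward and backward passes. As a hedged illustration (not the code of any listed repo; all names such as `W1`, `b1` are made up here), a one-hidden-layer version in NumPy might look like:

```python
import numpy as np

# Sketch of a one-hidden-layer MLP trained by backpropagation on a toy
# regression task. Weight names and hyperparameters are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))            # 8 samples, 3 features
y = rng.normal(size=(8, 1))            # toy regression targets

W1, b1 = rng.normal(size=(3, 5)) * 0.1, np.zeros(5)
W2, b2 = rng.normal(size=(5, 1)) * 0.1, np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(500):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = h @ W2 + b2
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass (chain rule, mean-squared-error loss)
    d_out = 2 * (out - y) / len(X)
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)   # sigmoid derivative: h * (1 - h)
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

    # Plain gradient-descent update
    lr = 0.1
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The backward pass simply applies the chain rule layer by layer, reusing the activations cached during the forward pass.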
Implementation of Back Propagation algorithm along with its variants such as RProp and QuickProp.
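The RProp variant mentioned above adapts a per-weight step size from the sign of successive gradients rather than their magnitude. A minimal sketch on a toy quadratic objective (parameter names and the `eta_plus`/`eta_minus` defaults of 1.2/0.5 are the commonly cited ones, not taken from any listed repo):

```python
import numpy as np

# Hedged sketch of the RProp update rule: grow the step size while the
# gradient sign is stable, shrink it (and skip one update) when it flips.
def rprop_minimize(grad_fn, w, steps=50,
                   eta_plus=1.2, eta_minus=0.5,
                   delta_init=0.1, delta_min=1e-6, delta_max=50.0):
    delta = np.full_like(w, delta_init)   # per-weight step sizes
    prev_g = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        sign_change = g * prev_g
        delta = np.where(sign_change > 0,
                         np.minimum(delta * eta_plus, delta_max), delta)
        delta = np.where(sign_change < 0,
                         np.maximum(delta * eta_minus, delta_min), delta)
        g = np.where(sign_change < 0, 0.0, g)  # skip update after a flip
        w = w - np.sign(g) * delta             # only the sign is used
        prev_g = g
    return w

# Toy use: minimize f(w) = sum(w**2), whose gradient is 2w
w_opt = rprop_minimize(lambda w: 2 * w, np.array([3.0, -4.0]))
```

Because only the gradient's sign is used, RProp is insensitive to gradient magnitude, which is part of its appeal for batch training.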
A machine learning library written in C. From scratch, zero dependencies (except for the C standard library).
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
Coursework on Neural Networks for the Μ124 - Machine Learning course, NKUA, Fall 2022.
Demonstration of the mini-lab (practical) component activities conducted for the course of Neural Networks and Deep Learning (19CSE456).
Practical assignments for the Neural Networks (Redes Neuronales) course at the Universidad de Buenos Aires, taken in the first semester of 2023.
Training of a multilayer perceptron with one hidden layer by the error back-propagation method.
Automatic backpropagation implemented in NumPy.
An implementation of the error backpropagation algorithm for training a neural network to recognize handwritten digits.
Mapping Spike Activities with Multiplicity, Adaptability, and Plasticity into Bio-Plausible Spiking Neural Networks
A TensorFlow-inspired neural network library built from scratch in C# 7.3 for .NET Standard 2.0, with GPU support through cuDNN
Standard neural network implementation
A neural network to predict daily bike-rental ridership from the given dataset. Decisions were made based on the data analysis and visualization results.
🤖 Artificial intelligence (neural network) proof of concept that solves the classic XOR problem. It applies standard neural-network techniques such as gradient descent, feed-forward, and backpropagation.
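The XOR problem is the classic demonstration that a single perceptron is insufficient while a hidden layer suffices. As a hedged sketch under assumed hyperparameters (hidden size 4, learning rate 1.0, sigmoid everywhere; not the code of the listed project):

```python
import numpy as np

# Feed-forward + backpropagation + gradient descent on XOR.
# All four input pairs are the training set; targets are XOR outputs.
rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Feed forward
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((p - y) ** 2)))

    # Back propagation through both sigmoid layers
    d_p = (p - y) * p * (1 - p)
    dW2, db2 = h.T @ d_p, d_p.sum(axis=0)
    d_h = d_p @ W2.T * h * (1 - h)
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

    # Gradient descent update
    lr = 1.0
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

preds = (p > 0.5).astype(int)   # thresholded network outputs
```

The hidden layer lets the network carve the input space into the two regions XOR requires, which no single linear unit can do.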
Training a fully connected neural network (FCNN) with different optimizers for the backpropagation algorithm and comparing the number of epochs each takes to converge, along with their classification performance. Also builds an autoencoder to obtain a hidden representation and uses it for classification.