On the Variance of the Adaptive Learning Rate and Beyond
Educational deep learning library in plain NumPy.
Classifying the Google Street View House Numbers (SVHN) dataset with a CNN
This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers"
A collection of various gradient descent algorithms implemented in Python from scratch
A compressed adaptive optimizer for training large-scale deep learning models using PyTorch
ADAS is short for Adaptive Step Size. Unlike optimizers that merely normalize the derivative, it fine-tunes the step size itself, aiming to make step-size scheduling obsolete and achieve state-of-the-art training performance
Implement different variants of gradient descent in Python using NumPy
PyTorch/TensorFlow solutions for Stanford's CS231n: "CNNs for Visual Recognition"
The project aimed to implement deep NN/RNN-based solutions to develop flexible methods that can adaptively fill in, backfill, and predict time series using a large number of heterogeneous training datasets.
This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. This deep learning library will be updated regularly.
Modified XGBoost implementation from scratch with NumPy using the Adam and RMSProp optimizers.
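As context for the RMSProp optimizer mentioned above, here is a minimal NumPy sketch of its standard update rule; the function name, hyperparameter defaults, and toy objective are illustrative, not taken from the repository:

```python
import numpy as np

def rmsprop_step(theta, grad, cache, lr=0.05, rho=0.9, eps=1e-8):
    """One RMSProp update: scale the step by a running RMS of past gradients."""
    cache = rho * cache + (1 - rho) * grad ** 2          # EMA of squared gradients
    theta = theta - lr * grad / (np.sqrt(cache) + eps)   # per-parameter scaled step
    return theta, cache

# toy example: minimize f(x) = x^2 (gradient 2x) starting from x = 5
x, cache = np.array([5.0]), np.zeros(1)
for _ in range(2000):
    x, cache = rmsprop_step(x, 2 * x, cache)
```

Because each parameter is divided by its own gradient RMS, RMSProp takes similar-sized steps in steep and shallow directions, which is why it is a common drop-in alternative to Adam.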
Lion and Adam optimization comparison
Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge
Solution to Kaggle's Digit Recognizer on MNIST dataset
Implementation of Adam Optimization algorithm using Numpy
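For reference, the Adam update (Kingma & Ba) that a NumPy implementation like this typically contains, as a self-contained sketch; the function and variable names here are illustrative, not the repository's API:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update with bias-corrected first and second moment estimates."""
    m = beta1 * m + (1 - beta1) * grad           # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                 # bias correction for warm-up
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# toy example: minimize f(x) = x^2 (gradient 2x) starting from x = 5
x, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t)
```

The bias-correction terms matter early in training: without them, `m` and `v` start near zero and the first steps would be artificially small.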
[Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
A project I made to practice my newfound neural network knowledge: I used Python and NumPy to train a network to recognize MNIST images, with Adam and mini-batch gradient descent implemented.