Distributed K-FAC Preconditioner for PyTorch
Training neural networks with Hessian-based optimization in PyTorch
PyHessian is a PyTorch library for second-order-based analysis and training of neural networks (see the usage sketch after this list)
A curated list of resources for second-order stochastic optimization methods in ML
PyTorch implementation of preconditioned stochastic gradient descent (affine group preconditioner, low-rank approximation preconditioner, and more)
FOSI: a library for improving first-order optimizers with second-order information
This directory contains the source code for the experiments in our main paper. It is still a work in progress.
PyTorch implementation of the Hessian-free optimizer
An accelerated active-set algorithm for a quadratic semidefinite program with general constraints
TensorFlow implementation of preconditioned stochastic gradient descent
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
The repository contains code to reproduce the experiments from our paper "Error Feedback Can Accurately Compress Preconditioners"
NumPy implementation of neural networks with SGDM, Adam, and BFGS solvers, suitable for surface fitting
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
Matrix-multiplication-only KFAC; code for the ICML 2023 paper "Simplifying Momentum-based Positive-definite Submanifold Optimization with Applications to Deep Learning"
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
Subsampled Riemannian trust-region (RTR) algorithms
NG+: A new second-order optimizer for deep learning
Second-Order Convergence of Alternating Minimizations
Second-order optimization for federated learning, implemented in PyTorch (IEEE Transactions on Parallel and Distributed Systems, 2022)
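As a taste of what the libraries listed above expose, here is a minimal sketch of second-order analysis with PyHessian, based on its documented interface (`pip install pyhessian`); the toy model, batch, and sizes are placeholders chosen for illustration.

```python
import torch
import torch.nn as nn
from pyhessian import hessian  # PyHessian's Hessian computation helper

# Hypothetical toy setup: any PyTorch model and loss work here.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
criterion = nn.CrossEntropyLoss()
inputs = torch.randn(64, 10)            # a single batch suffices for a quick estimate
targets = torch.randint(0, 2, (64,))

# Build the Hessian helper on one (inputs, targets) batch.
hessian_comp = hessian(model, criterion, data=(inputs, targets), cuda=False)

# Top Hessian eigenvalues via power iteration, and a Hutchinson trace estimate.
top_eigenvalues, _ = hessian_comp.eigenvalues(top_n=2)
trace_samples = hessian_comp.trace()
print("top eigenvalues:", top_eigenvalues)
print("trace estimate:", sum(trace_samples) / len(trace_samples))
```

Note that both quantities are stochastic estimates (power iteration and Hutchinson sampling), not exact spectral computations, which is what makes this kind of analysis tractable for large networks.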