PyHessian is a PyTorch library for second-order-based analysis and training of neural networks
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
FEDL: Federated Learning algorithm using TensorFlow (Transactions on Networking, 2021)
Distributed K-FAC Preconditioner for PyTorch
A C++ interface to formulate and solve linear, quadratic, and second-order cone problems.
TensorFlow implementation of preconditioned stochastic gradient descent
This repository implements FEDL using PyTorch
PyTorch implementation of preconditioned stochastic gradient descent (affine group preconditioner, low-rank approximation preconditioner, and more)
LIBS2ML: A Library for Scalable Second Order Machine Learning Algorithms
Subsampled Riemannian trust-region (RTR) algorithms
This package is dedicated to high-order optimization methods. All the methods can be used similarly to standard PyTorch optimizers.
NG+: A new second-order optimizer for deep learning
Hessian-based stochastic optimization in TensorFlow and Keras
Minimalist deep learning library with first- and second-order optimization algorithms, made for educational purposes
PyTorch implementation of the Hessian-free optimizer
Compatible Intrinsic Triangulations (SIGGRAPH 2022)
Second-Order Convergence of Alternating Minimizations
Concepts and algorithms in core learning theory
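Many of the projects above (PyHessian, AdaHessian, the Hessian-free optimizer, K-FAC-style preconditioners) build on the same primitive: a Hessian-vector product obtained from two backward passes, so the full Hessian is never materialized. The sketch below is a minimal illustration in plain PyTorch autograd, not the API of PyHessian or any other library listed here; the toy model, probe vector, and power-iteration loop are illustrative assumptions.

```python
import torch

# Toy setup purely for illustration.
model = torch.nn.Linear(10, 1)
x, y = torch.randn(32, 10), torch.randn(32, 1)
params = [p for p in model.parameters() if p.requires_grad]

loss = torch.nn.functional.mse_loss(model(x), y)
# Keep the graph so the gradient itself can be differentiated again.
grads = torch.autograd.grad(loss, params, create_graph=True)

def hessian_vector_product(vec):
    """Return H @ vec via a second backward pass (Pearlmutter's trick)."""
    dot = sum((g * v).sum() for g, v in zip(grads, vec))
    return torch.autograd.grad(dot, params, retain_graph=True)

# Power iteration for a rough estimate of the top Hessian eigenvalue,
# the kind of quantity second-order analysis tools report.
vec = [torch.randn_like(p) for p in params]
for _ in range(20):
    hv = hessian_vector_product(vec)
    norm = torch.sqrt(sum((h ** 2).sum() for h in hv))
    vec = [h / norm for h in hv]
top_eig = sum((h * v).sum() for h, v in zip(hessian_vector_product(vec), vec))
print(f"estimated top Hessian eigenvalue: {top_eig.item():.4f}")
```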