Regularization, Bayesian Model Selection and k-fold Cross-Validation Selection
Updated Dec 27, 2019 · Python
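As a sketch of the k-fold model-selection idea the first entry refers to, here is illustrative NumPy code (not taken from any listed repository) that picks a ridge penalty by cross-validated error:

```python
import numpy as np

def kfold_cv_ridge(X, y, lambdas, k=5, seed=0):
    """Select a ridge penalty by k-fold cross-validation (illustrative sketch)."""
    n = X.shape[0]
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)
    scores = []
    for lam in lambdas:
        errs = []
        for i in range(k):
            val = folds[i]
            train = np.concatenate([folds[j] for j in range(k) if j != i])
            Xt, yt = X[train], y[train]
            # Closed-form ridge solution: (X^T X + lam I)^{-1} X^T y
            w = np.linalg.solve(Xt.T @ Xt + lam * np.eye(X.shape[1]), Xt.T @ yt)
            errs.append(np.mean((X[val] @ w - y[val]) ** 2))
        scores.append(np.mean(errs))
    # Return the penalty with the lowest mean held-out error
    return lambdas[int(np.argmin(scores))]
```

Each candidate penalty is scored by its average held-out mean-squared error across the k folds, and the minimizer is returned.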
Newton’s second-order optimization methods in Python
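A minimal sketch of the plain Newton iteration such projects implement, with a hand-derived gradient and Hessian for the Rosenbrock function (illustrative, not code from the repository):

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-8, max_iter=50):
    """Plain Newton iteration: x <- x - H(x)^{-1} g(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)
    return x

# Example: minimize the Rosenbrock function f(x) = (1-x0)^2 + 100(x1-x0^2)^2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
hess = lambda x: np.array([
    [2 - 400 * (x[1] - 3 * x[0] ** 2), -400 * x[0]],
    [-400 * x[0], 200.0],
])
x_star = newton(grad, hess, [0.0, 0.0])  # converges to [1, 1]
```

Practical Newton-type codes add safeguards (line search, Hessian modification) that this sketch omits.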
NumPy implementation of neural networks with SGDM, Adam, and BFGS solvers, suitable for surface fitting
Solve DNN relaxations of nonconvex quadratic programming problems.
Adaptive Linesearch Algorithm
Prototyping of matrix free Newton methods in Julia
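The matrix-free idea is to solve the Newton system with an iterative method that only needs Hessian-vector products, never the Hessian itself. A sketch in Python/SciPy (rather than Julia, for consistency with the rest of this list), approximating Hessian-vector products by finite differences of the gradient:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def matrix_free_newton_step(grad, x, eps=1e-6):
    """One Newton step without forming the Hessian: solve H p = -g with CG,
    using finite-difference Hessian-vector products (illustrative sketch)."""
    g = grad(x)
    n = x.size
    def hvp(v):
        # H v ≈ (grad(x + eps*v) - grad(x)) / eps
        return (grad(x + eps * v) - g) / eps
    H = LinearOperator((n, n), matvec=hvp)
    p, info = cg(H, -g)  # conjugate gradient only ever calls hvp
    return x + p
```

For a quadratic objective the finite-difference product is exact up to rounding, so a single step reaches the minimizer.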
The repository contains code to reproduce the experiments from our paper "Error Feedback Can Accurately Compress Preconditioners".
This directory contains the source code for the experiments reported in our main paper. It is still a work in progress.
Discussion of the advantages and disadvantages of AdaHessian, a state-of-the-art second-order method, relative to first-order methods on a non-convex optimization problem (digit classification on the MNIST database using ResNet18). @ EPFL
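AdaHessian's key ingredient is a stochastic estimate of the Hessian diagonal. A minimal NumPy sketch of the underlying Hutchinson estimator, E[z ⊙ (Hz)] with Rademacher vectors z (illustrative only, not the AdaHessian implementation):

```python
import numpy as np

def hutchinson_diag(hvp, n, samples=10, seed=0):
    """Estimate diag(H) from Hessian-vector products via Hutchinson's method.

    hvp: callable v -> H @ v; n: dimension; samples: number of probes.
    """
    rng = np.random.default_rng(seed)
    est = np.zeros(n)
    for _ in range(samples):
        z = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe
        est += z * hvp(z)                    # z ⊙ (H z) has expectation diag(H)
    return est / samples
```

For an exactly diagonal Hessian every probe recovers the diagonal exactly; off-diagonal entries only add zero-mean noise that averages out.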
An accelerated active-set algorithm for a quadratic semidefinite program with general constraints
Learning networks using Hessian-based optimization in PyTorch
A curated list of resources for second-order stochastic optimization methods in ML
FOSI library for improving first order optimizers with second order information
Concepts and algorithms in core learning theory
Second-order optimization for federated learning, implemented in PyTorch (IEEE Transactions on Parallel and Distributed Systems, 2022)
Matrix-multiplication-only KFAC; code for the ICML 2023 paper Simplifying Momentum-based Positive-definite Submanifold Optimization with Applications to Deep Learning
An efficient and easy-to-use Theano implementation of the stochastic Gauss-Newton method for training deep neural networks.
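The Gauss-Newton idea is to replace the Hessian of a least-squares objective 0.5·||r(x)||² with JᵀJ, where J is the Jacobian of the residuals. A deterministic NumPy sketch of the step (not the repository's stochastic Theano code):

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=20):
    """Gauss-Newton for nonlinear least squares (illustrative sketch).

    residual: callable x -> r(x); jacobian: callable x -> J(x).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        # The Gauss-Newton step solves min_p ||J p + r||, i.e. J^T J p = -J^T r
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + p
    return x
```

For zero-residual problems (e.g. fitting a·exp(b·t) to noiseless data) Gauss-Newton converges locally at a quadratic rate, matching full Newton.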
Second-Order Convergence of Alternating Minimizations
Hessian-based stochastic optimization in TensorFlow and Keras