
# Optimizer-PyTorch

A package of optimizers implemented with PyTorch.

## Optimizer List

- SGD: stochastic gradient descent
- Adam: A Method for Stochastic Optimization
- AdaBound: Adaptive Gradient Methods with Dynamic Bound of Learning Rate
- RAdam: On the Variance of the Adaptive Learning Rate and Beyond
- Lookahead: Lookahead Optimizer: k steps forward, 1 step back (see the wrapper sketch after this list)
- Optimistic
  - OptimAdam
  - OMD
  - ExtraGradient
- STORM: STOchastic Recursive Momentum
- Others
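Optimizers like these are typically built on PyTorch's standard `torch.optim.Optimizer` interface, so they drop into the usual training loop. As a point of reference, here is a minimal sketch of that pattern for plain SGD; it is illustrative only, not code from this repository, and the class name `PlainSGD` is hypothetical.

```python
import torch


class PlainSGD(torch.optim.Optimizer):
    """Minimal sketch of the torch.optim.Optimizer pattern:
    plain SGD update p <- p - lr * grad (no momentum, no weight decay)."""

    def __init__(self, params, lr=1e-2):
        super().__init__(params, defaults=dict(lr=lr))

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()  # re-evaluate the model to get fresh gradients
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    p.add_(p.grad, alpha=-group["lr"])  # gradient descent step
        return loss


# Usage follows the usual PyTorch training-loop pattern.
model = torch.nn.Linear(10, 1)
optimizer = PlainSGD(model.parameters(), lr=0.1)
x, y = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(5):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()
```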
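Lookahead in particular is a wrapper around an inner optimizer: the inner optimizer takes k fast steps, then the slow weights are pulled a fraction alpha toward the fast weights and the fast weights are reset to them. A rough sketch of that idea (not the implementation in this repository; the `Lookahead` class below is an assumed name) could look like:

```python
import torch


class Lookahead:
    """Sketch of the Lookahead idea: k fast steps with an inner optimizer,
    then move the slow weights toward the fast weights by a factor alpha."""

    def __init__(self, inner, k=5, alpha=0.5):
        self.inner = inner          # any torch.optim.Optimizer, e.g. Adam
        self.k = k
        self.alpha = alpha
        self.counter = 0
        # Keep a detached copy of the "slow" weights per parameter.
        self.slow = {
            p: p.detach().clone()
            for group in inner.param_groups
            for p in group["params"]
        }

    def zero_grad(self):
        self.inner.zero_grad()

    def step(self, closure=None):
        loss = self.inner.step(closure)   # one fast step
        self.counter += 1
        if self.counter % self.k == 0:    # every k steps: "1 step back"
            with torch.no_grad():
                for p, slow in self.slow.items():
                    slow.add_(p.detach() - slow, alpha=self.alpha)
                    p.copy_(slow)         # reset fast weights to the slow weights
        return loss


# Example: wrap Adam as the inner (fast) optimizer.
model = torch.nn.Linear(10, 1)
optimizer = Lookahead(torch.optim.Adam(model.parameters(), lr=1e-3), k=5, alpha=0.5)
```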
