
Adai Optimizer (ICML 2022 Oral): adaptive momentum hyperparameters instead of adaptive learning rate #476

Open
zeke-xie opened this issue Nov 5, 2022 · 0 comments

zeke-xie commented Nov 5, 2022

Hi,

Our ICML 2022 Oral paper proposed a novel adaptive optimization method named Adaptive Inertia (Adai), which uses parameter-wise inertia (treating the momentum hyperparameter as a vector) to accelerate saddle-point escape while provably selecting flat minima as well as SGD does.

Adai thus combines the respective advantages of Adam (fast saddle-point escape) and SGD (flat-minima selection).

The official implementation can be found at https://github.com/zeke-xie/adaptive-inertia-adai.
It may be helpful for your project.
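To make the idea concrete, here is a minimal NumPy sketch of an Adai-style update. The key difference from Adam is that the second-moment estimate is used to adapt the per-parameter momentum coefficient beta1 rather than the learning rate. The function name `adai_step` and the hyperparameter defaults are illustrative assumptions, not the official implementation; see the linked repository for the real thing.

```python
import numpy as np

def adai_step(theta, grad, state, lr=1e-3, beta0=0.1, beta2=0.99, eps=1e-3):
    """One Adai-style update (illustrative sketch, not the official code).

    The second-moment estimate v (as in Adam) adapts the element-wise
    momentum coefficient beta1: parameters with larger gradient variance
    get *less* inertia, accelerating saddle-point escape.
    """
    state["t"] += 1
    t = state["t"]

    # Exponential moving average of squared gradients, bias-corrected.
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad**2
    v_hat = state["v"] / (1 - beta2**t)

    # Parameter-wise inertia: beta1 is a vector, clipped to [0, 1 - eps].
    beta1 = np.clip(1.0 - beta0 * v_hat / v_hat.mean(), 0.0, 1.0 - eps)

    # Momentum update with the adaptive, element-wise beta1.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    theta = theta - lr * state["m"]
    return theta, state

# Usage: minimize a simple quadratic ||theta||^2.
theta = np.array([1.0, -2.0])
state = {"t": 0, "m": np.zeros(2), "v": np.zeros(2)}
for _ in range(200):
    grad = 2.0 * theta           # gradient of ||theta||^2
    theta, state = adai_step(theta, grad, state, lr=0.1)
```

Note that only the momentum coefficient is adaptive here; the step size `lr` stays fixed for all parameters, which is exactly the design choice the paper argues for.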
