
[FEATURE] save optimizer and amp state into checkpoint #562

Open
daden-ms opened this issue Feb 18, 2020 · 0 comments
Labels
enhancement New feature or request

Comments

@daden-ms
Contributor

Description

Currently, in common.py for the transformer models, a checkpoint saves only the model state; the optimizer and amp state are not saved. We could consider saving this information as well, following
https://github.com/NVIDIA/apex#checkpointing
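
For reference, a minimal sketch of the checkpointing pattern from the linked apex README; the helper names `save_checkpoint`/`load_checkpoint` are hypothetical and not part of common.py, and the model/optimizer are assumed to have been set up with `amp.initialize`:

```python
import torch
from apex import amp

def save_checkpoint(model, optimizer, path):
    # Save model, optimizer, and amp (loss-scaler) state in one file
    checkpoint = {
        "model": model.state_dict(),
        "optimizer": optimizer.state_dict(),
        "amp": amp.state_dict(),
    }
    torch.save(checkpoint, path)

def load_checkpoint(model, optimizer, path, opt_level="O1"):
    # amp.initialize must be re-run with the same opt_level before restoring state
    model, optimizer = amp.initialize(model, optimizer, opt_level=opt_level)
    checkpoint = torch.load(path)
    model.load_state_dict(checkpoint["model"])
    optimizer.load_state_dict(checkpoint["optimizer"])
    amp.load_state_dict(checkpoint["amp"])
    return model, optimizer
```
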

Expected behavior with the suggested feature

Other Comments

@daden-ms daden-ms added the enhancement New feature or request label Feb 18, 2020