
Does current codebase support fp16/bf16? #100

Open · jzhang38 opened this issue May 4, 2023 · 1 comment

@jzhang38 commented May 4, 2023

I attempted to execute the code while enabling bf16 but came across some error messages. Furthermore, I conducted a search for the keywords "bf16" or "fp16" within this repository but found no results. Could this imply that this codebase does not presently support low-precision training?

@albertfgu (Contributor) commented

I've used mixed precision (fp16) in the past through PyTorch Lightning's automatic features. It's as simple as passing a flag to the Trainer: https://lightning.ai/docs/pytorch/stable/common/precision_intermediate.html
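
For reference, a minimal sketch of what that looks like, assuming the Lightning 2.x precision strings and placeholder names (not actual classes from this repo):

```python
import pytorch_lightning as pl

# Mixed precision is enabled entirely through the Trainer flag; no model changes needed.
trainer = pl.Trainer(
    accelerator="gpu",
    devices=1,
    precision="bf16-mixed",  # or "16-mixed" for fp16 autocast (Lightning 2.x strings)
)

# `model` and `datamodule` are hypothetical stand-ins for whatever LightningModule /
# DataModule the repo's training script constructs:
# trainer.fit(model, datamodule=datamodule)
```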
