Multi-machine training #1

Open
launchauto opened this issue May 13, 2021 · 0 comments

Comments

@launchauto

Thanks for your work!
As shown in the markdown file, Transformer-SSL can currently be pretrained with 8 GPUs on a single node.
Do you have scripts for multi-machine training? I would like to pretrain it with 64 GPUs across 8 machines.
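For context, this is roughly the kind of multi-node setup I have in mind — a minimal PyTorch sketch, not something taken from this repository; the entry point name, the `MASTER_ADDR`/`MASTER_PORT`/`NODE_RANK` environment variables, and the 8 nodes × 8 GPUs layout below are my own assumptions:

```python
import argparse
import os

import torch
import torch.distributed as dist


def init_multi_node(local_rank: int, gpus_per_node: int = 8, num_nodes: int = 8):
    """Generic multi-node setup sketch: 8 nodes x 8 GPUs = 64 processes.

    Assumes a launcher (e.g. torch.distributed.launch) has already set
    MASTER_ADDR, MASTER_PORT and NODE_RANK for every process; these are
    placeholders, not names taken from the Transformer-SSL repo.
    """
    node_rank = int(os.environ.get("NODE_RANK", "0"))
    world_size = gpus_per_node * num_nodes             # 64 in this example
    global_rank = node_rank * gpus_per_node + local_rank

    torch.cuda.set_device(local_rank)                  # one process per GPU
    dist.init_process_group(
        backend="nccl",                                # NCCL for multi-GPU / multi-node
        init_method="env://",                          # reads MASTER_ADDR / MASTER_PORT
        world_size=world_size,
        rank=global_rank,
    )
    return global_rank, world_size


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--local_rank", type=int, default=0)  # set by the launcher
    args = parser.parse_args()
    rank, world = init_multi_node(args.local_rank)
    print(f"rank {rank}/{world} initialized")
```

In other words, I am asking whether the single-node launch command in the README can simply be repeated on each of the 8 machines with the usual `torch.distributed.launch` flags (`--nnodes`, `--node_rank`, `--master_addr`, `--master_port`), or whether a dedicated multi-machine script is needed.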
