
About batch_size during training #58

Open
zhangziwenHIT opened this issue Jan 10, 2024 · 0 comments
zhangziwenHIT commented Jan 10, 2024

You mentioned in the paper that training with batch_size == 8 on four 2080Ti GPUs takes about three days.
[screenshot: training setup from the paper]
But when I set batch_size >= 4 on a V100, I get an error.
[screenshot: error message]
The problem is the same as the one reported in issue #23.
[screenshot: error from issue #23]
Is there something wrong with my configuration? As far as I know, the 2080Ti has 11 GB of memory, while the V100 I used has 32 GB, so this puzzles me. Looking forward to your answer, thank you!
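One thing worth checking (a hypothetical sketch, not code from this repository): under data parallelism, a global batch size is split evenly across the available GPUs, so per-GPU memory use depends on batch_size / num_gpus, not on batch_size alone. The helper name below is illustrative.

```python
# Hypothetical sketch: how an effective (global) batch size divides
# across GPUs under data parallelism. Illustrative only.

def per_gpu_batch(batch_size: int, num_gpus: int) -> int:
    """Samples each GPU processes per step when the batch is split evenly."""
    if batch_size % num_gpus != 0:
        raise ValueError(
            f"batch_size={batch_size} is not divisible by num_gpus={num_gpus}"
        )
    return batch_size // num_gpus

# Paper setting: batch_size 8 split over four 2080Ti GPUs.
print(per_gpu_batch(8, 4))  # -> 2 samples per 11 GB GPU
# batch_size 4 on a single V100: all 4 samples land on the one 32 GB GPU.
print(per_gpu_batch(4, 1))  # -> 4 samples on one GPU
```

If the repository's config treats batch_size as a per-GPU value (an assumption to verify against the training script), the comparison between the two setups would change accordingly.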
