
16GB-VRAM run Chat-UniVi-7B-v1.5 model? #37

Open
joseph16388 opened this issue May 13, 2024 · 1 comment

Comments

@joseph16388

Can a 16GB-VRAM GPU run the Chat-UniVi-7B-v1.5 model?
Thanks.

@jpthu17
Member

jpthu17 commented May 22, 2024

16GB of VRAM doesn't seem like enough for training the model; you could try using LoRA and reducing the batch size.
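To see why LoRA helps so much, here is a rough sketch of the trainable-parameter count for a single low-rank adapter (the 4096x4096 projection size and rank 16 are illustrative assumptions, not Chat-UniVi's actual configuration):

```python
def lora_trainable_params(d_in, d_out, rank):
    # LoRA freezes the original d_in x d_out weight matrix and learns
    # only two low-rank factors of shapes (d_in, rank) and (rank, d_out)
    return rank * (d_in + d_out)

full = 4096 * 4096                            # full projection: 16,777,216 weights
lora = lora_trainable_params(4096, 4096, 16)  # adapter: 131,072 weights
print(lora / full)  # ~0.0078: under 1% of the matrix is trained
```

Because only the adapter weights need gradients and optimizer states, the bulk of the per-parameter training overhead disappears; combined with a small batch size (and gradient accumulation if needed), this is what makes fine-tuning on a 16GB card plausible at all.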

For model inference, 16GB-VRAM is sufficient.
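A back-of-the-envelope memory estimate makes both claims concrete (assuming fp16 weights and a standard Adam optimizer; actual usage also depends on activations, KV cache, and framework overhead):

```python
GB = 1e9

def full_training_vram_gb(n_params):
    # fp16 weights (2 B) + fp16 gradients (2 B) + fp32 Adam moments (8 B)
    # per parameter, before counting activations
    return n_params * (2 + 2 + 8) / GB

def inference_vram_gb(n_params):
    # fp16 weights only; activations and KV cache add a few more GB
    return n_params * 2 / GB

print(full_training_vram_gb(7e9))  # 84.0 -> far above 16 GB
print(inference_vram_gb(7e9))      # 14.0 -> a 7B model's weights fit in 16 GB
```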
