
Minimum GPU requirements #29

Open
gusanmaz opened this issue Jul 21, 2023 · 4 comments

Comments

@gusanmaz

I get a CUDA out-of-memory error on an RTX 3060 GPU with 12 GB of VRAM when I run:

```
python demo/demo.py --input demo/examples/coco.jpg --output demo/coco_pred.jpg --vocab "black pickup truck, pickup truck; blue sky, sky"
```

The last lines of the error are as follows:

```
output_features[k] = torch.zeros(
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 176.00 MiB (GPU 0; 11.73 GiB total capacity; 8.91 GiB already allocated; 136.75 MiB free; 9.09 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
```

What are the minimum requirements for running the inference code? Is there a way to avoid these errors on less powerful systems? Is it possible to perform inference on the CPU?
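
The traceback itself points at one knob worth trying; here's a minimal sketch, assuming fragmentation is part of the problem (the 128 MiB split size is a guess, not a tested value):

```python
# Sketch of the allocator tweak the OOM message itself suggests; untested
# here, and the 128 MiB split size is an assumption, not a known-good value.
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # import after setting the variable so the allocator picks it up
print(torch.cuda.is_available())
```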

Thanks!

@tejassp2002

Hey @gusanmaz, did you get any information on the GPU requirements? Thanks.

@cipri-tom

Yeah, they say you need at least 13 GB of VRAM. Here's an excerpt from running ODISE on a T4 GPU: 13.2 GB.

[Screenshot 2023-08-07: GPU memory readout while running ODISE on a T4]
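
If you want to reproduce that number on your own card, a minimal sketch using PyTorch's built-in peak counters (the inference call is a placeholder, not ODISE's API):

```python
import torch

# Sketch: bracket one inference pass with PyTorch's peak-memory counters.
torch.cuda.reset_peak_memory_stats()
# ... run one ODISE inference pass here (placeholder) ...
print(f"peak allocated: {torch.cuda.max_memory_allocated() / 2**30:.2f} GiB")
print(f"peak reserved:  {torch.cuda.max_memory_reserved() / 2**30:.2f} GiB")
```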

@tejassp2002

Hi! I can see that Colab offers 15 GB of GPU RAM, but whenever I run the Colab code the instance still crashes. Any workarounds for this?

@cipri-tom commented Aug 7, 2023

@tejassp2002 Probably because, during loading, the model briefly needs more than 12 GB of RAM (Colab has 12 GB), and Google provides no swap to absorb the spike. This is system RAM, not GPU RAM.

[Screenshot 2023-08-07: Colab system RAM readout]
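
To confirm that system RAM is the bottleneck, a minimal check (assuming psutil, which Colab ships with):

```python
import psutil

# Sketch: report total and available system RAM; the free Colab tier
# totals roughly 12 GB, which the model load transiently exceeds.
vm = psutil.virtual_memory()
print(f"total RAM:     {vm.total / 2**30:.1f} GiB")
print(f"available RAM: {vm.available / 2**30:.1f} GiB")
```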
