
Batch to Inference #440

Open
GGital opened this issue Apr 23, 2024 · 0 comments

Comments


GGital commented Apr 23, 2024

Hello everyone, I am currently running image-captioning inference with OFA-Huge fine-tuned on COCO over roughly 48k images, but it is very slow because I process one image per batch (about 1 image/sec, so the full dataset would take roughly 13 hours). Is there any way to run batch inference on my test set while still keeping beam-search generation?
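For reference, the usual pattern is to group images into fixed-size chunks and pass the whole chunk through `generate()` at once; beam search is unaffected, since beams are tracked per sequence within the batch. Below is a minimal sketch of that pattern. The `model` and `processor` objects are assumptions standing in for whatever wrapper you load the OFA checkpoint with (e.g. a Hugging Face-style processor/model pair), not the OFA repo's actual API.

```python
# Sketch of batched beam-search captioning.
# NOTE: `model` and `processor` are hypothetical placeholders for the
# objects you get from loading the OFA checkpoint; adapt the calls to
# your actual loading code.

def batches(items, batch_size):
    """Yield successive fixed-size chunks of a list (last may be shorter)."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def caption_batch(model, processor, images, num_beams=5, max_length=30):
    """Caption a list of PIL images in one forward pass, keeping beam search."""
    inputs = processor(images=images, return_tensors="pt")  # (B, C, H, W)
    ids = model.generate(inputs["pixel_values"],
                         num_beams=num_beams,
                         max_length=max_length)
    return processor.batch_decode(ids, skip_special_tokens=True)

# Driver sketch: at batch_size 16, 48k images means ~3000 generate() calls
# instead of 48000, which is where most of the speedup comes from.
#
# captions = []
# for chunk in batches(image_paths, 16):
#     imgs = [Image.open(p).convert("RGB") for p in chunk]
#     captions.extend(caption_batch(model, processor, imgs))
```

The main thing to watch is memory: beam search multiplies decoder state by `num_beams`, so the workable batch size is smaller than for greedy decoding; start small (e.g. 8–16) and increase until the GPU is saturated.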
