GPU usage with Large Data during prediction #62

Open
moroclash opened this issue Sep 16, 2021 · 1 comment
Comments

@moroclash

I have a performance issue using this model. The repo only provides one way to use it: passing a single text along with the aspects to assess. That works great, but with a huge number of samples it becomes a bottleneck. I used it to process a lot of samples and couldn't get above 15% GPU utilization; it seems the processing is not parallelized, and there is no way to do batch processing.

Is there any way to make it faster, or to do batch processing instead of feeding samples one by one? I'd really appreciate your suggestions.
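
To illustrate what batch processing would mean here, below is a minimal sketch using a toy tf.keras model. The model, layer sizes, and input shapes are placeholders for illustration only, not this package's actual model or API; the point is the difference between one forward pass per sample and one forward pass per batch.

```python
import numpy as np
import tensorflow as tf

# Toy stand-in model: the real package wraps a transformer, but the
# batching principle is the same for any tf.keras model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(128,)),
    tf.keras.layers.Dense(3, activation="softmax"),
])

samples = np.random.rand(1_000, 128).astype("float32")

# Slow pattern (what this issue describes): one forward pass per sample,
# so the GPU spends most of its time idle between tiny kernels.
slow_preds = [model(samples[i:i + 1]) for i in range(len(samples))]

# Batched pattern: model.predict groups samples into batches of 256 and
# runs each batch as a single forward pass, keeping the GPU busy.
fast_preds = model.predict(samples, batch_size=256)
```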

@xesaad

xesaad commented Nov 12, 2021

@moroclash unfortunately not an answer, but a question from another user looking for advice on running this model on GPU with a very large amount of data. Even though you didn't see much speedup with the GPU, was it relatively easy to run the package on GPU? As far as I understand, it would just involve uninstalling tensorflow and installing tensorflow-gpu instead. Or did you run into any other difficulties setting up GPU computation?
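
As a quick sanity check on the GPU setup itself (generic TensorFlow 2.x, not specific to this package, and assuming that is the version installed):

```python
import tensorflow as tf

# If this prints an empty list, TensorFlow cannot see the GPU at all,
# so the installation (driver, CUDA/cuDNN, tensorflow-gpu package) is
# the problem rather than the model code.
print(tf.config.list_physical_devices("GPU"))

# Optionally log which device each op actually runs on.
tf.debugging.set_log_device_placement(True)
```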
