
Inference Taking Forever #50

Open

pentanol2 opened this issue Sep 10, 2022 · 2 comments

Comments

@pentanol2

I am trying to deblur a 150-frame video on a machine with two NVIDIA RTX A5000 GPUs, using the GoPro deblur model, and I have already reduced the tile value. The operation is still taking forever. How can I solve this? Is an NVIDIA RTX A5000 enough for inference?
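For reference, tiling just means splitting each frame into overlapping patches, running the model on each patch, and averaging the overlaps back together, so a smaller tile lowers memory use but adds more forward passes per frame. A rough PyTorch sketch of the idea (the `model`, `tile`, and `overlap` names below are placeholders, not this repo's actual interface):

```python
import torch

# Sketch of spatial tiled inference: split each frame into overlapping tiles,
# run the model on each tile, and average the overlapping outputs.
# `model`, `tile`, and `overlap` are illustrative, not this repo's real API.
@torch.no_grad()
def tiled_inference(model, frames, tile=256, overlap=32):
    # frames: (T, C, H, W) tensor already on the GPU; assumes tile > overlap
    t, c, h, w = frames.shape
    out = torch.zeros_like(frames)
    weight = torch.zeros(1, 1, h, w, device=frames.device)
    stride = tile - overlap
    for y in range(0, h, stride):
        for x in range(0, w, stride):
            y0 = min(y, max(h - tile, 0))  # clamp so the tile stays in bounds
            x0 = min(x, max(w - tile, 0))
            patch = frames[:, :, y0:y0 + tile, x0:x0 + tile]
            out[:, :, y0:y0 + tile, x0:x0 + tile] += model(patch)
            weight[:, :, y0:y0 + tile, x0:x0 + tile] += 1
    return out / weight  # average where tiles overlap
```

If the run is orders of magnitude slower than expected, it is also worth confirming the model is actually on the GPU (e.g. `next(model.parameters()).is_cuda`) rather than silently falling back to CPU.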

@asrlhhh

asrlhhh commented Oct 15, 2022

Same problem here with an A6000.

@santurini

Same here for super-resolution. What settings were used during inference to obtain the paper's 243 ms runtime?
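For comparison, reported GPU runtimes are usually measured per forward pass with a few warm-up iterations and explicit CUDA synchronization; a generic timing sketch (the `model` and input shape are placeholders, not the paper's actual test setting):

```python
import time
import torch

# Generic GPU timing sketch: warm up, synchronize, then average over runs.
# `model` and the input shape are placeholders; substitute the real test clip.
model = model.eval().cuda()
x = torch.randn(1, 5, 3, 256, 256, device="cuda")  # (batch, frames, C, H, W)

with torch.no_grad():
    for _ in range(3):               # warm-up passes (kernel compilation, allocator)
        model(x)
    torch.cuda.synchronize()
    start = time.perf_counter()
    runs = 10
    for _ in range(runs):
        model(x)
    torch.cuda.synchronize()         # wait for all GPU work before stopping the clock

print(f"avg runtime: {(time.perf_counter() - start) / runs * 1000:.1f} ms")
```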
