
hello, i would like to know why the inference is so slow? #1

Open
zzzzzz0407 opened this issue Jan 6, 2018 · 3 comments

Comments

@zzzzzz0407

I modified the input size to 416×416, but the FPS is only about 13. Why?

@cory8249 (Owner) commented Jan 7, 2018

What is your CPU / GPU?

On my PC with 8700K + 1080Ti,

```
416 x 416
processed 40 images in 1.147 seconds
0.029 sec/image
34.88 fps

1216 x 352
processed 40 images in 1.526 seconds
0.038 sec/image
26.21 fps
```
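The per-image time and FPS above are just the totals divided out; a quick sanity check of the arithmetic (plain Python, numbers copied from the comment):

```python
# Verify the benchmark arithmetic: sec/image = total / count, fps = count / total.
for label, total_sec, n_images in [("416 x 416", 1.147, 40),
                                   ("1216 x 352", 1.526, 40)]:
    sec_per_image = total_sec / n_images
    fps = n_images / total_sec
    print(f"{label}: {sec_per_image:.3f} sec/image, {fps:.2f} fps")
```

The tiny discrepancies versus the quoted figures (e.g. 34.87 vs 34.88 fps) come from rounding in the printed totals.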

BTW, kitti_detect.py is simple demo code; the per-frame overhead (image loading, preprocessing, drawing) may dominate in your case.
In my other test script, this model can reach ~90 fps @ 1216x352 with batch=1.
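A minimal way to separate model time from demo-script overhead is to time only the inference call, after a few warmup iterations. The sketch below uses a hypothetical `infer` stand-in for the network forward pass (it is not from this repo); on a real GPU you would also synchronize before reading the clock (e.g. `torch.cuda.synchronize()` in PyTorch), since CUDA kernels launch asynchronously:

```python
import time

def infer(image):
    # Hypothetical stand-in for the network forward pass.
    time.sleep(0.001)
    return image

def benchmark(images, warmup=5):
    """Time only the inference calls, after warmup iterations."""
    for img in images[:warmup]:   # warmup: weight loading, caches, cudnn autotune
        infer(img)
    start = time.perf_counter()
    for img in images:
        infer(img)
    # On GPU, call torch.cuda.synchronize() here before reading the clock.
    elapsed = time.perf_counter() - start
    return elapsed / len(images), len(images) / elapsed

sec_per_image, fps = benchmark([object()] * 40)
print(f"{sec_per_image:.4f} sec/image, {fps:.2f} fps")
```

If this isolated timing is much faster than the end-to-end demo FPS, the bottleneck is in the surrounding script rather than the model.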

@zzzzzz0407 (Author)

My GPU is a Titan X, and I did not modify any code; I don't know what's wrong.

@zzzzzz0407 (Author)

Or could this warning be the main issue?
[screenshot of the warning]
