
Inference time on Android? #189

Open
serviceberry3 opened this issue Nov 23, 2020 · 3 comments

@serviceberry3

Hi,

You say a feed-forward pass through this model is very fast: 2 ms on a Titan Xp. Do you think the speed would hold up on a recent Android device (like a Google Pixel 4)? I'm looking to do real-time inference using TensorFlow PoseNet and this model, and I'd like a speed of 20-30 FPS.

Thanks a lot! Great work.
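
A minimal sketch of how that on-device latency could be measured, assuming the baseline model has been converted to a TFLite file; the 32-float input / 48-float output sizes are assumptions based on the paper's 16-joint 2D input and 3D output vectors, and should be adjusted to the actual converted model:

```kotlin
import android.os.SystemClock
import org.tensorflow.lite.Interpreter
import java.io.File
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Sketch: average per-inference latency of a TFLite model on-device.
// Input/output sizes (32 and 48 floats) are assumptions from the
// paper's 16-joint 2D-in / 3D-out vectors; adjust to your conversion.
fun averageLatencyMs(modelFile: File, runs: Int = 100): Double {
    val interpreter = Interpreter(modelFile)
    val input = ByteBuffer.allocateDirect(32 * 4).order(ByteOrder.nativeOrder())
    val output = ByteBuffer.allocateDirect(48 * 4).order(ByteOrder.nativeOrder())

    repeat(10) {                        // warm-up: delegate init, caches
        input.rewind(); output.rewind()
        interpreter.run(input, output)
    }
    val start = SystemClock.elapsedRealtimeNanos()
    repeat(runs) {
        input.rewind(); output.rewind()
        interpreter.run(input, output)
    }
    val totalMs = (SystemClock.elapsedRealtimeNanos() - start) / 1e6
    interpreter.close()
    return totalMs / runs               // average latency in ms
}
```

At 20-30 FPS, the full pipeline (PoseNet 2D detection plus this lifting model) has a budget of roughly 33-50 ms per frame, so the number this returns only covers the lifting step.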

@serviceberry3 (Author) commented Jan 13, 2021

Or do you know offhand whether Facebook's VideoPose3D would be slower or faster than your model? I tried running VideoPose3D on Android, but it's really slow; I'm wondering whether I should try your model as well...
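
For a like-for-like comparison, a TorchScript export of VideoPose3D could be timed the same way through PyTorch Mobile. A rough sketch, where the module path and the (1, 243, 17, 2) input shape are assumptions (the 243-frame receptive-field model with 17 COCO keypoints) that must match the actual export:

```kotlin
import android.os.SystemClock
import org.pytorch.IValue
import org.pytorch.Module
import org.pytorch.Tensor

// Sketch: time a TorchScript export of VideoPose3D via PyTorch Mobile.
// The (1, 243, 17, 2) input shape assumes the 243-frame receptive-field
// model with 17 COCO keypoints; match it to your actual export.
fun videoPose3dLatencyMs(modulePath: String, runs: Int = 50): Double {
    val module = Module.load(modulePath)
    val shape = longArrayOf(1, 243, 17, 2)
    val input = Tensor.fromBlob(FloatArray(243 * 17 * 2), shape)

    repeat(5) { module.forward(IValue.from(input)) }   // warm-up
    val start = SystemClock.elapsedRealtimeNanos()
    repeat(runs) { module.forward(IValue.from(input)) }
    return (SystemClock.elapsedRealtimeNanos() - start) / 1e6 / runs
}
```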

@una-dinosauria (Owner)

Hi @serviceberry3 ,

Sorry, I don't know anything about inference on mobile devices. I'll leave the issue open in case someone else knows the answer.

Cheers,

@zuozhen commented Dec 8, 2021

> Or do you know offhand whether Facebook's VideoPose3D would be slower or faster than your model? I tried running VideoPose3D on Android, but it's really slow; I'm wondering whether I should try your model as well...

Hi @serviceberry3 ,
Have you tried this on Android?
Thanks.
