BlazePose doesn't work with quantized Int8 model #33

Open
enricoantonini84 opened this issue Aug 12, 2022 · 1 comment

@enricoantonini84

Hi,
I've tried running the pose detection script with BlazePose on the included images, on Debian 11, following the installation instructions, and it works fine. However, if I run the script with the -q parameter to load the Int8 model, no pose is detected in any of the images.
Is there anything else I need to do to make detection work with the quantized model?

Regards
Enrico

@terryky
Owner

terryky commented Aug 13, 2022

Yes, you are right.

Unfortunately, some of the quantized models included in this repository may have low accuracy.
I think there are several techniques for improving the accuracy of quantized models, but I am not familiar with them.
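For reference, one common technique is TensorFlow Lite post-training full-integer quantization with a representative calibration dataset, which usually preserves more accuracy than quantizing without calibration data. The sketch below is illustrative only and not taken from this repository: the SavedModel path, output filename, and the 256x256x3 input shape are assumptions, and the random calibration data would need to be replaced with real preprocessed pose images.

```python
# Sketch: TFLite post-training full-integer quantization with a
# representative dataset. Paths and input shape are placeholders.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Feed ~100 realistic, preprocessed input images so the converter can
    # calibrate activation ranges; random data is only a stand-in here.
    for _ in range(100):
        sample = np.random.rand(1, 256, 256, 3).astype(np.float32)
        yield [sample]

converter = tf.lite.TFLiteConverter.from_saved_model("blazepose_saved_model")  # hypothetical path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("blazepose_int8.tflite", "wb") as f:
    f.write(converter.convert())
```

How well this works depends heavily on the calibration images matching the real input distribution (same preprocessing and value range as at inference time); quantization-aware training is another option when post-training quantization is not accurate enough.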
