Converting Scaled YOLOv4-CSP from ONNX to TensorRT #31

Open
VladBlat opened this issue Aug 3, 2021 · 0 comments

Comments

VladBlat commented Aug 3, 2021

Hello,

I'm trying to convert an ONNX model with a dynamic batch size, exported from Darknet (https://github.com/WongKinYiu/ScaledYOLOv4), to a TensorRT engine. I need to build a calibrated INT8 engine with a static batch size of 2.

I use:
python onnx_to_tensorrt.py -o int8_2.trt.engine --fp16 --int8 --calibration-data /data/ultra/trt/calib_ds -p preprocess_yolo -v --explicit-batch

However, the calibrated engine with batch size 2 has inference problems. An INT8 engine with batch size 1 and an FP16 engine with batch size 2 both work correctly.

Where does this problem come from, and how can I solve it?
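For context, one common cause of this symptom with explicit-batch INT8 builds is that calibration runs with a different shape than inference. In the TensorRT Python API, the optimization profile and the calibration profile should both pin the same static batch. A minimal sketch, assuming this is the kind of build `onnx_to_tensorrt.py` performs; the input tensor name `input`, the 512x512 resolution, the file names, and the calibrator variable are all hypothetical, not taken from this issue:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.VERBOSE)

builder = trt.Builder(TRT_LOGGER)
# Explicit-batch network, matching the --explicit-batch flag above.
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("model.onnx", "rb") as f:  # assumed file name
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parse failed")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)
config.set_flag(trt.BuilderFlag.INT8)
# 'calibrator' is a placeholder for your IInt8EntropyCalibrator2
# implementation that feeds batches from the calibration dataset.
config.int8_calibrator = calibrator

# Pin the dynamic batch dimension to a static 2 for BOTH the runtime
# profile and the calibration profile, so calibration sees the same
# shape the engine will run with.
profile = builder.create_optimization_profile()
shape = (2, 3, 512, 512)  # assumed input name and resolution
profile.set_shape("input", min=shape, opt=shape, max=shape)
config.add_optimization_profile(profile)
config.set_calibration_profile(profile)

engine_bytes = builder.build_serialized_network(network, config)
with open("int8_2.trt.engine", "wb") as f:
    f.write(engine_bytes)
```

If the script calibrates at batch size 1 but builds the engine for batch size 2, the INT8 dynamic ranges can be computed against the wrong shape, which would explain why only the batch-2 INT8 engine misbehaves.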
