TRT 8.0 on Jetson converts successfully, but TRT 8.6 on x86 fails to convert #3851
Comments
I tried increasing the opset version and using onnx-simplifier, but it didn't help.
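For anyone hitting the same issue, the usual way to run onnx-simplifier is via its CLI entry point. This is a minimal sketch; `model.onnx` and `model_sim.onnx` are hypothetical file names, and the tool may still not resolve a parser failure that is specific to the TensorRT version.

```shell
# Install onnx-simplifier (provides the onnxsim module).
pip install onnxsim

# Simplify the graph: folds constants and removes redundant ops.
# model.onnx / model_sim.onnx are placeholder paths.
python -m onnxsim model.onnx model_sim.onnx
```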
You can try the latest version of TensorRT.
I want to ask why a low-version TRT can export the engine while a high version cannot.
I mean you should try the latest version of TensorRT. Note: the CUDA-X libraries differ across OS architectures and GPUs.
Upgrading TRT on a Jetson NX is usually difficult because of the DeepStream version dependency. I mainly want to export the TRT engine on my PC (Win11, TRT 8.5.1.7) and my server (Ubuntu 20.04, TRT 8.6). Can you give me some advice for this error? I have read more than three issues discussing it, but found no definitive answer. Thanks for your response!
It would be great if you could try TRT 10; if it still fails, please provide a reproduction.
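A reproduction for a conversion failure is usually just the trtexec command with verbose logging, plus the model. A minimal sketch, assuming `trtexec` is on the PATH and `model.onnx` is the failing model (placeholder name):

```shell
# Build an engine from the ONNX model with full parse/build logs.
# --verbose prints layer-by-layer output, which usually pinpoints
# the op that fails on TRT 8.5/8.6 but parsed fine on TRT 8.0.
trtexec --onnx=model.onnx --saveEngine=model.engine --verbose
```

Attaching the verbose log (and, if possible, the ONNX file) lets the maintainers see exactly which node the newer parser rejects.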
My ONNX model converts successfully on a Jetson NX with TRT 8.0 and inference works well.
But when I convert the same file on a 3090 (x86_64) with TRT 8.6 (Ubuntu 20.04), the conversion fails. I also tried TRT 8.5 on Win11, and it failed too. The error is