Error in torch2trt conversion in inference_segmentation.ipynb #1
Comments
Thanks for trying it out!
Thank you for your reply. I installed the latest commit, but DeepLab still could not run.
The interpolation plugin has environment-dependent problems (see NVIDIA-AI-IOT/torch2trt#274). There is no problem in your code.
Facing the same problem with the latest TensorRT 7 and the torch2trt plugin installation.
I only tried segmentation on the Xavier. I just tried running segmentation with a Jetson Nano as well, but I was stuck running the native PyTorch segmentation model. Will report if I get this working (JetPack 4.1, TensorRT 5). The steps I followed to set up the Xavier are as below:
Actually, by following this setup, I was able to run the torch2trt conversion on the Jetson Nano as well. Can you try building torchvision as above? I think that was the issue.
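The setup steps referenced above were not captured in this thread. As a rough sketch only, building torchvision from source on a Jetson (the branch here matches the torch 1.1.0 / torchvision 0.3.0 pairing mentioned later in the thread; versions and commands are assumptions, not the poster's actual steps) might look like:

```shell
# Assumed sketch: build torchvision 0.3.0 from source on a Jetson.
# Adjust the branch to match your installed torch version.
sudo apt-get install -y libjpeg-dev zlib1g-dev
git clone --branch v0.3.0 https://github.com/pytorch/vision torchvision
cd torchvision
sudo python3 setup.py install
```

Installing from source ensures torchvision is compiled against the same torch build that is on the device, which is often the difference between a pip wheel working and failing on arm64.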
Thank you for your reply. The fact that it works on both the Jetson Nano and the Xavier is very valuable information! I am now trying on a 2080 Ti with Ubuntu 18.04, and the problem seems to be in the generation of libtorch2trt.so.
The problem is that the torch-related libraries libc10.so, libc10_cuda.so, and libtorch.so cannot be linked. This seems to depend on the torch version and the g++ version. The following issues are likely to be helpful.
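One way to diagnose this kind of linking failure is to inspect the shared library's dependencies directly. The path below is an assumption for a typical pip install, and the LD_LIBRARY_PATH workaround is a common general fix, not something confirmed by this thread:

```shell
# Check whether libtorch2trt.so resolved its torch dependencies.
# The install path is an assumption; locate yours with
# `python3 -c "import torch2trt; print(torch2trt.__file__)"`.
ldd /usr/local/lib/python3.6/dist-packages/torch2trt/libtorch2trt.so | grep "not found"

# If libc10.so etc. are reported "not found", pointing the loader at
# torch's bundled libraries is one possible workaround:
export LD_LIBRARY_PATH=$(python3 -c "import torch, os; print(os.path.join(os.path.dirname(torch.__file__), 'lib'))"):$LD_LIBRARY_PATH
```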
I read the article you pointed me to about installing torch. You may have installed torch 1.1.0. I think we should use torch 1.1.0 until torch2trt supports torch 1.4.0.
Yes, I use torch 1.1.0 and torchvision 0.3.0 on the Jetson Nano. For the Xavier, I used torch 1.3.0 with NVIDIA-built binaries.
The Xavier worked with torch 1.3.0. This is great information, thanks for your contribution!
@flow-dev It may be informative to open an issue asking which versions of torch work with torch2trt, which will help others.
@kentaroy47 |
Reading all your feedback.
Thanks for the comments. |
Not working in the following environment in my case: Ubuntu 18.04, JetPack 4.3. I am trying to build on a Jetson Nano now.
@flow-dev |
Yes, I could simply install it with pip.
@flow-dev @kentaroy47 |
@hive-cas |
@kentaroy47 |
It is really weird that linking fails on amd64 Ubuntu 18.04 servers, since they should behave the same as the Xavier hardware.
Thanks for sharing great code!
However, I am getting an error when converting DeepLabV3 models with torch2trt.
--> "inference_segmentation.ipynb"
The backbone alone, such as resnet18, can be executed without problems.
-->python3 inference_tensorrt.py
Which torch2trt installation method or jetpack version are you using?
The environment is JetPack 4.2 on a Jetson Nano.
I installed with "Option 2 - With plugins (experimental)", referring to this site: https://github.com/NVIDIA-AI-IOT/torch2trt
Error log of "inference_segmentation.ipynb":
"inference_tensorrt.py" runs with no problem.
I hope you can give me some advice.
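For reference, a minimal sketch of the kind of conversion described above. The actual notebook code was not quoted in this thread, so model choice, input shape, and the wrapper are assumptions for illustration only:

```python
# Assumed reproduction of the failing DeepLabV3 conversion, not the
# original notebook code. Requires a CUDA device, TensorRT, and the
# torch2trt "Option 2" plugin install (the interpolate layers in the
# DeepLab head need the plugins).
import torch
from torch2trt import torch2trt
from torchvision.models.segmentation import deeplabv3_resnet50


class ModelWrapper(torch.nn.Module):
    """torchvision segmentation models return a dict of tensors;
    torch2trt traces plain tensors, so unwrap the 'out' entry."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        return self.model(x)['out']


model = ModelWrapper(deeplabv3_resnet50(pretrained=True)).cuda().eval()
x = torch.ones((1, 3, 224, 224)).cuda()

# This is the step reported to fail on some environments.
model_trt = torch2trt(model, [x])
```

A backbone-only model such as resnet18 avoids the interpolate layers entirely, which is consistent with the report that the backbone converts without problems.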