Description
Triton Inference Server with the docker image is not working on a Jetson Orin NX 16 GB running JetPack 5.1.1.

See attached log: server.log

Triton Information
tritonserver:23.01-py3 (docker image)

To Reproduce
docker run --rm --net host -it --runtime nvidia --gpus=1 -v ./triton:/models nvcr.io/nvidia/tritonserver:23.01-py3 tritonserver --model-repository=/models

Expected behavior
tritonserver should start successfully
Hi @allan-navarro, Triton's -py3-igpu* containers should support JetPack 6.x. For JP 5.1.1, @nv-kmcgill53, do you know if we have supported containers?
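For reference, a -py3-igpu container would be launched the same way as the original repro command, just with an iGPU image tag. A minimal sketch, assuming JetPack 6.x and a hypothetical 24.08-py3-igpu tag (check the NGC catalog for the tags actually published for your JetPack release):

```shell
# Sketch only: run an iGPU (Jetson) build of Triton Inference Server.
# The 24.08-py3-igpu tag is an assumption; substitute a tag listed on NGC.
docker run --rm --net host -it --runtime nvidia \
  -v "$(pwd)/triton:/models" \
  nvcr.io/nvidia/tritonserver:24.08-py3-igpu \
  tritonserver --model-repository=/models
```

Note the absolute bind-mount path via "$(pwd)": relative paths in -v are not accepted by all Docker versions, which is worth ruling out when the server fails to start.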
Marking this issue; same issue here.