This repo contains model compression (using TensorRT) and documentation for running various deep learning models on NVIDIA Jetson Orin and Nano (aarch64 architectures). (Updated May 26, 2024)
ComfyUI Depth Anything Tensorrt Custom Node (up to 5x faster)
C++/C TensorRT inference example for models created with PyTorch/JAX/TensorFlow
YOLOX TensorRT object detection
Deep Learning API and server in C++14, with support for Caffe, PyTorch, TensorRT, Dlib, NCNN, TensorFlow, XGBoost and TSNE
A lightweight, high-performance deep learning inference tool.
Inference code for `ogata-lab/eipl`: control robots with machine learning models on an edge computer.
C++ implementation of An Improved Association Pipeline for Multi-Person Tracking
Convert YOLO models to ONNX/TensorRT and add batched NMS.
Getting started with TensorRT-LLM using BLOOM as a case study
Convert ONNX models to TensorRT engines and run inference in containerized environments
Production-ready YOLOv8 segmentation deployment with TensorRT and ONNX support for CPU/GPU, including AI model integration guidance for Unitlab Annotate.
YOLOv5 inference with the TensorRT C++ API, packaged into a dynamic link library and called from Python.
Based on TensorRT 8.2.4, compares inference speed across different TensorRT APIs.
Based on TensorRT v8.2, builds the YOLOv5-v5.0 network from scratch to speed up YOLOv5-v5.0 inference.
An object tracking project with YOLOv5-v5.0 and DeepSORT, accelerated with C++ and TensorRT.
32 GB SD card image for Jetson Nano based on Ubuntu 20, compatible with the Ultralytics YOLOv8 library
Using TensorRT for Inference Model Deployment.
C++ TensorRT Implementation of NanoSAM
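Several of the projects above attach (batched) NMS to exported detection models. The core idea — keep the highest-scoring boxes and suppress overlapping duplicates — can be sketched in plain Python; this is an illustrative single-image version with hypothetical helper names, not code from any of the listed repositories, assuming boxes in [x1, y1, x2, y2] format:

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, iou_thresh=0.5):
    """Greedy non-maximum suppression.

    Visits boxes in descending score order and keeps a box only if it
    does not overlap an already-kept box above iou_thresh.
    Returns the indices of the kept boxes.
    """
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thresh for j in keep):
            keep.append(i)
    return keep

# Example: the second box heavily overlaps the first and is suppressed.
kept = nms([[0, 0, 10, 10], [1, 1, 10, 10], [20, 20, 30, 30]],
           [0.9, 0.8, 0.7])
print(kept)  # -> [0, 2]
```

Batched variants (such as TensorRT's NMS plugins) apply this per image and per class on the GPU; the greedy logic itself is the same.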