
How to use a custom .onnx or .tflite model with jetson.inference? #1824

Open
RamooIsTaken opened this issue Apr 5, 2024 · 1 comment
I have a custom model that I trained with the TensorFlow API, and I keep it in both .onnx and .tflite formats. Which of these can I use with jetson.inference to get better performance on my Jetson board?
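For reference, jetson-inference runs networks through TensorRT, which imports ONNX models but has no TFLite importer, so the .onnx copy is the relevant one here. Below is a minimal sketch of loading a custom classification ONNX model with the Python bindings; the model/label/image paths are hypothetical, and the input_0/output_0 blob names are assumptions that should be checked against the actual tensor names in the exported graph:

```python
# A minimal sketch, assuming the jetson-inference Python bindings and a
# classification model exported to ONNX. Paths and blob names below are
# placeholders -- verify the input/output tensor names of your own graph.
from jetson_inference import imageNet
from jetson_utils import loadImage

net = imageNet(argv=[
    "--model=resnet18.onnx",     # hypothetical path to the custom ONNX model
    "--labels=labels.txt",       # hypothetical class-label file
    "--input_blob=input_0",      # assumed name of the graph's input tensor
    "--output_blob=output_0",    # assumed name of the graph's output tensor
])

img = loadImage("test.jpg")      # hypothetical test image
class_id, confidence = net.Classify(img)
print(net.GetClassDesc(class_id), confidence)
```

On first load, TensorRT parses the ONNX file and builds an optimized engine for the device, which can take a few minutes; subsequent runs reuse the cached engine.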


RamooIsTaken commented Apr 15, 2024 via email
