I have a custom model that I trained using the TensorFlow API. I keep this model in both .onnx and .tflite formats. Which of these can I use with jetson.inference to improve performance on my Jetson board?
On Wednesday, 10 April 2024, maynp30 replied:

I think it is impossible to do without changing the code, but you can load your model manually by changing the corresponding files.

First of all, thank you for your response and interest.
What do you mean by the corresponding files? What exactly do I need to change, and in which file?
Thank you in advance for your answer, and have a nice day.
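For context, jetson.inference runs models through TensorRT, which imports .onnx but has no .tflite loader, so the ONNX copy is the one to use here. Recent versions can also load a custom ONNX model through constructor arguments, without editing the library's source. Below is a minimal sketch for a classification model; the file paths and the input_0/output_0 layer names are assumptions that must match your own export:

```python
import jetson.inference
import jetson.utils

# Load a custom classification model exported to ONNX.
# TensorRT parses the .onnx file and builds an optimized engine,
# which it caches next to the model (first run is slow, later runs are fast).
net = jetson.inference.imageNet(argv=[
    "--model=resnet18.onnx",    # assumed path to your exported ONNX model
    "--labels=labels.txt",      # assumed file with one class label per line
    "--input_blob=input_0",     # assumed name of the ONNX input layer
    "--output_blob=output_0",   # assumed name of the ONNX output layer
])

img = jetson.utils.loadImage("test.jpg")   # any test image on disk
class_id, confidence = net.Classify(img)
print(net.GetClassDesc(class_id), confidence)
```

If the model is a detector rather than a classifier, detectNet takes the analogous flags (--input-blob, --output-cvg, --output-bbox) in the same way.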