About export model #117
Comments
Hi, I am also interested. Have you tried it? |
@NVigne-cloud I haven't tried it :( Apparently we won't be getting an answer to our question any time soon. |
Currently I'm trying to convert to TensorRT. Converting to ONNX seems OK, but as always, the TensorRT conversion needs some extra work. It seems that 2D RoPE causes trouble, with the following error: `[graphShapeAnalyzer.cpp::analyzeShapes::1872] Error Code 4: Miscellaneous (IElementWiseLayer /backbone/net/blocks.2/attn/rope_glb_1/Mul: broadcast dimensions must be conformable)` |
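The "broadcast dimensions must be conformable" message follows the same elementwise broadcasting rule as NumPy/ONNX: comparing shapes from the trailing dimension, each pair of sizes must be equal or one of them must be 1. A minimal sketch of that check (the RoPE shapes below are hypothetical, just to illustrate the kind of mismatch TensorRT rejects):

```python
# TensorRT's "broadcast dimensions must be conformable" follows the same
# rule as NumPy/ONNX broadcasting: comparing shapes from the trailing
# dimension, each pair of sizes must be equal, or one of them must be 1.
def conformable(a_shape, b_shape):
    """Return True if two tensor shapes can be broadcast together."""
    for a, b in zip(reversed(a_shape), reversed(b_shape)):
        if a != b and a != 1 and b != 1:
            return False
    return True

# 2D RoPE multiplies token features by precomputed sin/cos tables; if the
# table baked into the traced graph assumes a different token count than
# the Mul's other input (hypothetical shapes below), the elementwise op
# fails exactly this check:
print(conformable((1, 197, 768), (1, 197, 768)))  # True: sizes line up
print(conformable((1, 197, 768), (1, 196, 768)))  # False: 197 vs 196, neither is 1
```

A common cause is exporting with one input resolution and running TensorRT with another, so the sin/cos table and the activations disagree on token count.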
Any update? Converting to ONNX is OK, but converting to TensorRT returned an error: `[12/18/2023-15:05:48] [E] [TRT] [graph.cpp::symbolicExecute::611] Error Code 4: Internal Error (/ScatterND: an IScatterLayer cannot be used to compute a shape tensor) ... ModelImporter.cpp:729: --- End node --- ModelImporter.cpp:732: ERROR: ModelImporter.cpp:185 In function parseGraph: [6] Invalid Node - /Pad` |
Is there anyone who can tell me how to convert a "pth" checkpoint to "onnx"? |
Hello, I've met the same error. Have you solved it? |
Just `pip install onnxslim==0.1.9.1` and then run `onnxslim model-input-no-mask.onnx slimmed_model-input-no-mask.onnx optimization --skip_fusion_patterns FusionGelu`. That solves this error, but onnx2trt then returns a new error that I haven't been able to solve.
|
Hello, I have a question related to exporting a model. Ultimately I'm interested in the TensorRT (.trt or .engine) format, but I'm also interested in ONNX (.onnx), since from ONNX I can build the TensorRT engine myself using NVIDIA's trtexec.
My question is: is it possible to export a trained model to ONNX?