
Export model to ONNX #24

Open
suyuzhang opened this issue Feb 18, 2021 · 1 comment
suyuzhang commented Feb 18, 2021

Hi,

I tried to convert the lite-transformer model to ONNX, but I ran into a lot of problems during the process and cannot get past the errors. Has anyone successfully exported this model to ONNX?

Error message:

Traceback (most recent call last):
  File "generate.py", line 202, in <module>
    cli_main()
  File "generate.py", line 198, in cli_main
    main(args)
  File "generate.py", line 110, in main
    torch.onnx.export(model, args=(dummy_1, dummy_3, dummy_2), f='output.onnx', keep_initializers_as_inputs=True, opset_version=9, operator_export_type=torch.onnx.OperatorExportTypes.ONNX_ATEN_FALLBACK)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/__init__.py", line 230, in export
    custom_opsets, enable_onnx_checker, use_external_data_format)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 92, in export
    use_external_data_format=use_external_data_format)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 538, in _export
    fixed_batch_size=fixed_batch_size)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 374, in _model_to_graph
    graph, torch_out = _trace_and_get_graph_from_model(model, args)
  File "/opt/conda/lib/python3.6/site-packages/torch/onnx/utils.py", line 327, in _trace_and_get_graph_from_model
    torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
  File "/opt/conda/lib/python3.6/site-packages/torch/jit/__init__.py", line 135, in _get_trace_graph
    outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 726, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/opt/conda/lib/python3.6/site-packages/torch/jit/_trace.py", line 116, in forward
    self._force_outplace,
  File "/opt/conda/lib/python3.6/site-packages/torch/jit/_trace.py", line 105, in wrapper
    out_vars, _ = _flatten(outs)
RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. Dictionaries and strings are also accepted but their usage is not recommended. But got unsupported type NoneType

Thanks.
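For context, the RuntimeError above is raised because torch.onnx.export traces the model with torch.jit, and the tracer can only flatten tensors, tuples, and lists; fairseq models often return None entries (e.g. unused attention outputs) in their output tuple. A minimal sketch of a common workaround, assuming a hypothetical wrapper class name (OnnxExportWrapper is not part of the repo), is to wrap the model so its forward drops non-tensor outputs before tracing:

```python
import torch
import torch.nn as nn

class OnnxExportWrapper(nn.Module):
    """Hypothetical wrapper: filters None entries out of the wrapped
    model's outputs so the JIT tracer only ever sees tensors/tuples."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, *inputs):
        out = self.model(*inputs)
        if isinstance(out, (tuple, list)):
            # Drop None entries, which torch.jit cannot flatten.
            return tuple(o for o in out if o is not None)
        return out
```

One would then pass OnnxExportWrapper(model) instead of model to torch.onnx.export. This only addresses the NoneType flattening error; other tracing issues (data-dependent control flow, unsupported ops) may still remain.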

@Michaelvll
Collaborator

Thank you for asking! We do not currently support converting the model to ONNX format. We would appreciate it if anyone would like to contribute that. ;)
