
Can't export ONNX transformer #74

Open
hyperfraise opened this issue Sep 16, 2022 · 3 comments

Comments

@hyperfraise

Running this command: `python3 tools/deployment/pytorch2onnx.py configs/recognition/swin/swin_tiny_patch244_window877_kinetics400_1k.py swin_tiny_patch244_window877_kinetics400_1k.pth` produces this error:

/opt/conda/lib/python3.8/site-packages/torch/functional.py:478: UserWarning: torch.meshgrid: in an upcoming release, it will be required to pass the indexing argument. (Triggered internally at /opt/pytorch/pytorch/aten/src/ATen/native/TensorShape.cpp:2966.)
  return _VF.meshgrid(tensors, **kwargs)  # type: ignore[attr-defined]
Use load_from_local loader
Traceback (most recent call last):
  File "tools/deployment/pytorch2onnx.py", line 163, in <module>
    pytorch2onnx(
  File "tools/deployment/pytorch2onnx.py", line 67, in pytorch2onnx
    torch.onnx.export(
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 479, in export
    _export(
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 1411, in _export
    graph, params_dict, torch_out = _model_to_graph(
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 1050, in _model_to_graph
    graph, params, torch_out, module = _create_jit_graph(model, args)
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 925, in _create_jit_graph
    graph, torch_out = _trace_and_get_graph_from_model(model, args)
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 833, in _trace_and_get_graph_from_model
    trace_graph, torch_out, inputs_states = torch.jit._get_trace_graph(
  File "/opt/conda/lib/python3.8/site-packages/torch/jit/_trace.py", line 1175, in _get_trace_graph
    outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1186, in _call_impl
    return forward_call(*input, **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/torch/jit/_trace.py", line 127, in forward
    graph, out = torch._C._create_graph_by_tracing(
  File "/opt/conda/lib/python3.8/site-packages/torch/jit/_trace.py", line 118, in wrapper
    outs.append(self.inner(*trace_inputs))
  File "/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1186, in _call_impl
    return forward_call(*input, **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1174, in _slow_forward
    result = self.forward(*input, **kwargs)
TypeError: forward_dummy() got multiple values for argument 'softmax'
@leonid-pishchulin

I was able to resolve this issue by modifying this line to `model.forward = partial(model.forward_dummy)`.
Then I faced another issue, which I resolved by commenting out this line. After that I got a whole bunch of warnings like:

TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!

and finally an error: `Floating point exception (core dumped)`. Not sure what to do at that point.
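The original TypeError can be reproduced in plain Python, without PyTorch or mmaction2. The `forward_dummy` below is a hypothetical stand-in that only mimics the `softmax` keyword from the traceback; the assumption (suggested by the fix above) is that the export script pre-bound `softmax` via `functools.partial`, so the tracer's fully positional call supplies it a second time:

```python
from functools import partial

# Hypothetical stand-in for the recognizer's forward_dummy() in mmaction2;
# only the signature matters for reproducing the error.
def forward_dummy(imgs, softmax=False):
    return imgs

# If the export script pre-binds softmax (as the original line presumably
# did), a positional call that also covers softmax collides with it:
fwd = partial(forward_dummy, softmax=False)
try:
    fwd("imgs_tensor", True)  # the tracer passes every traced input positionally
except TypeError as err:
    print(err)  # forward_dummy() got multiple values for argument 'softmax'

# The fix quoted above drops the pre-bound kwarg, so only the traced
# inputs reach forward_dummy:
model_forward = partial(forward_dummy)
model_forward("imgs_tensor")
```

This explains why removing the extra keyword from the `partial(...)` call makes the tracer's invocation succeed.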

@gigasurgeon

> I was able to resolve this issue by modifying this line as model.forward = partial(model.forward_dummy). Then, I faced another issue that I resolved by commenting out this line. Then, I got the whole bunch of warnings like:
>
> TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
>
> and finally an error: Floating point exception (core dumped). Not sure what to do at that point

I am also getting the same Floating point exception (core dumped) error. Were you able to resolve it somehow?

@gigasurgeon

I was able to successfully convert it to ONNX by using this -> #89 (comment)
