Checkpoint export to ONNX fails #3

Open · qiudao767 opened this issue Aug 12, 2021 · 4 comments

@qiudao767

I tried to export the checkpoint to ONNX, but it fails:
lit_model = LitResNetTransformer.load_from_checkpoint("artifacts/model_basic_2.ckpt")
# lit_model.freeze()
lit_model.eval()
x = torch.randn((1, 16))
lit_model.to_onnx("xxx.onnx", x)

Traceback (most recent call last):
  File "C:/Users/Administrator/PycharmProjects/image-to-latex/model_test.py", line 44, in <module>
    load_model()
  File "C:/Users/Administrator/PycharmProjects/image-to-latex/model_test.py", line 21, in load_model
    torch.onnx.export(lit_model, x, "hpocr_torch.onnx", verbose=True, input_names=input_names,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\__init__.py", line 275, in export
    return utils.export(model, args, f, export_params, verbose, training,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 88, in export
    _export(model, args, f, export_params, verbose, training, input_names, output_names,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 689, in _export
    _model_to_graph(model, args, verbose, input_names,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 458, in _model_to_graph
    graph, params, torch_out, module = _create_jit_graph(model, args,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 422, in _create_jit_graph
    graph, torch_out = _trace_and_get_graph_from_model(model, args)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 373, in _trace_and_get_graph_from_model
    torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\jit\_trace.py", line 1160, in _get_trace_graph
    outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\nn\modules\module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\jit\_trace.py", line 127, in forward
    graph, out = torch._C._create_graph_by_tracing(
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\jit\_trace.py", line 118, in wrapper
    outs.append(self.inner(*trace_inputs))
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\nn\modules\module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\nn\modules\module.py", line 1039, in _slow_forward
    result = self.forward(*input, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\pytorch_lightning\core\lightning.py", line 529, in forward
    return super().forward(*args, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\nn\modules\module.py", line 201, in _forward_unimplemented
    raise NotImplementedError

@kingyiusuen (Owner)

I don't have much experience with ONNX, but it seems like you just need to add a forward method to LitResNetTransformer:

class LitResNetTransformer(LightningModule):

    def forward(self, x: Tensor):
        return self.model.predict(x)
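
Then the export call would look something like the sketch below. This is untested on my end; the import path and the dummy-input shape are guesses, so adjust them to whatever your preprocessing actually produces:

import torch

# Hypothetical usage sketch -- the import path and the input shape are assumptions.
from image_to_latex.lit_models import LitResNetTransformer

lit_model = LitResNetTransformer.load_from_checkpoint("artifacts/model_basic_2.ckpt")
lit_model.eval()

# Dummy image batch (batch, channels, height, width); the sizes here are
# placeholders, not the shapes the model was trained on.
dummy_input = torch.randn(1, 1, 64, 256)
lit_model.to_onnx("model.onnx", dummy_input, export_params=True)

Note that the dummy input only fixes the traced shapes; if you need variable-sized images you would also have to forward a dynamic_axes argument through to_onnx (it passes extra keyword arguments on to torch.onnx.export).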

@qiudao767 (Author)

Thank you very much. I added a forward method to LitResNetTransformer as you suggested, but it now fails in a different way.

def load_model():
    lit_model = LitResNetTransformer.load_from_checkpoint("artifacts/model_basic_2.ckpt")
    lit_model.freeze()
    # lit_model.eval()
    x = torch.randn((64, 3, 7, 7))
    lit_model.to_onnx("test.onnx", x, export_params=True)

Error info:

C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\nn\functional.py:718: UserWarning: Named tensors and all their associated APIs are an experimental feature and subject to change. Please do not use them for anything important until they are released as stable. (Triggered internally at  ..\c10/core/TensorImpl.h:1156.)
  return torch.max_pool2d(input, kernel_size, stride, padding, dilation, ceil_mode)
C:\Users\Administrator\PycharmProjects\image-to-latex\image_to_latex\models\resnet_transformer.py:102: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if x.shape[1] == 1:
C:\Users\Administrator\PycharmProjects\image-to-latex\image_to_latex\models\positional_encoding.py:45: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  assert x.shape[1] == self.pe.shape[0]  # type: ignore
C:\Users\Administrator\PycharmProjects\image-to-latex\image_to_latex\models\positional_encoding.py:79: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  assert x.shape[2] == self.pe.shape[2]  # type: ignore
C:\Users\Administrator\PycharmProjects\image-to-latex\image_to_latex\models\resnet_transformer.py:157: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if torch.all(has_ended):
C:\Users\Administrator\PycharmProjects\image-to-latex\image_to_latex\models\resnet_transformer.py:163: TracerWarning: Converting a tensor to a Python number might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  j = int(eos_positions[i].item()) + 1
Traceback (most recent call last):
  File "C:/Users/Administrator/PycharmProjects/image-to-latex/model_test.py", line 43, in <module>
    load_model()
  File "C:/Users/Administrator/PycharmProjects/image-to-latex/model_test.py", line 16, in load_model
    lit_model.to_onnx("test.onnx", x, export_params=True)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\autograd\grad_mode.py", line 28, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\pytorch_lightning\core\lightning.py", line 1749, in to_onnx
    torch.onnx.export(self, input_sample, file_path, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\__init__.py", line 275, in export
    return utils.export(model, args, f, export_params, verbose, training,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 88, in export
    _export(model, args, f, export_params, verbose, training, input_names, output_names,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 689, in _export
    _model_to_graph(model, args, verbose, input_names,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 458, in _model_to_graph
    graph, params, torch_out, module = _create_jit_graph(model, args,
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 422, in _create_jit_graph
    graph, torch_out = _trace_and_get_graph_from_model(model, args)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\onnx\utils.py", line 373, in _trace_and_get_graph_from_model
    torch.jit._get_trace_graph(model, args, strict=False, _force_outplace=False, _return_inputs_states=True)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\jit\_trace.py", line 1160, in _get_trace_graph
    outs = ONNXTracedModule(f, strict, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\nn\modules\module.py", line 1051, in _call_impl
    return forward_call(*input, **kwargs)
  File "C:\Users\Administrator\anaconda3\envs\image-to-latex\lib\site-packages\torch\jit\_trace.py", line 127, in forward
    graph, out = torch._C._create_graph_by_tracing(
RuntimeError: 0INTERNAL ASSERT FAILED at "..\\torch\\csrc\\jit\\ir\\alias_analysis.cpp":532, please report a bug to PyTorch. We don't have an op for aten::full but it isn't a special case.  Argument types: int[], bool, NoneType, int, Device, bool, 

@qiudao767 (Author)

Could you give an ONNX export example for this project?

https://pytorch-lightning.readthedocs.io/en/latest/common/production_inference.html

@kingyiusuen (Owner)

Based on the error message, it seems like the problem comes from L157 of models/resnet_transformer.py, since torch.all is used there. My guess is that torch.jit.trace cannot handle Python conditionals that depend on tensor data, so maybe you can try deleting L155-158 and see if it works?
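
For illustration only, here is a minimal, self-contained sketch of the pattern I mean (decode_with_early_exit, decode_fixed_length and logits_fn are made-up names, not the actual code in resnet_transformer.py): a greedy-decode loop with a data-dependent early exit, which tracing freezes into a constant branch, next to a fixed-length variant that traces cleanly.

import torch
from torch import Tensor

def decode_with_early_exit(logits_fn, start: Tensor, max_len: int, eos_index: int) -> Tensor:
    # Stops as soon as every sequence in the batch has emitted <eos>.
    # torch.jit.trace cannot record the `if torch.all(has_ended)` branch; it
    # evaluates it once on the example input and bakes the result into the graph.
    output = start
    has_ended = torch.zeros(start.shape[0], dtype=torch.bool)
    for _ in range(max_len):
        next_token = logits_fn(output).argmax(dim=-1, keepdim=True)  # (B, 1)
        output = torch.cat([output, next_token], dim=1)
        has_ended = has_ended | (next_token.squeeze(1) == eos_index)
        if torch.all(has_ended):  # data-dependent Python branch -> breaks tracing
            break
    return output

def decode_fixed_length(logits_fn, start: Tensor, max_len: int) -> Tensor:
    # Always runs max_len steps: no Python branch on tensor data, so the
    # traced graph stays valid for arbitrary inputs.
    output = start
    for _ in range(max_len):
        next_token = logits_fn(output).argmax(dim=-1, keepdim=True)
        output = torch.cat([output, next_token], dim=1)
    return output

if __name__ == "__main__":
    # Tiny smoke test with random logits over a 10-token vocabulary.
    fake_logits = lambda tokens: torch.randn(tokens.shape[0], 10)
    start = torch.ones(2, 1, dtype=torch.long)
    print(decode_fixed_length(fake_logits, start, max_len=5).shape)  # torch.Size([2, 6])

The fixed-length version always pays for max_len decoding steps, but the traced graph no longer depends on a Python-level check of tensor values.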
