
Model not compatible with TorchScript conversion (via torch.jit.script) #15

Open
brunoabbate opened this issue Sep 30, 2021 · 4 comments


@brunoabbate
It is not possible to convert the model to TorchScript using torch.jit.script. In particular, scripting fails because of the `...` in the type hint on this line:

def _setup_activation(self, input_shape: Tuple[float, ...]) -> None:

Even after changing the type hint to work around this, the conversion still fails because the attribute activation is initialized as None and later assigned a Tensor.
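For reference, TorchScript requires an attribute that starts as None and is later assigned a Tensor to be declared as Optional[Tensor], and it does not support variadic Tuple[float, ...] hints. A minimal sketch of both fixes (hypothetical module, not the repo's actual code):

```python
from typing import List, Optional

import torch
from torch import nn


class Block(nn.Module):
    # Declare the attribute type up front so TorchScript does not
    # infer it as NoneType from the initial assignment.
    activation: Optional[torch.Tensor]

    def __init__(self):
        super().__init__()
        self.activation = None  # filled with a Tensor on first forward

    # TorchScript cannot compile Tuple[float, ...]; List[int] works.
    def _setup_activation(self, input_shape: List[int]) -> None:
        self.activation = torch.zeros(input_shape)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.activation is None:
            self._setup_activation(x.shape)
        act = self.activation
        assert act is not None  # refine Optional[Tensor] -> Tensor
        return x + act


scripted = torch.jit.script(Block())  # compiles without error
```

The local `act` with an assert is the usual way to let the compiler narrow Optional[Tensor] to Tensor before use.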

@Atze00
Owner

Atze00 commented Sep 30, 2021

Unfortunately, I have no plans to support TorchScript at the moment. I'm also not convinced that's the only problem; for example, the computation of "same" padding could raise an error. If so, I suspect the models would also need to be retrained from scratch in PyTorch.

@SMHendryx

SMHendryx commented Sep 30, 2021

The type-hint issue was easy to work around, but a bigger problem is the use of einops, which appears to be incompatible with TorchScript:

from torch import nn
import torch
from einops import rearrange
class Foo(nn.Module):
    def __init__(self):
        super(Foo, self).__init__()
    def forward(self, x):
        return rearrange(x, 'a b -> b a')
torch.jit.script(Foo())
...
NotSupportedError: Compiled functions can't take variable number of arguments or use keyword-only arguments with defaults:
  File "/Users/sean/standard/.env/lib/python3.6/site-packages/einops/einops.py", line 393
def rearrange(tensor, pattern: str, **axes_lengths):
                                     ~~~~~~~~~~~~~ <--- HERE

It looks like tracing may be compatible, though it is more restrictive; see the einops issue arogozhnikov/einops#115:

foo = Foo()
traced_foo = torch.jit.trace(foo, torch.rand(3, 3))
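Another workaround (a sketch, not the repo's code) is to replace the einops call with the equivalent native tensor op, which torch.jit.script handles directly. For the `'a b -> b a'` pattern above, that is just a permute:

```python
import torch
from torch import nn


class Foo(nn.Module):
    # Same toy module as above, with einops.rearrange replaced by
    # the equivalent native op so torch.jit.script can compile it.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x.permute(1, 0)  # 'a b -> b a'


scripted = torch.jit.script(Foo())  # compiles without error
```

More complex einops patterns can usually be rewritten the same way with permute, reshape, and unsqueeze.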

@apurva1526

Is there any solution to this problem?

@Subalzero

Subalzero commented Mar 22, 2024

I have a fork that modifies the models so that they can be exported to TorchScript or ONNX:
https://github.com/Subalzero/MoViNet-pytorch
