
Pruning ONNX inputs #717

Open
blester125 opened this issue Aug 10, 2020 · 0 comments

When a model has an input tensor, like lengths, that is never used, it often gets stripped out by ONNX (or PyTorch?) during export. For example, when you have a Classifier that uses an LSTM, the lengths are needed, so lengths will appear in the graph inputs. If you are using something like a ConvNet classifier where the lengths are never used, that input will get stripped out. This means that if you then send a lengths tensor at inference time you will get an error.
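A minimal sketch of the symptom, assuming a toy PyTorch model whose forward accepts `lengths` but never uses it (the model, names, and shapes here are made up for illustration; whether the unused input is dropped can depend on the PyTorch/ONNX exporter version):

```python
import torch
import onnxruntime as ort


class ConvClassifier(torch.nn.Module):
    """Toy classifier whose forward accepts `lengths` but never uses it."""

    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv1d(8, 4, kernel_size=3, padding=1)

    def forward(self, tokens, lengths):
        # `lengths` never contributes to the output, so tracing-based
        # export typically drops it from the exported graph.
        return self.conv(tokens).mean(dim=2)


model = ConvClassifier().eval()
tokens = torch.randn(1, 8, 16)
lengths = torch.tensor([16])

torch.onnx.export(
    model,
    (tokens, lengths),
    "classifier.onnx",
    input_names=["tokens", "lengths"],
    output_names=["scores"],
)

session = ort.InferenceSession("classifier.onnx", providers=["CPUExecutionProvider"])
# Usually only "tokens" survives; feeding "lengths" would raise an
# invalid-input-name error from onnxruntime.
print([i.name for i in session.get_inputs()])
```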

We normally decide what to send based on the model.assets file, so we should be able to filter the inputs based on ort.InferenceSession(...).get_inputs() and it should work out? In the ONNX service we might need to use this method to filter too; I'm not sure the ONNX service ever checks the model.assets file.
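A rough sketch of that filtering, assuming we build a dict of candidate feeds first (the `prune_feed` helper, the feed names, and the model file here are hypothetical):

```python
import numpy as np
import onnxruntime as ort


def prune_feed(session: ort.InferenceSession, feed: dict) -> dict:
    """Drop feed entries for inputs the exported graph does not declare."""
    valid = {i.name for i in session.get_inputs()}
    return {name: value for name, value in feed.items() if name in valid}


session = ort.InferenceSession("classifier.onnx", providers=["CPUExecutionProvider"])
feed = {
    "tokens": np.random.randn(1, 8, 16).astype(np.float32),
    "lengths": np.array([16], dtype=np.int64),
}
# If "lengths" was stripped at export time it gets filtered out here,
# instead of triggering an invalid-input-name error from onnxruntime.
outputs = session.run(None, prune_feed(session, feed))
```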
