
[onert] Support multi-batch input inference for single batch model #12859

Open · hseok-oh opened this issue Apr 11, 2024 · 2 comments

hseok-oh (Contributor) commented Apr 11, 2024

Currently we support multi-batch input inference for a single-batch model in the shape inference routine. This feature is used for trix backend multi-batch inference and for training input.

However, shape inference was originally designed for a fixed batch size with varying width and height. This produces unexpected shape inference results for some operators (e.g. Reshape).
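
For illustration, here is a minimal sketch (not onert code; the shapes and function name are hypothetical) of why Reshape goes wrong. A model exported with batch 1 carries a fixed target shape, and naive shape inference that takes that target verbatim cannot satisfy the element count once only the batch dimension of the input grows:

```python
import math

def naive_reshape_infer(input_shape, new_shape):
    # Single-batch-style inference: take the Reshape target verbatim
    # from the model file and only check the element count.
    if math.prod(input_shape) != math.prod(new_shape):
        raise ValueError(
            f"element count mismatch: {math.prod(input_shape)} vs "
            f"{math.prod(new_shape)}")
    return new_shape

# Batch-1 model: [1, 224, 224, 3] -> [1, 150528] is consistent.
print(naive_reshape_infer([1, 224, 224, 3], [1, 150528]))

# Batch-4 input with the same fixed target fails: 4*224*224*3 = 602112
# elements cannot be reshaped into [1, 150528].
print(naive_reshape_infer([4, 224, 224, 3], [1, 150528]))  # raises
```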

To support multi-batch input for a single-batch model, we need to either revise shape inference or introduce a new feature.
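
Purely as a sketch of one possible direction (this is hypothetical and not necessarily what the runtime or #12860 implements), shape inference could detect that only the batch dimension changed and scale the Reshape target's batch dimension accordingly, assuming dimension 0 is the batch axis:

```python
import math

def batch_aware_reshape_infer(input_shape, new_shape, model_batch=1):
    # Assume dim 0 is the batch axis and the model was built with
    # model_batch; scale the target's batch dim to the actual batch.
    actual_batch = input_shape[0]
    if actual_batch % model_batch != 0:
        raise ValueError("actual batch must be a multiple of model batch")
    scale = actual_batch // model_batch
    scaled = [new_shape[0] * scale] + list(new_shape[1:])
    if math.prod(input_shape) != math.prod(scaled):
        raise ValueError("shape mismatch even after batch scaling")
    return scaled

# [4, 224, 224, 3] with a batch-1 target of [1, 150528] -> [4, 150528].
print(batch_aware_reshape_infer([4, 224, 224, 3], [1, 150528]))
```

This only works when the batch axis is known and the rest of the target shape is batch-independent, which is part of why a general revision of shape inference (rather than per-operator patches) is being considered here.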

hseok-oh added the area/onert (ONE runtime) and type/issue (There is something strange) labels on Apr 11, 2024
hseok-oh self-assigned this on Apr 11, 2024
An earlier comment from hseok-oh was marked as outdated.

hseok-oh (Contributor, Author) commented Apr 11, 2024

Workaround for the Reshape operator: #12860
