Currently, the shape inference routine supports multi-batch input inference for single-batch models. This feature is used for multi-batch inference with the trix backend and for training inputs.
However, shape inference was originally designed for a fixed batch size with varying width and height. As a result, it produces unexpected shape inference results for some operators (e.g. reshape).
To properly support multi-batch input for single-batch models, we need to either revise the shape inference routine or introduce a new mechanism.
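To illustrate the reshape problem described above, here is a minimal sketch (using numpy, with a hypothetical frozen reshape target) of how a target shape baked in at export time for batch size 1 breaks when a larger batch is fed in, and how a batch-aware target would behave:

```python
import numpy as np

# Hypothetical: a model exported with batch size 1 bakes the reshape
# target into the graph, e.g. (1, 784) for a 1x28x28 input.
fixed_target = (1, 784)

def reshape_op(x, target):
    # Stand-in for a graph reshape operator with a static target shape.
    return x.reshape(target)

# Batch 1 works as exported: (1, 28, 28) -> (1, 784).
single = np.zeros((1, 28, 28))
print(reshape_op(single, fixed_target).shape)  # (1, 784)

# Batch 4 input: batch-unaware shape inference keeps the fixed target,
# and the reshape fails because 4*28*28 != 1*784.
multi = np.zeros((4, 28, 28))
try:
    reshape_op(multi, fixed_target)
except ValueError as e:
    print("reshape failed:", e)

# A batch-aware revision would rewrite the target to keep the batch
# dimension free, e.g. (-1, 784), preserving the per-sample layout.
print(reshape_op(multi, (-1, 784)).shape)  # (4, 784)
```

This is only a sketch of the failure mode, not the actual shape inference code: the point is that any operator whose output shape is stored as absolute values (rather than relative to the batch dimension) will infer a wrong or invalid shape once the input batch size differs from the exported one.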