[MLIR][TORCH] Onnx.MaxPool fails on dynamic sizes #3135

Open
IanWood1 opened this issue Apr 10, 2024 · 0 comments


IanWood1 commented Apr 10, 2024

PR #3133 (Add support for MaxPool returning indices) currently works for static sizes but fails tests when sizes are dynamic. Here are the tests from xfail_sets.py that should pass after the changes (a rough sketch of how the dynamic and static variants are annotated follows the list):

# Failure - onnx_lowering: onnx.MaxPool
"MaxPool2dWithIndicesAllNegativeValuesModule_basic",
"MaxPool2dWithIndicesNonDefaultPaddingModule_basic",
"MaxPool2dWithIndicesStaticModule_basic",

MaxPool2dWithIndicesStaticModule_basic passes since its sizes are static, while the other two fail. However, if static shapes are supplied for those two tests, they pass as well. The dynamic-size tests fail e2e with the message "mismatched size for broadcast", which originates from the runtime assert in torch_to_linalg::createElementwiseLinalgGeneric:

      // for exact equality with the running result size.
      // This is the check which protects against the undefined behavior of
      // the generated linalg op in the case of iterating two operands with
      // dimensions sizes that are expected to match.
      if (!elideDynamicBroadcastCheck) {
        auto equalToRunning =
            b.create<arith::CmpIOp>(loc, arith::CmpIPredicate::eq,
                                    resultShape[resultDim], currentDimSize);
        b.create<cf::AssertOp>(loc, equalToRunning,
                               "mismatched size for broadcast");
      }

This assert is omitted when the shapes are known at compile time.
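
For reference, the op itself always produces matching shapes for its two results in eager PyTorch, so the assert firing suggests the dynamically computed result sizes in the lowering disagree, rather than a genuine shape mismatch in the op's semantics. A minimal eager-mode check (plain PyTorch, outside the e2e harness):

import torch

# Reference behavior of the op being lowered: values and indices always share
# a shape. With static annotations the compiler can prove this; with dynamic
# sizes the lowering currently trips the runtime "mismatched size for
# broadcast" assert shown above.
x = torch.randn(2, 4, 16, 16)
values, indices = torch.ops.aten.max_pool2d_with_indices(
    x, kernel_size=[2, 2], stride=[2, 2], padding=[0, 0], dilation=[1, 1])
assert values.shape == indices.shape == (2, 4, 8, 8)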
