Add support for MaxPool returning indices. #3133 currently works for static sizes but fails tests when sizes are dynamic. Here are the tests in xfail_sets.py that should pass after the changes:

MaxPool2dWithIndicesStaticModule_basic passes since its sizes are static, while the others fail. However, if shapes are supplied for the other two tests, those tests pass as well. The dynamic-size tests fail e2e with the message `mismatched size for broadcast`, which originates from the runtime assert in `torch_to_linalg::createElementwiseLinalgGeneric`:
```cpp
// for exact equality with the running result size.
// This is the check which protects against the undefined behavior of
// the generated linalg op in the case of iterating two operands with
// dimension sizes that are expected to match.
if (!elideDynamicBroadcastCheck) {
  auto equalToRunning =
      b.create<arith::CmpIOp>(loc, arith::CmpIPredicate::eq,
                              resultShape[resultDim], currentDimSize);
  b.create<cf::AssertOp>(loc, equalToRunning,
                         "mismatched size for broadcast");
}
```
This assert is omitted when the shapes are known at compile time.
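To illustrate the static-vs-dynamic distinction, here is a minimal Python sketch (hypothetical helper names, not torch-mlir API) of the behavior described above: when both sizes are statically known, the check can be resolved and elided at compile time; when either is dynamic, a runtime assert equivalent to the `cf::AssertOp` above must guard the generated linalg op.

```python
DYNAMIC = -1  # stand-in for a dynamic (unknown-at-compile-time) dimension size

def broadcast_dim_check(result_dim_size, current_dim_size):
    """Hypothetical model of the lowering's dimension-size check.

    Returns "elided" when both sizes are static (the compiler has already
    verified equality, mirroring elideDynamicBroadcastCheck), otherwise
    performs the runtime check and raises on mismatch.
    """
    if result_dim_size != DYNAMIC and current_dim_size != DYNAMIC:
        # Static shapes: mismatches are a compile-time error, no assert emitted.
        if result_dim_size != current_dim_size:
            raise ValueError("static shape mismatch caught at compile time")
        return "elided"
    # Dynamic shapes: emit the runtime assert seen in the failing e2e tests.
    if (result_dim_size != DYNAMIC and current_dim_size != DYNAMIC
            and result_dim_size != current_dim_size):
        raise AssertionError("mismatched size for broadcast")
    return "runtime-checked"
```

In the dynamic case the helper cannot prove anything at "compile time", so the check survives into the generated code; this mirrors why the static test passes while the dynamic ones trip the assert when the running sizes disagree.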