MNT Fix bug: FloatTransformer output dimension was set incorrectly. #1333
Conversation
…n_features_in_ and n_features_out_
955fc5c to b952ae9
Many thanks, @SeanMcCarren! This looks good to me, but it seems the tests are failing for reasons unrelated to your PR (see #1334). After we fix that and the PR passes all checks, I think it should be ready for merging.
    else:
        if X.ndim != self.input_dim_:
            raise ValueError("Dimension of data is inconsistent with previous call")
    if X.ndim == 1:
        X = X.reshape(-1, 1)
    assert X.ndim == 2
An if check that raises an exception with a nice error message is probably preferable. Unless this is a case that can't happen at all?
It really shouldn't happen, but I agree!
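For illustration, here is a self-contained sketch of the suggested pattern. The helper name _ensure_2d and the error message are made up for this sketch and are not fairlearn code:

    import numpy as np

    def _ensure_2d(X):
        # Explicit check with a descriptive error instead of a bare assert,
        # so a user who somehow hits the "impossible" case gets a readable
        # message rather than an opaque AssertionError.
        X = np.asarray(X)
        if X.ndim == 1:
            X = X.reshape(-1, 1)
        if X.ndim != 2:
            raise ValueError(
                f"Expected 1- or 2-dimensional input, got ndim={X.ndim}"
            )
        return X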
Please consider merging and releasing this fix. Thanks!
@fairlearn/fairlearn-maintainers
@@ -103,11 +103,17 @@ def _check(self, X, dtype=None, init=False):
    )
    if init:
        self.input_dim_ = X.ndim
        if self.input_dim_ > 2:
            raise ValueError(
Sorry for the delay in reviewing, but is it possible to trigger this error in a test? Same for the SystemError below?
Likewise sorry for the delay, I missed this comment as well.
I could, but it's quite trivial, don't you think? And I don't see the point: even for regression testing, if this check is somehow deleted, it's clear that everything breaks when ndim > 2.
For the SystemError: I think the only way this could happen is with ndim=0.
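For reference, a minimal sketch of what such a test might look like. Everything here is an assumption: the import path (FloatTransformer is internal to fairlearn and may live elsewhere), that the default constructor is usable, and that fit runs the _check shown above:

    import numpy as np
    import pytest

    from fairlearn.adversarial._preprocessor import FloatTransformer  # assumed path

    def test_rejects_ndim_greater_than_two():
        X = np.zeros((2, 2, 2))  # ndim == 3 should trigger the new ValueError
        with pytest.raises(ValueError):
            FloatTransformer().fit(X)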
Can the code from the linked issue be made into a test that runs in a reasonable time with the existing conda setup?
I am surprised this isn't merged yet; I am completely out of the loop on this. I can confirm that the issue raised here by parasurama will also be fixed by this PR.
Description
Thanks to this elaborately written issue, we found a bug in Adversarial Debiasing. Considering how large, highly parameterized, and not fully test-covered this implementation is, I'm surprised this is the first bug :)
The bug is simple. The FloatTransformer class is intended to cast various types of input to a 2D array. But in the specific case where no transformer is provided as input and the output is not bi/multinomial, we accidentally set the output shape to the input shape.
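A rough, self-contained sketch of the failure mode (toy code, not fairlearn's actual implementation; the attribute names follow the commit message):

    import numpy as np

    class ToyFloatTransformer:
        """Toy stand-in: casts input to a 2-D float array."""

        def fit(self, X):
            X = np.asarray(X, dtype=float)
            self.n_features_in_ = 1 if X.ndim == 1 else X.shape[1]
            # The bug was of this flavor: the output dimension was taken
            # from the input's shape, even though transform() may change
            # the width (e.g. one-hot encoding in the real class). The fix
            # is to derive it from the actual transformed output.
            self.n_features_out_ = self.transform(X).shape[1]
            return self

        def transform(self, X):
            X = np.asarray(X, dtype=float)
            return X.reshape(-1, 1) if X.ndim == 1 else X

For a 1-D input like np.array([1.0, 2.0, 3.0]), transform returns shape (3, 1), so n_features_out_ must be 1; reading it off the raw input's shape is the kind of mistake the buggy path made.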
Tests
Documentation