
[Bug]: Magic constants in boundary tests #24

Open
Agustin-Picard opened this issue Feb 22, 2023 · 0 comments

@Agustin-Picard (Member)

Module

None

Contact Details

agustin-martin.picard@irt-saintexupery.com

Current Behavior

There are constants in the tests for the boundary_based module whose origin is unclear.

For example, the influence_computed_expected variable in the following test:

import numpy as np
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Input
# WeightsBoundaryCalculator is provided by the boundary_based module under test

def test_compute_influence_values():
    # Tiny linear model with hand-picked weights and bias so the influence
    # value can be worked out by hand
    model = Sequential()
    model.add(Input(shape=(3,)))
    model.add(Dense(2, kernel_initializer=tf.constant_initializer([[1, 1, 1], [0, 0, 0]]),
                    bias_initializer=tf.constant_initializer([4.0, 0.0])))

    calculator = WeightsBoundaryCalculator(model)

    # Single training point: all-zero input with label 0 (one-hot over 2 classes)
    inputs_train = tf.zeros((1, 3))
    targets_train = tf.one_hot(tf.zeros((1,), dtype=tf.int32), 2)
    train_set = tf.data.Dataset.from_tensor_slices((inputs_train, targets_train)).batch(1)

    influence_computed_score = calculator._compute_influence_values(train_set)

    # modify the bias term to get equal logits
    influence_computed_expected = tf.convert_to_tensor([[-np.sqrt(2.0) * 2.0]], dtype=tf.float32)

    assert tf.reduce_max(tf.abs(influence_computed_expected - influence_computed_score)) < 1E-6

The same problem appears in the test for the other boundary method.

Expected Behavior

We expect tests to be clear in order to maximize long-term maintainability. In cases like this one, that means explaining how the expected value is calculated.
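For instance, here is a sketch of how the constant could be derived from named quantities instead of being hard-coded. It assumes (to be confirmed) that the score is the negated minimal L2 perturbation of the bias that puts the all-zero training point on the decision boundary, which is what the in-test comment about modifying the bias suggests:

import numpy as np
import tensorflow as tf

# Assumption (to be confirmed): the influence score is minus the norm of the
# smallest bias perturbation that makes the two logits equal.
bias = np.array([4.0, 0.0])           # bias of the Dense layer in the test
logits = bias                         # the input is all zeros, so logits == bias
logit_gap = logits[0] - logits[1]     # 4.0
# The shortest move to the boundary shifts the two biases towards each other
# by gap / 2 each: delta = (-2.0, +2.0), whose L2 norm is gap / sqrt(2).
min_bias_perturbation = logit_gap / np.sqrt(2.0)      # 2 * sqrt(2)
influence_computed_expected = tf.convert_to_tensor(
    [[-min_bias_perturbation]], dtype=tf.float32)     # equals -np.sqrt(2.0) * 2.0

Even if this is not the intended derivation, naming the intermediate quantities (or adding an equivalent comment) would remove the magic constant.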

Version

v0.1.0

Environment

- OS:
- Python version:
- Tensorflow version:
- Packages used version:

Relevant log output

No response

To Reproduce

N/A

@Agustin-Picard added the bug label on Feb 22, 2023