
Inconsistent batching for DiscretizedIntegratedGradients attributions #113

Open
gsarti opened this issue Dec 1, 2021 · 1 comment · May be fixed by #114
Labels
bug Something isn't working help wanted Extra attention is needed

Comments

@gsarti
Member

gsarti commented Dec 1, 2021

🐛 Bug Report

Despite the fix making batched attribution results consistent with individual attribution (see #110), the DiscretizedIntegratedGradients method still produces different results when attributing a batch of examples versus a single example from that batch.

🔬 How To Reproduce

  1. Instantiate an AttributionModel with the discretized_integrated_gradients method.
  2. Perform an attribution for a batch of examples.
  3. Perform an attribution for a single example present in the previous batch.
  4. Compare the attributions obtained in the two cases.

Code sample

import inseq

model = inseq.load_model("Helsinki-NLP/opus-mt-en-de", "discretized_integrated_gradients")

out_multi = model.attribute(
    [
        "This aspect is very important",
        "Why does it work after the first?",
        "This thing smells",
        "Colorless green ideas sleep furiously"
    ],
    n_steps=20,
    return_convergence_delta=True,
)

out_single = model.attribute(
    [ "Why does it work after the first?" ],
    n_steps=20,
    return_convergence_delta=True,
)

assert out_single.attributions == out_multi[1].attributions # raises AssertionError
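As an aside, exact equality between floating-point attribution scores is brittle even once batching is fixed; a tolerance-based check along these lines (a plain-Python stand-in for torch.allclose, not an inseq API) is more robust for verifying the fix:

```python
# Hypothetical tolerance-based comparison of attribution scores
# (plain-Python stand-in for torch.allclose; not part of inseq).

def allclose(a, b, rtol=1e-5, atol=1e-8):
    # Elementwise |a - b| <= atol + rtol * |b|, mirroring torch.allclose.
    return all(abs(x - y) <= atol + rtol * abs(y) for x, y in zip(a, b))

scores_single = [0.1200001, -0.34, 0.56]
scores_batched = [0.12, -0.34, 0.5600002]
ok = allclose(scores_single, scores_batched)  # True: differences are within tolerance
```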

Environment

  • OS: 20.04
  • Python version: 3.8

📈 Expected behavior

Same as #110: single-example and batched attributions should be identical.

📎 Additional context

The problem is most likely due to faulty scaling of the gradients in the _attribute method of the DiscretizedIntegratedGradients class.
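To make the suspected failure mode concrete, the sketch below (hypothetical plain Python, not inseq's actual implementation) shows how reusing one example's step scaling for the whole batch yields correct attributions only for that example, matching the symptom that single-example and batched results diverge:

```python
# Hypothetical illustration of the suspected bug, NOT inseq code:
# gradients along the interpolation path must be scaled by each
# example's own step sizes; reusing one example's steps for the
# whole batch only produces correct results for that example.

def scale_per_example(grads, steps):
    # Correct: pair each example's gradients with its own step sizes.
    return [[g * s for g, s in zip(gs, ss)] for gs, ss in zip(grads, steps)]

def scale_batchwise(grads, steps):
    # Faulty variant: every example reuses the first example's steps.
    shared = steps[0]
    return [[g * s for g, s in zip(gs, shared)] for gs in grads]

grads = [[1.0, 2.0], [3.0, 5.0]]      # two examples, two path steps each
steps = [[0.5, 0.5], [0.25, 0.25]]    # per-example step sizes differ

per_ex = scale_per_example(grads, steps)   # [[0.5, 1.0], [0.75, 1.25]]
batched = scale_batchwise(grads, steps)    # [[0.5, 1.0], [1.5, 2.5]]
# Only the first example's attributions agree across the two modes.
```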

@gsarti gsarti added bug Something isn't working help wanted Extra attention is needed labels Dec 1, 2021
@gsarti gsarti added this to the v1.0 milestone Dec 1, 2021
@gsarti gsarti linked a pull request Dec 2, 2021 that will close this issue
@gsarti gsarti added the good first issue Good for newcomers label Dec 13, 2022
@gsarti
Member Author

gsarti commented Dec 14, 2022

Hi @soumyasanyal, FYI our library supports your method Discretized IG for feature attribution, but at the moment we are experiencing some issues with consistency across single-example and batched attribution (i.e. there is some issue with the creation of orthogonal approximation steps for a batch, see also #114 for additional info). It would be great if you could have a look!

@gsarti gsarti removed this from the Demo Paper Release milestone May 8, 2023
@gsarti gsarti removed the good first issue Good for newcomers label Jul 20, 2023