
Error in the implementation of the LRP Alpha1_Beta0_Rule #1234

Open
romain-xu-darme opened this issue Jan 19, 2024 · 0 comments

Comments

❓ Questions and Help

I am currently attempting to use LRP to propagate relevance through a ResNet50 and I noticed that the implementation of the $LRP_{\alpha,\beta}$ rule for $\alpha=1$ and $\beta=0$ seems incorrect.
Indeed, if I look at the formula given in https://iphome.hhi.de/samek/pdf/MonXAI19.pdf, the rule is $$R_j=\sum_k \dfrac{(a_jw_{jk})^+}{\sum_j (a_jw_{jk})^+}R_k$$
However, by removing only the negative weights in the implementation of the class Alpha1_Beta0_Rule, the rule that Captum actually computes appears to be
$$R_j=\sum_k \dfrac{a_j(w_{jk})^+}{\sum_j a_j(w_{jk})^+}R_k$$
In practice, and similarly to the implementation proposed here, I believe that both the activations AND the weights should be split into their positive and negative parts in order to compute the value $(a_jw_{jk})^+$ as $$(a_jw_{jk})^+ = a_j^+w_{jk}^+ + a_j^-w_{jk}^-$$
However, I'm not sure that the current Captum class PropagationRule really allows this sort of shenanigans, as it relies mostly on forward and backward hooks.
Anyway, I would love to have the opinion of the developers on this subject. Have a nice day :)
