
Custom rules for vision transformer #184

Open
ascdqz opened this issue May 8, 2023 · 1 comment
Labels
model compatibility Compatibility for new or variations of existing models

Comments

@ascdqz

ascdqz commented May 8, 2023

Hello,
I'm trying to use this method on a vision transformer (model = torchvision.models.vit_b_16(); its first several layers are shown in the image below). After reading the documentation, I think I need to write and use new rules: the model contains layer types that don't have an existing rule class, and its submodules are fairly complex. I also read the documentation on writing a custom rule, but I can't figure out which rules to apply to which layers of this ViT model (I want attributions like the ones the original EpsilonPlusFlat produces). When I run the code below, I get the error shown below. Do you have any recommendations on how to run the LRP method on this model?
Thank you!

from zennit.attribution import Gradient
from zennit.composites import EpsilonPlusFlat

composite = EpsilonPlusFlat()
with Gradient(model=model, composite=composite) as attributor:
    output, attribution = attributor(data, target)

(screenshots: the first layers of the ViT model, and the resulting error traceback)

@chr5tphr
Owner

Hey @ascdqz ,

we have planned support for transformers/attention, and the new built-in torch.nn.MultiheadAttention is great for that, since we then do not need to explicitly support external implementations.
Have a look at this previous comment, where I mention this work by @tschnake et al., which is the implementation approach we will be taking.

Unfortunately, my schedule is super full, so I will probably not get to work on this until maybe late summer.
If you feel up to it, you can take a shot at implementing the missing rules yourself; if you also have the time to write tests and documentation, we would be happy for you to contribute to Zennit.

But do not feel pressured, I will eventually try to find someone to do it, or do it myself once my schedule allows me to.
In the meantime, feel free to ask any questions here with respect to the implementation, and I would be happy to assist!
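Until attention-specific rules exist, the core idea behind such a rule can be illustrated on a plain linear layer. Below is a minimal NumPy sketch of the LRP epsilon rule (relevance R_j = sum_k a_j w_jk / (z_k + eps) * R_k); note this is an illustrative standalone function, not Zennit's actual hook-based implementation, and the function name and stabilizer choice are assumptions for the example:

```python
import numpy as np

def lrp_epsilon(a, W, R_out, eps=1e-6):
    """Epsilon-rule relevance propagation through a linear layer z = a @ W.

    a      : input activations, shape (n_in,)
    W      : weight matrix, shape (n_in, n_out)
    R_out  : relevance of the layer's outputs, shape (n_out,)
    returns: relevance of the layer's inputs, shape (n_in,)
    """
    z = a @ W
    z = z + eps * np.sign(z)        # stabilize against small denominators
    s = R_out / z                   # element-wise relevance-to-output ratio
    return a * (s @ W.T)            # redistribute relevance to the inputs

# the epsilon rule approximately conserves total relevance for small eps
a = np.array([1.0, 2.0])
W = np.array([[0.5, -0.3], [0.2, 0.8]])
R_out = np.array([1.0, 1.0])
R_in = lrp_epsilon(a, W, R_out)
```

For small eps, the sum of the input relevances R_in approximately equals the sum of R_out, which is the conservation property that rule implementations are usually checked against.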

Repository owner deleted a comment from tschnake May 10, 2023
@chr5tphr chr5tphr added the model compatibility Compatibility for new or variations of existing models label Aug 11, 2023