Core: GradOutHooks #141

Open
chr5tphr opened this issue Jun 3, 2022 · 0 comments
Labels
core Feature/bug concerning core functionality enhancement New feature or request

Comments

@chr5tphr
Owner

chr5tphr commented Jun 3, 2022

I am currently working on GradOutHook, which differs from the current zennit.core.Hook in that it only modifies the module's gradient output instead of overwriting the module's full gradient. For Composites using zennit.core.Hook, only a single Hook can be attached at a time, because each one overwrites the module's full gradient. A GradOutHook, in contrast, can modify the output gradient multiple times and can be used together with zennit.core.Hook, which makes it possible to apply multiple Composites at once. Another way to enable multiple hooks would be to let the module_map function of Composites return a tuple of Hooks to be applied.
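To illustrate the difference in plain Python (this is a hypothetical sketch, not Zennit's actual API): hooks that overwrite the full gradient cannot stack, because the last one wins, whereas GradOutHook-style modifiers each transform the incoming gradient output, so their effects compose. The `compose_grad_out_hooks` helper and the example modifiers below are assumptions for illustration only.

```python
def compose_grad_out_hooks(grad_output, modifiers):
    """Apply each modifier in turn to the gradient output.

    Because every modifier receives the result of the previous one,
    multiple hooks compose instead of overwriting each other.
    """
    for modify in modifiers:
        grad_output = [modify(g) for g in grad_output]
    return grad_output


# two illustrative modifiers; order matters, and both take effect
halve = lambda g: 0.5 * g
shift = lambda g: g + 1.0

print(compose_grad_out_hooks([2.0, 4.0], [halve, shift]))
# [2.0, 3.0]
```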

The main use case for this is to mask or re-weight neurons, primarily to support LRP for GNNs. Another use case is to mask certain neurons in order to compute LRP for a subset of features/concepts.
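The feature-subset use case can be sketched as zeroing the gradient output everywhere outside the neurons of interest, so that only the selected features propagate relevance. The function below is a minimal stdlib-only illustration of that idea, not Zennit code.

```python
def mask_grad_output(grad_output, keep_indices):
    """Zero the gradient output at every position not in keep_indices.

    Only the kept neurons continue to propagate relevance backwards,
    restricting the attribution to a subset of features/concepts.
    """
    return [g if i in keep_indices else 0.0
            for i, g in enumerate(grad_output)]


# keep only neuron 1; relevance at neurons 0 and 2 is suppressed
print(mask_grad_output([0.3, 0.5, 0.2], {1}))
# [0.0, 0.5, 0.0]
```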

This will somewhat change the Hook inheritance hierarchy: a HookBase will be added to specify the interface required of all Hooks. I am also considering adding a Mask rule to zennit/rule.py, which takes a function or a tensor to mask the gradient output and can be used without subclassing the planned GradOutHook.
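The "function or tensor" dispatch of the proposed Mask rule could look roughly like the following. This is a speculative sketch under assumed names (the `Mask` class and its behavior here are not Zennit's final API); tensors are stood in for by plain lists to keep the example self-contained.

```python
class Mask:
    """Sketch of a rule that masks the gradient output.

    Accepts either a callable (applied to the whole gradient output)
    or a tensor-like mask (multiplied elementwise). Hypothetical API.
    """

    def __init__(self, mask):
        if callable(mask):
            self._modify = mask
        else:
            # tensor-like mask: elementwise multiplication
            self._modify = lambda grad: [g * m for g, m in zip(grad, mask)]

    def __call__(self, grad_output):
        return self._modify(grad_output)


# tensor-like mask: keep the first and third neuron
rule = Mask([1.0, 0.0, 1.0])
print(rule([0.2, 0.5, 0.3]))
# [0.2, 0.0, 0.3]

# callable mask: clip negative gradient entries to zero
rule = Mask(lambda grad: [max(g, 0.0) for g in grad])
print(rule([-0.1, 0.4]))
# [0.0, 0.4]
```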

@chr5tphr chr5tphr added enhancement New feature or request core Feature/bug concerning core functionality labels Aug 10, 2023