Analyzing patch importance and visualizing attention flow in Vision Transformers using attention scores and attention rollout.

arnavsm/vit-patch-importance
Measuring Patch Importance in ViTs (Vanilla & Attention Rollout)

Developed methods to analyze patch importance in Vision Transformers (ViTs) by leveraging attention scores of the [CLS] token across attention matrices in multi-head self-attention (MHSA) blocks. Visualized the distribution of top-k patch tokens with respect to the [CLS] token, highlighting critical regions contributing to model predictions.
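The [CLS]-token ranking above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: it assumes post-softmax attention of shape `(num_heads, num_tokens, num_tokens)` with token 0 as [CLS], and the function name `cls_topk_patches` is hypothetical.

```python
import numpy as np

def cls_topk_patches(attn, k=5):
    """Rank patch tokens by the [CLS] token's attention in one MHSA block.

    attn: (num_heads, num_tokens, num_tokens) post-softmax attention;
    token 0 is assumed to be [CLS]. Returns the top-k patch indices
    (0-based, [CLS] excluded) and their attention scores.
    """
    head_avg = attn.mean(axis=0)          # fuse heads by averaging
    cls_to_patches = head_avg[0, 1:]      # row 0 = [CLS] query; drop the [CLS] key
    topk = np.argsort(cls_to_patches)[::-1][:k]
    return topk, cls_to_patches[topk]

# Toy example: 3 heads, 1 [CLS] + 196 patch tokens (a 14x14 patch grid).
rng = np.random.default_rng(0)
logits = rng.normal(size=(3, 197, 197))
attn = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)  # row-wise softmax
idx, scores = cls_topk_patches(attn, k=10)
```

For visualization, the returned indices can be reshaped onto the patch grid (e.g. `idx // 14, idx % 14` for a 14x14 layout) to highlight the image regions the [CLS] token attends to most.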

Additionally, implemented Attention Rollout to recursively propagate attention scores across layers, producing interpretable visualizations of information flow in self-attention mechanisms. This approach enhances understanding of attention-based models and their decision-making processes.
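A common way to implement the recursive propagation is the rollout recipe of averaging heads, adding the identity for the residual connection, re-normalizing, and multiplying layer matrices together. The sketch below follows that recipe under the same shape assumptions as above; it is an illustration, not necessarily the repository's exact implementation.

```python
import numpy as np

def attention_rollout(attns):
    """Propagate attention across layers to estimate token-to-token flow.

    attns: list of per-layer attention matrices, each shaped
    (num_heads, num_tokens, num_tokens). Returns a (num_tokens, num_tokens)
    matrix; row 0, columns 1: gives the accumulated flow from each patch
    into the [CLS] token.
    """
    n = attns[0].shape[-1]
    rollout = np.eye(n)
    for attn in attns:
        a = attn.mean(axis=0)                  # fuse heads by averaging
        a = 0.5 * a + 0.5 * np.eye(n)          # account for the residual connection
        a = a / a.sum(axis=-1, keepdims=True)  # re-normalize rows
        rollout = a @ rollout                  # compose with earlier layers
    return rollout

# Toy example: 12 layers, 3 heads, 1 [CLS] + 16 patch tokens.
rng = np.random.default_rng(0)
layers = []
for _ in range(12):
    logits = rng.normal(size=(3, 17, 17))
    layers.append(np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True))
flow = attention_rollout(layers)
cls_flow = flow[0, 1:]  # per-patch importance under rollout
```

Because each layer matrix is row-stochastic, the composed rollout matrix is too, so `cls_flow` can be reshaped onto the patch grid and rendered as a heatmap directly.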
