
# xai_similarity_transformers

The implementation and examples for the paper [Explaining Text Similarity in Transformer Models](https://arxiv.org/abs/2405.06604), accepted to NAACL 2024.

## Usage instructions

Python version: 3.11

### Virtual Environment

Create a Python 3.11 virtual environment, install the packages from the `requirements.txt` file, and start a Jupyter server:

```shell
python3 -m venv venv
source venv/bin/activate
python3 -m pip install -r requirements.txt
jupyter notebook
```

## Models tested with the code

As described in the paper, the code currently supports the following models:

- BERT
- mBERT (multilingual BERT)
- SBERT
- GPT-Neo
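The models above could be loaded through Hugging Face `transformers`, for example as sketched below. Note that the checkpoint IDs are common public defaults assumed for illustration; the repository does not pin specific checkpoints here:

```python
# Hypothetical checkpoint mapping -- these Hugging Face model IDs are
# common public defaults, not names pinned by this repository.
MODEL_CHECKPOINTS = {
    "BERT": "bert-base-uncased",
    "mBERT": "bert-base-multilingual-cased",
    "SBERT": "sentence-transformers/all-MiniLM-L6-v2",
    "GPT-Neo": "EleutherAI/gpt-neo-125m",
}


def load_model(name: str):
    """Load a (tokenizer, model) pair for one of the supported models."""
    from transformers import AutoModel, AutoTokenizer

    checkpoint = MODEL_CHECKPOINTS[name]
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModel.from_pretrained(checkpoint)
    return tokenizer, model
```

For the actual experiments, the modified model classes under `src/models/` would be used instead of the stock `AutoModel` classes.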

## LRP

The BiLRP implementation is TODO. The lines relevant to these code modifications are marked with `# xai_impl` in both the `src/models/xai_bert.py` and `src/models/xai_gpt_neo.py` modules. The modifications are made in the attention heads, the LayerNorm layers, and the GELU activation function (the GELU change applies only to BERT-based models).
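For intuition, the kind of rule such modifications implement can be sketched for a single linear layer. The epsilon-LRP rule below is a generic illustration in NumPy, not the repository's actual `# xai_impl` code:

```python
import numpy as np


def lrp_epsilon_linear(x, W, b, R_out, eps=1e-6):
    """Epsilon-LRP rule for a linear layer y = x @ W + b.

    Redistributes the output relevance R_out to the inputs in
    proportion to each input's contribution z_ij = x_i * W_ij.
    Illustrative sketch only, not the repository's implementation.
    """
    z = x @ W + b                        # forward pre-activations
    s = R_out / (z + eps * np.sign(z))   # stabilised relevance ratio
    return x * (s @ W.T)                 # relevance per input feature


x = np.array([1.0, 2.0])
W = np.array([[1.0, -1.0], [0.5, 2.0]])
b = np.zeros(2)
R = lrp_epsilon_linear(x, W, b, R_out=x @ W + b)
# Relevance is (approximately) conserved: R.sum() matches the total
# output relevance, here 5.0.
```

In the same spirit, the attention weights and LayerNorm statistics can be treated as constants during the backward pass so that gradient-times-input yields the relevance scores; the `# xai_impl` markers in the two modules show where the repository makes such changes.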
