andrea-gasparini/nlp-word-sense-disambiguation-wic-data

WSD for Word-in-Context disambiguation

In this project we make use of Word Sense Disambiguation (Navigli, 2009) to tackle the Word-in-Context (WiC) disambiguation task, proposing two BERT-based models: the first follows a feature-based approach, while the second follows a fine-tuning approach in which we re-implement GlossBERT (Huang et al., 2019).
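As a rough illustration of the fine-tuning approach, GlossBERT casts WSD as binary classification over context-gloss pairs: the context sentence is paired with the gloss of each candidate sense, and a classifier scores whether the pair matches. The sketch below (plain Python, with a hypothetical two-sense inventory for "bank"; not the project's actual code) shows the weakly supervised pair construction, where the target word is quoted in the context and prepended to each gloss.

```python
def build_context_gloss_pairs(context, target_word, glosses):
    """Build one (context, gloss) pair per candidate sense.

    Weak supervision signal in the style of GlossBERT: the target word
    is quoted in the context sentence and prepended to each gloss.
    """
    # Mark only the first occurrence of the target word in the context
    marked = context.replace(target_word, f'"{target_word}"', 1)
    return [(marked, f"{target_word} : {gloss}") for gloss in glosses]


# Hypothetical sense inventory for the ambiguous word "bank"
pairs = build_context_gloss_pairs(
    "He sat on the bank of the river.",
    "bank",
    [
        "a financial institution that accepts deposits",
        "sloping land beside a body of water",
    ],
)
for ctx, gloss in pairs:
    print(ctx, "||", gloss)
```

Each pair would then be fed to BERT as a sentence pair ([CLS] context [SEP] gloss [SEP]) and scored by a binary classification head; the candidate sense with the highest score is selected.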

For further information, you can read the detailed report or take a look at the presentation slides (pages 19-24).

This project has been developed during the A.Y. 2020-2021 for the Natural Language Processing course @ Sapienza University of Rome.

Checkpoints

Related projects

  • Word-in-Context disambiguation as a binary classification task, experimenting with a word-level approach (MLP + ReLU) and a sequence encoding one (LSTMs), on top of GloVe embeddings
  • Aspect-Based Sentiment Analysis (ABSA) using different setups based on 2 stacked BiLSTMs and Attention layers; leveraging PoS, GloVe and BERT (frozen) embeddings

Authors