Word-in-Context disambiguation

Open In Colab

Word-in-Context (WiC) disambiguation as a binary classification task using static word embeddings (i.e. Word2Vec and GloVe) to determine whether words in different contexts have the same meaning.
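A WiC instance pairs two sentences that contain the same target lemma and asks whether it is used with the same sense in both. As a minimal illustration of the static-embedding setup, here is a cosine-similarity baseline over averaged sentence vectors; the embedding values and the threshold below are made-up toy numbers, not the repository's data or method:

```python
import numpy as np

# Toy 3-d vectors standing in for pre-trained Word2Vec/GloVe embeddings
# (hypothetical values for illustration only).
EMBEDDINGS = {
    "bank":  np.array([0.8, 0.1, 0.3]),
    "river": np.array([0.7, 0.2, 0.1]),
    "money": np.array([0.1, 0.9, 0.4]),
    "the":   np.array([0.3, 0.3, 0.3]),
}

def sentence_vector(tokens):
    """Average the static embeddings of the tokens (unknown words are skipped)."""
    vecs = [EMBEDDINGS[t] for t in tokens if t in EMBEDDINGS]
    return np.mean(vecs, axis=0)

def same_meaning(tokens_a, tokens_b, threshold=0.9):
    """Binary prediction: True when the two contexts are similar enough."""
    a, b = sentence_vector(tokens_a), sentence_vector(tokens_b)
    cos = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return cos >= threshold
```

A learned classifier (as proposed below) replaces the fixed threshold with a trained decision function over the same static embeddings.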

Implementation details

We propose a Bi-LSTM architecture with pre-trained word embeddings and test it against a simpler feed-forward neural network. For further insights, read the dedicated report or the presentation slides (pages 2-6).
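To make the architecture concrete, here is a minimal NumPy sketch of a bidirectional LSTM binary classifier. The weights are random and untrained, the dimensions are toy values, and the input is assumed to be the pre-trained embeddings of the concatenated sentence pair; the actual model, hyperparameters, and training procedure are described in the report:

```python
import numpy as np

rng = np.random.default_rng(0)
EMB_DIM, HIDDEN = 8, 4  # toy sizes; the real hyperparameters are in the report

def sigmoid(v):
    return 1 / (1 + np.exp(-v))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; the four gates are slices of a single stacked projection."""
    z = W @ x + U @ h + b
    i = sigmoid(z[:HIDDEN])             # input gate
    f = sigmoid(z[HIDDEN:2 * HIDDEN])   # forget gate
    o = sigmoid(z[2 * HIDDEN:3 * HIDDEN])  # output gate
    g = np.tanh(z[3 * HIDDEN:])         # candidate cell state
    c = f * c + i * g
    return o * np.tanh(c), c

def run_lstm(xs, params):
    """Run one direction over the sequence, returning the final hidden state."""
    h, c = np.zeros(HIDDEN), np.zeros(HIDDEN)
    for x in xs:
        h, c = lstm_step(x, h, c, *params)
    return h

def init_params():
    return (rng.normal(scale=0.1, size=(4 * HIDDEN, EMB_DIM)),
            rng.normal(scale=0.1, size=(4 * HIDDEN, HIDDEN)),
            np.zeros(4 * HIDDEN))

fwd, bwd, w_out = init_params(), init_params(), rng.normal(size=2 * HIDDEN)

def classify(pair_embs):
    """Bi-LSTM encoding (forward pass + reversed pass, concatenated),
    then a sigmoid read-out giving P(same meaning)."""
    h = np.concatenate([run_lstm(pair_embs, fwd),
                        run_lstm(pair_embs[::-1], bwd)])
    return sigmoid(w_out @ h)
```

A trained version would fit the parameters with gradient descent on binary cross-entropy; the simpler feed-forward baseline replaces the two LSTM passes with a plain projection of the averaged embeddings.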

Get the dataset

You may download the original dataset from here.

Test the model

For ready-to-go usage, simply run the notebook on Colab. If you would rather test it on your local machine, please follow the installation guide. The pre-trained models are available in model/pretrained.
