Here you will find the papers we have discussed, useful links and resources, and a summary table of the benefits and drawbacks of some of these methodologies.
Before presenting a paper, please add it to the papers folder, naming the PDF after its title (`<title>.pdf`), so that we keep a copy and avoid presenting the same paper twice.
Don't forget to check the Wiki for videos and references.
Date | Title of paper | Source |
---|---|---|
2019-01-09 | Enriching Word Vectors with Subword Information | aclweb |
2019-01-23 | Word Mover’s Embedding: From Word2Vec to Document Embedding | aclweb |
2019-02-06 | Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank | Stanford University |
2019-02-20 | Deep contextualized word representations | arxiv |
2019-03-06 | Efficient Estimation of Word Representations in Vector Space | arxiv |
2019-03-20 | Distributed Representations of Words and Phrases and their Compositionality | arxiv |
2019-06-26 | Distributed Representations of Sentences and Documents | Stanford University |
2019-07-24 | Latent Dirichlet Allocation | Stanford |
2019-07-31 | Mixing Dirichlet Topic Models and Word Embeddings to Make lda2vec | arxiv |
2019-09-18 | GloVe: Global Vectors for Word Representation | Stanford University |
2019-10-03 | Algorithms for Non-negative Matrix Factorization | paper |
2019-10-23 | Effective Approaches to Attention-based Neural Machine Translation | arxiv |
2019-11-21 | Attention Is All You Need | arxiv |
2020-02-12 | Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks | arxiv |
2020-02-26 | XLNet: Generalized Autoregressive Pretraining for Language Understanding | arxiv |
- CNN: Understanding Convolutional Neural Networks for Text Classification (aclweb)
- LSTM: An LSTM Approach to Short Text Sentiment Classification with Word Embeddings (aclweb)
- ELMo (revisit): Deep contextualized word representations (arxiv)
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (arxiv)
- Severn
- On the dimensionality of Word Embeddings (how many dimensions are required?)
- Visualizing Data using t-SNE (Journal of Machine Learning Research)
- Universal Sentence Encoder (Google)