# PyTorch Sentiment Analysis

This repo contains tutorials covering how to understand and implement sequence classification models using [PyTorch](https://github.com/pytorch/pytorch) with Python 3.9. Specifically, we'll train models to predict sentiment from movie reviews.

**If you find any mistakes or disagree with any of the explanations, please do not hesitate to [submit an issue](https://github.com/bentrevett/pytorch-sentiment-analysis/issues/new). I welcome any feedback, positive or negative!**

## Getting Started

Install the required dependencies with: `pip install -r requirements.txt --upgrade`.
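
To sanity-check the environment afterwards, you can confirm the libraries the notebooks rely on import cleanly. This is a minimal sketch that assumes `requirements.txt` provides `torch`, `torchtext`, `datasets` and `transformers`; the exact pinned versions may differ.

```python
# quick environment check: confirm the libraries used in the notebooks are importable
# (assumes requirements.txt installs torch, torchtext, datasets and transformers)
import torch
import torchtext
import datasets
import transformers

print("torch:", torch.__version__)
print("torchtext:", torchtext.__version__)
print("datasets:", datasets.__version__)
print("transformers:", transformers.__version__)
print("CUDA available:", torch.cuda.is_available())
```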

## Tutorials

- 1 - [Neural Bag of Words](https://github.com/bentrevett/pytorch-sentiment-analysis/blob/master/1%20-%20Simple%20Sentiment%20Analysis.ipynb) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/bentrevett/pytorch-sentiment-analysis/blob/master/1%20-%20Simple%20Sentiment%20Analysis.ipynb)

  This tutorial covers the workflow of a sequence classification project with PyTorch. We'll cover the basics of sequence classification using a simple but effective neural bag-of-words model, and how to use the datasets/torchtext libraries to simplify data loading and preprocessing. A minimal sketch of this model is shown after this list.

- 2 - [Recurrent Neural Networks](https://github.com/bentrevett/pytorch-sentiment-analysis/blob/master/2%20-%20Upgraded%20Sentiment%20Analysis.ipynb) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/bentrevett/pytorch-sentiment-analysis/blob/master/2%20-%20Upgraded%20Sentiment%20Analysis.ipynb)

  Now that we have the basic sequence classification workflow covered, this tutorial focuses on improving our results by switching to a recurrent neural network (RNN) model. We'll cover the theory behind RNNs and look at an implementation of the long short-term memory (LSTM) RNN, one of the most common RNN variants; a stripped-down version is sketched after this list.

- 3 - [Convolutional Neural Networks](https://github.com/bentrevett/pytorch-sentiment-analysis/blob/master/4%20-%20Convolutional%20Sentiment%20Analysis.ipynb) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/bentrevett/pytorch-sentiment-analysis/blob/master/4%20-%20Convolutional%20Sentiment%20Analysis.ipynb)

  Next, we'll cover convolutional neural networks (CNNs) for sentiment analysis. This model will be an implementation of [Convolutional Neural Networks for Sentence Classification](https://arxiv.org/abs/1408.5882); a compressed sketch of the approach follows this list.

- 4 - [Transformers](https://github.com/bentrevett/pytorch-sentiment-analysis/blob/master/6%20-%20Transformers%20for%20Sentiment%20Analysis.ipynb) [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/bentrevett/pytorch-sentiment-analysis/blob/master/6%20-%20Transformers%20for%20Sentiment%20Analysis.ipynb)

  Finally, we'll show how to use the transformers library to load a pre-trained transformer model, specifically the BERT model from [this](https://arxiv.org/abs/1810.04805) paper, and use it for sequence classification; a short usage sketch follows this list.
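
As a preview of tutorial 1, here is a minimal sketch of a neural bag-of-words classifier: token indices are embedded, the embeddings are averaged over the sequence, and a linear layer maps the pooled vector to class logits. The class name and all sizes below are illustrative, not the notebook's exact code, and the data loading/preprocessing covered in the notebook is omitted.

```python
import torch
import torch.nn as nn

class NBoW(nn.Module):
    """Neural bag-of-words: embed tokens, average them, classify the average."""

    def __init__(self, vocab_size, embedding_dim, output_dim, pad_index):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=pad_index)
        self.fc = nn.Linear(embedding_dim, output_dim)

    def forward(self, ids):
        # ids: [batch size, seq len]
        embedded = self.embedding(ids)  # [batch size, seq len, embedding dim]
        pooled = embedded.mean(dim=1)   # average over the sequence
        return self.fc(pooled)          # [batch size, output dim]

# illustrative sizes and a fake batch of 8 "reviews", 50 token ids each
model = NBoW(vocab_size=25_000, embedding_dim=300, output_dim=2, pad_index=0)
logits = model(torch.randint(0, 25_000, (8, 50)))
print(logits.shape)  # torch.Size([8, 2])
```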
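
A stripped-down LSTM classifier in the spirit of tutorial 2: embed the tokens, run an LSTM over them, and classify from the final hidden state. Again a sketch with illustrative names and sizes, not the notebook's code.

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Embed tokens, run an LSTM over them, classify from the final hidden state."""

    def __init__(self, vocab_size, embedding_dim, hidden_dim, output_dim, pad_index):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=pad_index)
        self.lstm = nn.LSTM(embedding_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, output_dim)

    def forward(self, ids):
        embedded = self.embedding(ids)                # [batch size, seq len, embedding dim]
        output, (hidden, cell) = self.lstm(embedded)  # hidden: [num layers, batch size, hidden dim]
        return self.fc(hidden[-1])                    # classify from the last layer's final state

model = LSTMClassifier(vocab_size=25_000, embedding_dim=300, hidden_dim=256, output_dim=2, pad_index=0)
print(model(torch.randint(0, 25_000, (8, 50))).shape)  # torch.Size([8, 2])
```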
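
Tutorial 3 follows the paper above: run 1-dimensional convolutions of several widths over the embedded sequence, max-pool each feature map over time, and classify the concatenated features. A compressed sketch with illustrative sizes; regularization is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TextCNN(nn.Module):
    """Convolutions over embedded n-grams, max-pooled over time, then a linear classifier."""

    def __init__(self, vocab_size, embedding_dim, n_filters, filter_sizes, output_dim, pad_index):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=pad_index)
        self.convs = nn.ModuleList(
            [nn.Conv1d(embedding_dim, n_filters, kernel_size=fs) for fs in filter_sizes]
        )
        self.fc = nn.Linear(n_filters * len(filter_sizes), output_dim)

    def forward(self, ids):
        embedded = self.embedding(ids).permute(0, 2, 1)        # [batch, embedding dim, seq len]
        conved = [F.relu(conv(embedded)) for conv in self.convs]
        pooled = [t.max(dim=-1).values for t in conved]        # max over time for each filter width
        return self.fc(torch.cat(pooled, dim=-1))

model = TextCNN(vocab_size=25_000, embedding_dim=300, n_filters=100,
                filter_sizes=[3, 4, 5], output_dim=2, pad_index=0)
print(model(torch.randint(0, 25_000, (8, 50))).shape)  # torch.Size([8, 2])
```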
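
For tutorial 4, the transformers library provides pre-trained models with a sequence classification head. A minimal sketch using the `bert-base-uncased` checkpoint; the notebook's exact checkpoint and training setup may differ, and the classification head loaded here is randomly initialised, so it would still need fine-tuning before its predictions mean anything.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# downloads pre-trained weights on first use; the classification head is newly initialised
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

inputs = tokenizer(
    ["This film was great!", "This film was terrible..."],
    padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits  # [batch size, num labels]
print(logits.shape)  # torch.Size([2, 2])
```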

## Legacy Tutorials

Previous versions of these tutorials used features from the torchtext library which are no longer available. These are stored in the [legacy](https://github.com/bentrevett/pytorch-sentiment-analysis/tree/main/legacy) directory.

## References

Here are some things I looked at while making these tutorials. Some of it may be out of date.

- http://anie.me/On-Torchtext/
- http://mlexplained.com/2018/02/08/a-comprehensive-tutorial-to-torchtext/
- https://github.com/spro/practical-pytorch
- https://gist.github.com/Tushar-N/dfca335e370a2bc3bc79876e6270099e
- https://gist.github.com/HarshTrivedi/f4e7293e941b17d19058f6fb90ab0fec
- https://github.com/keras-team/keras/blob/master/examples/imdb_fasttext.py
- https://github.com/Shawn1993/cnn-text-classification-pytorch