


georgezoto/RNN-LSTM-NLP-Sequence-Models


RNN-LSTM-NLP-Sequence-Models

Sequence Models repository for all projects and programming assignments of Course 5 of 5 of the Deep Learning Specialization, offered on Coursera and taught by Andrew Ng. It covers topics such as Recurrent Neural Networks (RNN), Gated Recurrent Units (GRU), Long Short-Term Memory (LSTM), Natural Language Processing, Word Representations and Embeddings, and Attention Models.

I loved implementing cool applications including Character-Level Language Modeling, Text and Music Generation, Sentiment Classification, Debiasing Word Embeddings, Speech Recognition, and Trigger Word Detection. I had a wonderful time using the Google Cloud Platform (GCP) and the deep learning frameworks Keras and TensorFlow.
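The character-level language modeling and music generation assignments are built on the LSTM cell. As a rough illustration only (this is not the assignment code; the parameter names and toy dimensions are made up here), a single LSTM forward step in NumPy looks like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_cell_forward(xt, a_prev, c_prev, params):
    """One LSTM time step: gates control what the cell state
    forgets, stores, and exposes as the hidden state."""
    # Stack previous hidden state on top of the current input
    concat = np.concatenate([a_prev, xt], axis=0)
    ft = sigmoid(params["Wf"] @ concat + params["bf"])   # forget gate
    it = sigmoid(params["Wi"] @ concat + params["bi"])   # update (input) gate
    cct = np.tanh(params["Wc"] @ concat + params["bc"])  # candidate cell state
    c_next = ft * c_prev + it * cct                      # new cell state
    ot = sigmoid(params["Wo"] @ concat + params["bo"])   # output gate
    a_next = ot * np.tanh(c_next)                        # new hidden state
    return a_next, c_next

# Toy dimensions: 3 input features, 5 hidden units, batch of 2
rng = np.random.default_rng(0)
n_x, n_a, m = 3, 5, 2
params = {k: rng.standard_normal((n_a, n_a + n_x)) * 0.1
          for k in ("Wf", "Wi", "Wc", "Wo")}
params.update({b: np.zeros((n_a, 1)) for b in ("bf", "bi", "bc", "bo")})
a, c = lstm_cell_forward(rng.standard_normal((n_x, m)),
                         np.zeros((n_a, m)), np.zeros((n_a, m)), params)
print(a.shape, c.shape)  # (5, 2) (5, 2)
```

In the course itself these recurrences are handled by Keras layers such as `LSTM`, but writing the cell out once makes the gate equations concrete.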

Papers on RNN, GRU, LSTM, NLP, and Sequence Models:

  • 2014 GRU On the Properties of Neural Machine Translation: Encoder-Decoder Approaches - Cho, Merrienboer, Bahdanau, Bengio
  • 2014 GRU Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling - Chung, Gulcehre, Cho, Bengio
  • 1997 LSTM Long Short-Term Memory - Sepp Hochreiter, Jürgen Schmidhuber
  • 2008 t-SNE Visualizing Data using t-SNE - Laurens van der Maaten, Geoffrey Hinton
  • 2013 Linguistic Regularities in Continuous Space Word Representations - Mikolov, Yih, Zweig
  • 2003 A Neural Probabilistic Language Model - Bengio, Ducharme, Vincent, Jauvin
  • 2013 Word2Vec CBOW Skip-gram Efficient Estimation of Word Representations in Vector Space - Mikolov, Chen, Corrado, Dean
  • 2013 Word2Vec Negative Sampling Distributed Representations of Words and Phrases and their Compositionality - Mikolov, Sutskever, Chen, Corrado, Dean
  • 2014 GloVe - Global Vectors for Word Representation - Pennington, Socher, Manning
  • 2016 Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings - Bolukbasi, Chang, Zou, Saligrama, Kalai
  • 2014 Sequence to Sequence Learning with Neural Networks - Sutskever, Vinyals, Le
  • 2014 Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation - Cho, Merrienboer, Gulcehre, Bahdanau, Bougares, Schwenk, Bengio
  • 2015 Deep Captioning with Multimodal Recurrent Neural Networks (m-RNN) - Mao, Xu, Yang, Wang, Huang, Yuille
  • 2014 Show and Tell - A Neural Image Caption Generator - Vinyals, Toshev, Bengio, Erhan
  • 2015 Deep Visual-Semantic Alignments for Generating Image Descriptions - Karpathy, Li
  • 2002 BLEU (bilingual evaluation understudy) - a Method for Automatic Evaluation of Machine Translation - Papineni, Roukos, Ward, Zhu
  • 2015 Attention Model Neural Machine Translation by Jointly Learning to Align and Translate - Bahdanau, Cho, Bengio
  • 2006 CTC Connectionist Temporal Classification - Labelling Unsegmented Sequence Data with Recurrent Neural Networks - Graves, Fernandez, Gomez, Schmidhuber
  • 2014 DeepFace - Closing the Gap to Human-Level Performance in Face Verification - Taigman, Yang, Ranzato, Wolf
  • 2016 Show, Attend and Tell - Neural Image Caption Generation with Visual Attention - Xu, Ba, Kiros, Cho, Courville, Salakhutdinov, Zemel, Bengio
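The neural machine translation assignment follows the Bahdanau, Cho, and Bengio paper above: each decoder step scores every encoder hidden state, softmaxes the scores into attention weights, and takes a weighted sum as the context vector. A minimal NumPy sketch of that additive attention (weight names and toy dimensions are illustrative, not the notebook's Keras implementation):

```python
import numpy as np

def additive_attention(s_prev, h, Wa, Ua, va):
    """Bahdanau-style additive attention.
    h: (T, n_h) encoder hidden states; s_prev: (n_s,) previous decoder state.
    Returns the context vector (n_h,) and the attention weights (T,)."""
    # Alignment scores: e_j = va . tanh(Wa s_prev + Ua h_j)
    scores = np.tanh(s_prev @ Wa.T + h @ Ua.T) @ va
    # Softmax over the T time steps (numerically stabilized)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted sum of encoder states
    context = weights @ h
    return context, weights

# Toy dimensions: 4 source time steps, 6 encoder units, 5 decoder units
rng = np.random.default_rng(1)
T, n_h, n_s, n_att = 4, 6, 5, 3
h = rng.standard_normal((T, n_h))
ctx, w = additive_attention(rng.standard_normal(n_s), h,
                            rng.standard_normal((n_att, n_s)),
                            rng.standard_normal((n_att, n_h)),
                            rng.standard_normal(n_att))
print(ctx.shape, round(w.sum(), 6))  # (6,) 1.0
```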


Recurrent Neural Networks and Long Short-Term Memory Networks Resources

Word2Vec Tutorials and Resources
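The word-analogy trick from Mikolov, Yih, and Zweig ("Linguistic Regularities...") reduces to vector arithmetic plus cosine similarity: to answer "a is to b as c is to ?", find the word whose embedding is closest to e_b - e_a + e_c. A toy sketch with hand-made 2-D vectors (real word2vec/GloVe embeddings have 50-300 dimensions; every value here is invented for illustration):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors."""
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

# Hand-made 2-D "embeddings" where the second axis loosely encodes gender
emb = {
    "man":   np.array([1.0, 0.0]),
    "woman": np.array([1.0, 1.0]),
    "king":  np.array([3.0, 0.1]),
    "queen": np.array([3.0, 1.1]),
}

def analogy(a, b, c, emb):
    """Solve a : b :: c : ? by searching for the word nearest to e_b - e_a + e_c."""
    target = emb[b] - emb[a] + emb[c]
    candidates = {w: v for w, v in emb.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine_similarity(target, candidates[w]))

print(analogy("man", "woman", "king", emb))  # queen
```

The same cosine-similarity machinery underlies the debiasing assignment, where gender-neutral words are projected off the bias direction.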

Links:

https://www.coursera.org/learn/nlp-sequence-models
https://www.coursera.org/specializations/deep-learning
https://www.deeplearning.ai
https://www.coursera.org/account/accomplishments/verify/PA3E5G7YQXNM
https://www.coursera.org/account/accomplishments/specialization/TYHX7MGWHFGT
https://www.youtube.com/watch?v=ggQ1y1UHOvc

