Summaries and notes on deep learning research papers in the natural language processing (NLP) domain.
Updated Dec 3, 2017
TensorFlow implementation of Match-LSTM and Answer pointer for the popular SQuAD dataset.
Fully batched seq2seq example based on practical-pytorch, with additional features.
A collection of TensorFlow deep learning models.
Vietnamese and Chinese to English translation.
Basic seq2seq models, including the simplest encoder-decoder and attention-based variants.
A search application based on image caption generation.
Convolutional sequence-to-sequence models for handwritten text recognition.
TensorFlow 2.0 implementation of neural machine translation with Bahdanau attention.
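For reference, Bahdanau (additive) attention scores each encoder state h_i against the decoder state s as score_i = v^T tanh(W1 s + W2 h_i), then takes a softmax-weighted sum. A minimal NumPy sketch follows; the shapes and names are illustrative assumptions, not code from the repository above:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def bahdanau_attention(decoder_state, encoder_states, W1, W2, v):
    # decoder_state:  (d,)    current decoder hidden state s
    # encoder_states: (T, h)  encoder hidden states h_1..h_T
    # W1: (a, d), W2: (a, h), v: (a,)  learned parameters (random here)
    scores = np.tanh(decoder_state @ W1.T + encoder_states @ W2.T) @ v  # (T,)
    weights = softmax(scores)            # attention distribution over source steps
    context = weights @ encoder_states   # (h,) weighted sum of encoder states
    return context, weights

# Toy usage with random parameters
rng = np.random.default_rng(0)
T, d, h, a = 5, 4, 6, 8
context, weights = bahdanau_attention(
    rng.normal(size=d), rng.normal(size=(T, h)),
    rng.normal(size=(a, d)), rng.normal(size=(a, h)), rng.normal(size=a))
print(weights.round(3), context.shape)
```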
A simple attention-based deep learning model that answers questions about a given video, returning the most relevant video intervals as answers.
Text Summarizer implemented in PyTorch
Analysis of 'Attention is not Explanation' performed for the University of Amsterdam's Fairness, Accountability, Confidentiality and Transparency in AI Course Assignment, January 2020
Generates a summary of a given news article using an attention-based seq2seq encoder-decoder model.
Image to LaTeX (Seq2seq + Attention with Beam Search) - TensorFlow
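As context for the beam-search decoding mentioned above, here is a generic Python sketch of the algorithm: keep the top-k partial sequences by cumulative log-probability and expand them step by step. The step_fn interface and the toy decoder are illustrative assumptions, not taken from that repository:

```python
import heapq
import math

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=20):
    """Generic beam search over a token-by-token decoder.
    step_fn(prefix) returns a list of (token, log_prob) continuations.
    Returns the best finished sequence and its cumulative log-probability."""
    beams = [(0.0, [start_token])]          # (cumulative log-prob, sequence)
    finished = []
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == end_token:        # already finished: set aside
                finished.append((score, seq))
                continue
            for token, logp in step_fn(seq):
                candidates.append((score + logp, seq + [token]))
        if not candidates:
            break
        beams = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
    finished.extend(b for b in beams if b[1][-1] == end_token)
    best = max(finished or beams, key=lambda c: c[0])
    return best[1], best[0]

# Toy decoder: every prefix can continue with "a", "b", or end with "</s>"
probs = {"a": 0.5, "b": 0.2, "</s>": 0.3}
toy_step = lambda prefix: [(t, math.log(p)) for t, p in probs.items()]
print(beam_search(toy_step, "<s>", "</s>"))
```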
English-Hindi translation with attention. WIP
Three different implementations for neural machine translation
Neural Machine Translation with Keras
A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need
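The core operation of the Transformer described in "Attention Is All You Need" is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch is below; shapes and names are illustrative and not drawn from that Keras+TensorFlow implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v).
    Returns (n_q, d_v): each query attends over all keys."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # (n_q, n_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
out = scaled_dot_product_attention(rng.normal(size=(3, 8)),
                                   rng.normal(size=(5, 8)),
                                   rng.normal(size=(5, 16)))
print(out.shape)  # (3, 16)
```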
Attention-based end-to-end ASR on TIMIT in PyTorch