Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Updated May 3, 2024 · Jupyter Notebook
Minimalist NMT for educational purposes
Repository containing the code for my bachelor thesis on Neural Machine Translation
Neural Machine Translation using LSTMs and an attention mechanism. Two models were implemented: one without attention, using a repeat vector, and the other using an encoder-decoder architecture with attention.
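The two decoding strategies above differ only in how the decoder gets its context: the repeat-vector approach copies the encoder's final state to every decoder step, while attention recomputes a weighted mix of all encoder states per step. A minimal NumPy sketch of that contrast (shapes and values are made up for illustration; this is not the repository's code):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def repeat_vector_context(enc_final, tgt_len):
    # repeat-vector approach: the encoder's final state is simply
    # copied for every decoder time step (no attention)
    return np.tile(enc_final, (tgt_len, 1))

def attention_context(enc_states, dec_state):
    # dot-product attention: score every encoder state against the
    # current decoder state, normalise, take the weighted sum
    scores = enc_states @ dec_state           # (src_len,)
    weights = softmax(scores)                 # attention distribution
    context = weights @ enc_states            # (hidden,)
    return context, weights

# toy example: 5 source steps, hidden size 8
rng = np.random.default_rng(0)
enc_states = rng.normal(size=(5, 8))
dec_state = rng.normal(size=(8,))

ctx, w = attention_context(enc_states, dec_state)
rep = repeat_vector_context(enc_states[-1], tgt_len=3)
```

With attention the context vector changes at every decoder step as `dec_state` changes, whereas the repeated vector is identical at every step.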
A study of NLP models that translate Korean into English.
French to English neural machine translation trained on multi30k dataset.
Implementation of Selected Published Papers from AI, RL, NLP Conferences and reputed Journals
Interpretation of an English autoencoder (seq2seq model).
REST API for training and prediction with a seq2seq model
💬 Sequence to Sequence from Scratch Using Pytorch
A sequence-to-sequence model implemented in PyTorch
A PyTorch implementation of the hierarchical encoder-decoder architecture (HRED) introduced in Sordoni et al. (2015), built for modeling conversation triples in the MovieTriples dataset.
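HRED's two-level idea can be sketched without any framework: a first RNN encodes each utterance into a vector, then a second RNN runs over those utterance vectors to summarise the dialogue. The sketch below uses plain NumPy with made-up weights and a tanh-RNN cell; it is an illustration of the hierarchy, not the repository's implementation:

```python
import numpy as np

def rnn_encode(xs, Wx, Wh):
    # minimal tanh-RNN: fold a sequence of vectors into one final state
    h = np.zeros(Wh.shape[0])
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h)
    return h

rng = np.random.default_rng(1)
emb, hid = 6, 4
Wx_u, Wh_u = rng.normal(size=(hid, emb)), rng.normal(size=(hid, hid))  # utterance level
Wx_c, Wh_c = rng.normal(size=(hid, hid)), rng.normal(size=(hid, hid))  # context level

# a conversation "triple": three utterances of token embeddings
triple = [rng.normal(size=(n, emb)) for n in (3, 5, 2)]

# level 1: encode each utterance independently
utt_vecs = [rnn_encode(u, Wx_u, Wh_u) for u in triple]

# level 2: a context RNN runs over the utterance vectors; its final
# state summarises the dialogue and would condition the decoder
# that generates the next utterance
context = rnn_encode(utt_vecs, Wx_c, Wh_c)
```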
ICLR_2018_Reproducibility_Challenge : Sketch-RNN
Paper implementation of the attention mechanism in neural networks