Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Code for the paper "STConvS2S: Spatiotemporal Convolutional Sequence to Sequence Network for Weather Forecasting" (Neurocomputing, Elsevier)
An Implementation of Transformer (Attention Is All You Need) in DyNet
Handwriting Trajectory Recovery using End-to-End Deep Encoder-Decoder Network, ICPR 2018.
Encoder-Decoder model for Semantic Role Labeling
Grounded Sequence-to-Sequence Transduction Team at JSALT 2018
Sequence to sequence encoder-decoder model with Attention for Neural Machine Translation
An implementation of the paper "Neural Machine Translation by Jointly Learning to Align and Translate"
A framework for retrieving continuous chunk-level emotions via emo-rankers for Seq2Seq speech emotion recognition (SER)
A sequence-to-sequence Transformer implementation for training a model that translates Cape Verdean Creole to English.
A neural machine translation model built with sequence-to-sequence modeling that translates English sentences into their corresponding German translations.
A deep learning model that translates English words and sentences into their corresponding French translations.
A concise summary generator for Amazon product reviews, built with Transformers, that preserves the original semantics and user sentiment
An encoder-decoder sequence-to-sequence (Seq2Seq) model that summarizes the full text of an Indian news article into a short paragraph with a limited number of words.
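Most of the projects listed above share the same encoder-decoder-with-attention skeleton. The following is a minimal PyTorch sketch of that pattern, not the code of any specific repository; all class names, layer sizes, and the additive (Bahdanau-style) scoring function are illustrative assumptions.

```python
# Minimal encoder-decoder with additive attention (illustrative sketch,
# not taken from any repository above). Sizes and names are assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) -> outputs: (batch, src_len, hid_dim)
        return self.rnn(self.embed(src))

class Attention(nn.Module):
    def __init__(self, hid_dim):
        super().__init__()
        self.score = nn.Linear(hid_dim * 2, 1)

    def forward(self, dec_hidden, enc_outputs):
        # dec_hidden: (batch, hid_dim); enc_outputs: (batch, src_len, hid_dim)
        src_len = enc_outputs.size(1)
        query = dec_hidden.unsqueeze(1).expand(-1, src_len, -1)
        energy = self.score(torch.cat([query, enc_outputs], dim=-1)).squeeze(-1)
        weights = torch.softmax(energy, dim=-1)                 # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs)  # (batch, 1, hid)
        return context, weights

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = Attention(hid_dim)
        self.rnn = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tok, hidden, enc_outputs):
        # tok: (batch, 1) current target token; hidden: (1, batch, hid_dim)
        context, _ = self.attn(hidden[-1], enc_outputs)
        rnn_in = torch.cat([self.embed(tok), context], dim=-1)
        output, hidden = self.rnn(rnn_in, hidden)
        return self.out(output.squeeze(1)), hidden  # logits: (batch, vocab)

# One decoding step over a toy batch.
torch.manual_seed(0)
enc, dec = Encoder(100, 32, 64), Decoder(100, 32, 64)
src = torch.randint(0, 100, (2, 7))  # batch of 2 source sentences, length 7
enc_out, hidden = enc(src)
logits, hidden = dec(torch.zeros(2, 1, dtype=torch.long), hidden, enc_out)
print(logits.shape)  # per-token vocabulary logits for each batch element
```

At inference time the decoder step is called in a loop, feeding back the argmax (greedy) or beam-searched token; Transformer-based repositories in this list replace the GRU and additive attention with self-attention layers but keep the same encoder-decoder split.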