Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Implementations of a family of attention mechanisms, suitable for a wide range of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
An implementation of the Transformer (Attention Is All You Need) in DyNet
Code for the paper "STConvS2S: Spatiotemporal Convolutional Sequence to Sequence Network for Weather Forecasting" (Neurocomputing, Elsevier)
Encoder-Decoder model for Semantic Role Labeling
Handwriting Trajectory Recovery using End-to-End Deep Encoder-Decoder Network, ICPR 2018.
A framework for retrieving continuous chunk-level emotions via emo-rankers for Seq2Seq speech emotion recognition (SER)
Sequence-to-sequence Transformer implementation for training a model to translate Cape Verdean Creole to English.
Grounded Sequence-to-Sequence Transduction Team at JSALT 2018
A deep learning model that translates English words and sentences into French.
An implementation of the paper "Neural Machine Translation by Jointly Learning to Align and Translate"
An encoder-decoder sequence-to-sequence (Seq2Seq) model that summarizes Indian news articles into a short paragraph with a limited number of words.
A neural machine translation model using sequence-to-sequence modeling that translates English sentences into German.
A concise summary generator for product reviews, built with Transformers, that preserves the original semantic content and user sentiment
Sequence to sequence encoder-decoder model with Attention for Neural Machine Translation
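Several of the repositories above implement attention-based encoder-decoder models for translation, in the style of "Neural Machine Translation by Jointly Learning to Align and Translate". As a minimal, framework-agnostic sketch (not the code of any specific repository; all names and matrix shapes here are illustrative assumptions), the additive attention step at the core of such models can be written as:

```python
import numpy as np

def additive_attention(query, keys, W_q, W_k, v):
    """Bahdanau-style additive attention (illustrative sketch).

    query: (d,)   current decoder hidden state
    keys:  (T, d) encoder hidden states, one per source token
    W_q, W_k: (d, d) learned projections; v: (d,) learned score vector
    Returns the context vector and the attention weights.
    """
    # Unnormalized score for each source position t:
    #   score_t = v . tanh(W_q @ query + W_k @ key_t)
    scores = np.tanh(query @ W_q + keys @ W_k) @ v   # shape (T,)
    # Softmax over source positions (stabilized by subtracting the max)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: attention-weighted sum of encoder states
    context = weights @ keys                          # shape (d,)
    return context, weights

# Toy usage with random parameters (hypothetical sizes)
rng = np.random.default_rng(0)
d, T = 8, 5
q = rng.standard_normal(d)
K = rng.standard_normal((T, d))
W_q = rng.standard_normal((d, d))
W_k = rng.standard_normal((d, d))
v = rng.standard_normal(d)

ctx, w = additive_attention(q, K, W_q, W_k, v)
```

In a full seq2seq model the context vector would be concatenated with the decoder input at each step; the PyTorch repositories above typically wrap this logic in an `nn.Module` and batch it over time.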