Train a Seq2Seq Model with Attention to Translate from One Language to Another
Updated Apr 21, 2020 · Python
A deep learning model that achieves video super-resolution tasks with temporal and spatial attention in cascade
This GitHub repository implements Neural Machine Translation (NMT) with sequence-to-sequence networks, focusing on improving translation quality through progressively more advanced architectures.
Sequence to sequence encoder-decoder model with Attention for Neural Machine Translation
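The core idea behind attention in such seq2seq models can be sketched with plain NumPy: at each decoding step, score every encoder state against the current decoder state, normalize the scores with a softmax, and take the weighted sum as a context vector. This is a minimal, framework-free illustration, not the repository's actual implementation; the shapes and values are made up for the example.

```python
import numpy as np

def attention_context(decoder_state, encoder_states):
    """Dot-product attention: score each encoder state against the
    current decoder state, softmax the scores, and return the
    weighted sum (context vector) plus the attention weights."""
    scores = encoder_states @ decoder_state          # (T,)
    scores = scores - scores.max()                   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over time steps
    context = weights @ encoder_states               # (H,)
    return context, weights

# Toy example: 4 encoder time steps, hidden size 3 (values are illustrative)
enc = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]])
dec = np.array([1.0, 0.0, 0.0])
ctx, w = attention_context(dec, enc)
```

Real implementations typically add a learned scoring function (Bahdanau or Luong style) and batch the computation, but the softmax-then-weighted-sum structure is the same.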
Chatbot using Transformer Model and DialoGPT
The Positional Encoder Decoder is a Visual Basic .NET class that provides functionality for encoding and decoding tokens and sentences using positional embeddings. It allows you to convert between string tokens and their corresponding embeddings, and vice versa.
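Although that repository is written in Visual Basic .NET, the standard scheme such a class would implement is the sinusoidal positional encoding from the Transformer: even embedding dimensions use sine, odd dimensions use cosine, with geometrically increasing wavelengths. A Python sketch, assuming the usual 10000 base constant:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional embeddings: pe[pos, 2i]   = sin(pos / 10000^(2i/d)),
                                          pe[pos, 2i+1] = cos(pos / 10000^(2i/d))."""
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]     # (1, d_model // 2)
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.empty((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)             # even dimensions
    pe[:, 1::2] = np.cos(angles)             # odd dimensions
    return pe

pe = positional_encoding(seq_len=50, d_model=16)
```

Because the encoding is deterministic, decoding a position back from an embedding is a nearest-neighbor lookup against this table.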
Generate Images from text prompt using Stable Diffusion Model
A seq2seq model with an LSTM-based encoder-decoder architecture that generates metadata from code snippets to support software maintenance
This GitHub repository implements a deep learning model that generates spoken captions for images.
Deep convolutional encoder-decoder architecture with max-pooling indices for pixel-wise semantic segmentation on the CamVid dataset.
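"Max-pooling indices" refers to the SegNet-style trick of remembering where each pooling maximum came from, so the decoder can upsample by placing values back at those positions instead of learning an upsampling layer. A minimal NumPy sketch of the idea (a real implementation would use a framework's pooling ops; this toy input is made up):

```python
import numpy as np

def max_pool_with_indices(x, k=2):
    """k-by-k max pooling that also records the flat index of each maximum."""
    h, w = x.shape
    pooled = np.zeros((h // k, w // k))
    indices = np.zeros((h // k, w // k), dtype=int)
    for i in range(h // k):
        for j in range(w // k):
            window = x[i*k:(i+1)*k, j*k:(j+1)*k]
            local = np.argmax(window)
            pooled[i, j] = window.flat[local]
            # convert the window-local argmax to a flat index into x
            r, c = divmod(local, k)
            indices[i, j] = (i*k + r) * w + (j*k + c)
    return pooled, indices

def max_unpool(pooled, indices, shape):
    """Place each pooled value back at its recorded position;
    every other entry stays zero (sparse, non-learned upsampling)."""
    out = np.zeros(shape)
    out.flat[indices.ravel()] = pooled.ravel()
    return out

x = np.array([[1., 2., 0., 4.],
              [3., 0., 1., 0.],
              [0., 5., 6., 0.],
              [7., 0., 0., 8.]])
pooled, idx = max_pool_with_indices(x)
restored = max_unpool(pooled, idx, x.shape)
```

The decoder's convolutions then densify the sparse unpooled map, which is why SegNet needs no transposed convolutions for upsampling.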
My implementation of autoencoders
Enhances image and video quality with an encoder-decoder neural network
Repository for code written during my independent study with Prof. Fiterau @ UMass Amherst in Fall 2020.
Projects developed during the NLP course
Simple Implementation of the Transformer Architecture
Caption Images with Machine Learning
Encoder-decoder seq2seq model built on an LSTM and inception_v3 to caption images