A PyTorch implementation of a transformer network trained using back-translation
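The back-translation idea behind this repo can be sketched in a few lines: target-language monolingual text is run through a backward translation model to create synthetic source sentences, and the resulting pairs augment the parallel training data. The `translate_to_fr` / `translate_to_en` functions below are toy stand-ins (not from this repository) for trained backward/forward models, included only to make the data flow concrete.

```python
# Minimal sketch of back-translation data augmentation.
# `translate_to_fr` and `translate_to_en` are hypothetical stand-ins for
# trained translation models; they are NOT part of the repository above.

def translate_to_fr(sentence: str) -> str:
    # Toy stand-in for the backward (en -> fr) model.
    lookup = {"hello world": "bonjour le monde"}
    return lookup.get(sentence, sentence)


def back_translate(monolingual_en):
    """Create synthetic (source, target) pairs from target-side monolingual text.

    The backward model translates each target-language (English) sentence
    into the source language (French); the synthetic pair is then added to
    the parallel data used to train the forward fr -> en model.
    """
    pairs = []
    for tgt in monolingual_en:
        synthetic_src = translate_to_fr(tgt)  # backward model produces source
        pairs.append((synthetic_src, tgt))    # forward model trains on this pair
    return pairs
```

In a real pipeline the stand-in functions would be replaced by beam-search decoding from a trained transformer, but the pairing logic is the same.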
Updated May 13, 2019 · Python
Implementation of a basic conversational agent (a.k.a. chatbot) using the PyTorch Transformer module
Code and write-up for the Red Dragon AI Advanced NLP Course.
This repository contains my research work on building state-of-the-art next-basket recommendations using techniques such as autoencoders, TF-IDF, attention-based Bi-LSTMs, and Transformer networks
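Of the techniques listed above, TF-IDF is the simplest baseline signal, so a minimal sketch may help: each item (term) in a user's basket (document) is weighted by how frequent it is for that user and how rare it is across all users. This is a plain-Python illustration under those assumptions, not the repository's implementation; production code would typically use an optimized library such as scikit-learn's `TfidfVectorizer`.

```python
import math
from collections import Counter


def tf_idf(docs):
    """Compute TF-IDF weights for a list of tokenized documents.

    A minimal sketch: tf is the term's relative frequency within a
    document, idf is log(N / df) where df counts the documents that
    contain the term. Returns one {term: weight} dict per document.
    """
    n_docs = len(docs)
    # Document frequency: in how many documents does each term appear?
    df = Counter(term for doc in docs for term in set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({
            term: (count / len(doc)) * math.log(n_docs / df[term])
            for term, count in tf.items()
        })
    return weights
```

Note that a term appearing in every document gets idf = log(1) = 0, so it carries no ranking signal.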
A list of efficient attention modules
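The modules collected in that list approximate or restructure standard scaled dot-product attention (Vaswani et al., 2017), whose cost is quadratic in sequence length. As a reference point, here is a minimal NumPy sketch of the exact computation those efficient variants target; it is an illustration, not code from the listed repositories.

```python
import numpy as np


def scaled_dot_product_attention(q, k, v):
    """Standard scaled dot-product attention.

    q: (n_q, d), k: (n_k, d), v: (n_k, d_v); returns (n_q, d_v).
    The (n_q, n_k) score matrix is the quadratic-cost term that
    efficient attention variants approximate or sparsify.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                 # similarity logits, (n_q, n_k)
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                            # weighted sum of values
```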
The objective of the project is to generate an abstractive summary of a longer article. The process covers all preprocessing steps and summarization of the whole article, which helps capture the important context of a long article.
Using Bayesian optimization via the Ax platform with the SAASBO model to simultaneously optimize 23 hyperparameters in 100 iterations (setting a new Matbench benchmark).
Implementation of Transformer, BERT, and GPT models in both TensorFlow 2.0 and PyTorch.
Implementation of a Transformer pointer-critic deep reinforcement learning algorithm
[TPAMI 2023 ESI Highly Cited Paper] SePiCo: Semantic-Guided Pixel Contrast for Domain Adaptive Semantic Segmentation https://arxiv.org/abs/2204.08808
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch