Extending/passing extra source tokens to a seq2seq encoder (PyTorch)
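A minimal sketch of one way to do this in PyTorch, assuming the extra tokens are categorical features aligned one-to-one with the source sequence; the class and parameter names are hypothetical, not taken from the repo:

```python
import torch
import torch.nn as nn

class EncoderWithExtraTokens(nn.Module):
    """Seq2seq encoder whose input embeddings are augmented with
    embeddings of extra per-position source tokens (e.g. POS tags)."""

    def __init__(self, vocab_size, extra_vocab_size, word_dim=256,
                 extra_dim=32, hidden_dim=512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.extra_emb = nn.Embedding(extra_vocab_size, extra_dim)
        # The LSTM consumes the concatenation of both embeddings.
        self.rnn = nn.LSTM(word_dim + extra_dim, hidden_dim,
                           batch_first=True, bidirectional=True)

    def forward(self, src_tokens, extra_tokens):
        # src_tokens, extra_tokens: (batch, src_len), same length
        x = torch.cat([self.word_emb(src_tokens),
                       self.extra_emb(extra_tokens)], dim=-1)
        outputs, state = self.rnn(x)
        return outputs, state

enc = EncoderWithExtraTokens(vocab_size=10000, extra_vocab_size=20)
src = torch.randint(0, 10000, (4, 15))
extra = torch.randint(0, 20, (4, 15))
outputs, _ = enc(src, extra)
print(outputs.shape)  # torch.Size([4, 15, 1024])
```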
TensorFlow implementation of "Pointer Networks"
Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"
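As background, a minimal PyTorch sketch of the channel-attention half of CBAM as described in the paper (average- and max-pooled descriptors passed through a shared MLP, summed, then used as a sigmoid gate); this is a simplification for illustration, not the repository's exact code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """CBAM-style channel attention: squeeze the spatial dims with both
    average and max pooling, run each descriptor through a shared MLP,
    and gate the channels with a sigmoid of the summed scores."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        # x: (batch, channels, H, W)
        avg = self.mlp(F.adaptive_avg_pool2d(x, 1).flatten(1))
        mx = self.mlp(F.adaptive_max_pool2d(x, 1).flatten(1))
        scale = torch.sigmoid(avg + mx).unsqueeze(-1).unsqueeze(-1)
        return x * scale

ca = ChannelAttention(channels=64)
out = ca(torch.randn(2, 64, 32, 32))
print(out.shape)  # torch.Size([2, 64, 32, 32])
```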
Major technology trends driving Deep Learning - Andrew Ng
A collection of TensorFlow implementations of different types of attention mechanisms for text-related tasks.
Question-answering bot created using the Gunthercox corpus
Fake news detection on the LIAR-PLUS dataset using both traditional machine learning and deep learning techniques. The deep learning models include a plain LSTM network and a contextual-attention network (with justification).
Repository containing the code for my bachelor's thesis on Neural Machine Translation
We address the task of learning contextualized word, sentence, and document representations with a hierarchical language model: Transformer-based encoders are stacked first at the sentence level and then at the document level, and the model is trained with masked token prediction.
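A minimal PyTorch sketch of the stacking idea; the mean pooling and dimensions are illustrative assumptions rather than the paper's exact configuration:

```python
import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    """Encode each sentence with a token-level Transformer, pool a
    sentence vector, then contextualize the sentence vectors with a
    second, document-level Transformer."""

    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, d_model)
        sent_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.sent_encoder = nn.TransformerEncoder(sent_layer, num_layers)
        doc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.doc_encoder = nn.TransformerEncoder(doc_layer, num_layers)

    def forward(self, tokens):
        # tokens: (batch, n_sents, sent_len)
        b, n, s = tokens.shape
        x = self.emb(tokens.view(b * n, s))       # token embeddings
        x = self.sent_encoder(x)                   # sentence-level pass
        sent_vecs = x.mean(dim=1).view(b, n, -1)   # pool each sentence
        return self.doc_encoder(sent_vecs)         # document-level pass

enc = HierarchicalEncoder(vocab_size=30000)
doc_reprs = enc(torch.randint(0, 30000, (2, 8, 20)))
print(doc_reprs.shape)  # torch.Size([2, 8, 256])
```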
mscoco-face-cap: Emotion-driven Image Captioning
A collection of code for training image captioning models with PyTorch. The model architectures are based on the Show, Attend and Tell paper, with different implementations of the attention component.
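For context, a minimal sketch of the soft additive attention from Show, Attend and Tell (Xu et al., 2015), where the decoder state scores each spatial image feature and the features are averaged under the resulting weights; a simplification, not this repository's code:

```python
import torch
import torch.nn as nn

class SoftVisualAttention(nn.Module):
    """Additive attention over spatial CNN features, as in
    'Show, Attend and Tell'."""

    def __init__(self, feat_dim, dec_dim, attn_dim=256):
        super().__init__()
        self.feat_proj = nn.Linear(feat_dim, attn_dim)
        self.dec_proj = nn.Linear(dec_dim, attn_dim)
        self.score = nn.Linear(attn_dim, 1)

    def forward(self, feats, dec_state):
        # feats: (batch, n_regions, feat_dim); dec_state: (batch, dec_dim)
        e = self.score(torch.tanh(
            self.feat_proj(feats) + self.dec_proj(dec_state).unsqueeze(1)
        )).squeeze(-1)                      # (batch, n_regions)
        alpha = torch.softmax(e, dim=-1)    # attention weights
        context = (alpha.unsqueeze(-1) * feats).sum(dim=1)
        return context, alpha

attn = SoftVisualAttention(feat_dim=512, dec_dim=512)
ctx, alpha = attn(torch.randn(2, 196, 512), torch.randn(2, 512))
print(ctx.shape, alpha.shape)  # torch.Size([2, 512]) torch.Size([2, 196])
```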
This repository explains the attention mechanism for classification tasks, using Recognizing Textual Entailment, a Natural Language Inference task, as the running example.
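As a rough illustration of attention in this setting, a minimal PyTorch sketch in which a hypothesis summary attends over the premise's LSTM states before classification; every name and design choice here is an assumption, not this repository's implementation:

```python
import torch
import torch.nn as nn

class AttentiveNLIClassifier(nn.Module):
    """Encode premise and hypothesis with a shared LSTM, let the
    hypothesis summary attend over premise states, and classify the
    pair (entailment / contradiction / neutral)."""

    def __init__(self, vocab_size, emb_dim=128, hidden=256, n_classes=3):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.out = nn.Linear(2 * hidden, n_classes)

    def forward(self, premise, hypothesis):
        p, _ = self.lstm(self.emb(premise))             # (b, p_len, h)
        _, (h_n, _) = self.lstm(self.emb(hypothesis))
        q = h_n[-1]                                      # hypothesis summary
        scores = torch.bmm(p, q.unsqueeze(-1)).squeeze(-1)  # dot-product
        alpha = torch.softmax(scores, dim=-1)
        context = torch.bmm(alpha.unsqueeze(1), p).squeeze(1)
        return self.out(torch.cat([context, q], dim=-1))

model = AttentiveNLIClassifier(vocab_size=20000)
logits = model(torch.randint(0, 20000, (4, 30)),
               torch.randint(0, 20000, (4, 12)))
print(logits.shape)  # torch.Size([4, 3])
```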
Offered by deeplearning.ai via Coursera. The course is taught by Younes Bensouda Mourri, Łukasz Kaiser, and Eddy Shyu.
Implementations of various channel-wise attention modules
Reproducibility Challenge 2020 papers
Vision Transformer trained for cats-vs-dogs classification
Attention-based video classifier running on accelerated attention approximations
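One well-known accelerated approximation, shown purely as an example (the repository may use a different one), is linearized attention with the elu(x)+1 feature map of Katharopoulos et al. (2020), which avoids the full softmax and runs in time linear in sequence length:

```python
import torch

def linear_attention(q, k, v, eps=1e-6):
    """O(N) attention approximation: softmax(QK^T)V is replaced by
    phi(Q) (phi(K)^T V), normalized per query position."""
    phi = lambda x: torch.nn.functional.elu(x) + 1
    q, k = phi(q), phi(k)
    kv = torch.einsum('bnd,bne->bde', k, v)              # (b, d, e)
    z = 1 / (torch.einsum('bnd,bd->bn', q, k.sum(1)) + eps)
    return torch.einsum('bnd,bde,bn->bne', q, kv, z)

out = linear_attention(torch.randn(2, 100, 64),
                       torch.randn(2, 100, 64),
                       torch.randn(2, 100, 64))
print(out.shape)  # torch.Size([2, 100, 64])
```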