Extending/passing extra source tokens to a seq2seq encoder (PyTorch)
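The idea in the repository above, feeding extra source tokens (e.g., domain or style tags) into a seq2seq encoder alongside the regular input, can be sketched in a few lines of PyTorch. Everything below (the `ExtraTokenEncoder` class, the GRU encoder, all dimensions) is an illustrative assumption, not the repository's actual code:

```python
import torch
import torch.nn as nn

class ExtraTokenEncoder(nn.Module):
    """Hypothetical seq2seq encoder that prepends extra source tokens
    (e.g., domain or style tags) to the input before encoding."""

    def __init__(self, vocab_size, extra_vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, emb_dim)
        self.extra_emb = nn.Embedding(extra_vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, src, extra):
        # src:   (batch, src_len)  regular source token ids
        # extra: (batch, n_extra)  extra token ids passed to the encoder
        emb = torch.cat([self.extra_emb(extra), self.tok_emb(src)], dim=1)
        outputs, hidden = self.rnn(emb)  # (batch, n_extra + src_len, 2 * hid_dim)
        return outputs, hidden

# Usage: prepend two extra tokens to a batch of four source sentences.
enc = ExtraTokenEncoder(vocab_size=10000, extra_vocab_size=8)
src = torch.randint(0, 10000, (4, 20))
extra = torch.randint(0, 8, (4, 2))
outputs, hidden = enc(src, extra)
```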
Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"
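CBAM refines a CNN feature map with channel attention followed by spatial attention. A minimal sketch of the channel half, following the paper's description (the reduction ratio and layer names here are assumptions; see the official repository for the real modules):

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """CBAM-style channel attention: a shared MLP scores average- and
    max-pooled channel descriptors; the summed scores gate the channels."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):                   # x: (B, C, H, W)
        avg = self.mlp(x.mean(dim=(2, 3)))  # (B, C) from average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))   # (B, C) from max pooling
        scale = torch.sigmoid(avg + mx).unsqueeze(-1).unsqueeze(-1)
        return x * scale                    # channel-reweighted features
```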
Major technology trends driving Deep Learning - Andrew Ng
A collection of TensorFlow implementations of different types of attention mechanisms for text-related tasks.
A question-answer bot created using the Gunthercox corpus.
Fake news detection on the LIAR-PLUS dataset using both traditional machine learning and deep learning techniques; the deep learning models use a plain LSTM network as well as contextual attention incorporating the justification text.
Implementations of attention models.
mscoco-face-cap: Emotion-driven Image Captioning
A collection of code for training image-captioning models with PyTorch. The model architectures are based on the Show, Attend and Tell paper, with different implementations of the attention component, sketched below.
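The soft attention in Show, Attend and Tell scores each spatial image feature against the decoder's hidden state with a small additive network. A minimal sketch of that component (dimension names and sizes are illustrative assumptions, not the repository's code):

```python
import torch
import torch.nn as nn

class SoftAttention(nn.Module):
    """Additive (Bahdanau-style) attention over spatial image features,
    as used in Show, Attend and Tell captioning decoders."""

    def __init__(self, feat_dim, hid_dim, attn_dim=512):
        super().__init__()
        self.feat_proj = nn.Linear(feat_dim, attn_dim)
        self.hid_proj = nn.Linear(hid_dim, attn_dim)
        self.score = nn.Linear(attn_dim, 1)

    def forward(self, feats, hidden):
        # feats:  (batch, num_regions, feat_dim)  flattened CNN feature map
        # hidden: (batch, hid_dim)                decoder hidden state
        energy = torch.tanh(self.feat_proj(feats) + self.hid_proj(hidden).unsqueeze(1))
        alpha = torch.softmax(self.score(energy).squeeze(-1), dim=1)  # (batch, num_regions)
        context = (alpha.unsqueeze(-1) * feats).sum(dim=1)            # (batch, feat_dim)
        return context, alpha
```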
This repository explains the attention mechanism for classification tasks, using Recognizing Textual Entailment, a natural language inference task, as the running example.
Offered by deeplearning.ai via Coursera. The course is taught by Younes Bensouda Mourri, Łukasz Kaiser, and Eddy Shyu.
Reproducibility Challenge 2020 papers
A Vision Transformer trained for cats-vs-dogs classification.
Assessing the writing skill of a document or author by classifying its statements and sentences into different classes via sequential learning with pre-trained BERT models, then rating the document from the per-sentence class scores.
Official implementation of the paper "Disagreement attention: Let us agree to disagree on computed tomography segmentation".
"Attention Is All You Need" with PyTorch.
Assorted PyTorch review repo with code covering the basics, simple regression, classification, vision, and transformer models.
A Comprehensive Implementation of Transformers Architecture from Scratch
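Common to the Transformer repositories above is scaled dot-product attention, softmax(QKᵀ/√d_k)V. A minimal from-scratch sketch of that single equation (the function name and shapes are illustrative):

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """softmax(Q K^T / sqrt(d_k)) V, the core building block of the Transformer."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # (..., q_len, k_len)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v, weights

# Usage: attend 4 queries over 6 key/value pairs of width 64, batch of 2.
q, k, v = torch.randn(2, 4, 64), torch.randn(2, 6, 64), torch.randn(2, 6, 64)
out, attn = scaled_dot_product_attention(q, k, v)  # out: (2, 4, 64)
```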