An implementation of (something like) the LongNet dilated attention paper
A seq2seq model enhanced with an attention mechanism
PyTorch implementation of Hierarchical Attention-Based Recurrent Highway Networks for Time Series Prediction https://arxiv.org/abs/1806.00685
Experiments in meta-learning visual attention in convolutional neural networks.
A PyTorch reimplementation of ViT (Vision Transformer)
An introductory NLP project
An implementation of DeepMind's AlphaFold2 for 3D protein structure prediction.
A list of terms to highlight that require absolute attention in Prolific surveys.
Project 3 of Term 1 in the Udacity Self Driving Car Nanodegree
An unofficial Torch implementation of J. Lu, C. Xiong, et al., Knowing when to Look: Adaptive Attention via a Visual Sentinel for Image Captioning, 2017 with deformable adaptive attention
A Baby Llama model
NER for Chinese electronic medical records. Uses doc2vec, self_attention, and multi_attention.
Simple from-scratch implementations of transformer-based models that match the state of the art.
Two staged approach to predict joint level damage from hand and feet radiographs
Repository for personal experiments
A feeble attempt to understand how GPTs work and to explain it to myself and others who might be interested.