A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites
Updated May 9, 2024
Fast inference engine for Transformer models
FLOPs counter for convolutional networks in the PyTorch framework
[NeurIPS 2021] "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up", Yifan Jiang, Shiyu Chang, Zhangyang Wang
solo-learn: a library of self-supervised methods for visual representation learning powered by PyTorch Lightning
Orchestrate swarms of agents from any framework (OpenAI, LangChain, etc.) for real-world workflow automation. Join our community: https://discord.gg/DbjBMJTSWD
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
A curated list of foundation models for vision and language tasks
💭 Aspect-Based-Sentiment-Analysis: Transformer & Explainable ML (TensorFlow)
[VLDB'22] Anomaly detection using Transformers, self-conditioning, and adversarial training
[BMVC 2022] You Only Need 90K Parameters to Adapt Light: A Lightweight Transformer for Image Enhancement and Exposure Correction. SOTA for low-light enhancement; runs in 0.004 seconds, so try it for pre-processing.
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
How to use our public wav2vec2 dimensional emotion model
Efficient Inference of Transformer models
The official code repo of "HTS-AT: A Hierarchical Token-Semantic Audio Transformer for Sound Classification and Detection"
Based on the Pytorch-Transformers library by HuggingFace. To be used as a starting point for employing Transformer models in text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa, and XLM models for text classification.
Faster alternative to Metal Performance Shaders
MinT: Minimal Transformer Library and Tutorials
Punctuation Restoration using Transformer Models for High- and Low-Resource Languages
Tk-Instruct is a Transformer model that is tuned to solve many NLP tasks by following instructions.
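Since nearly every entry above builds on the same core operation, a minimal NumPy sketch of scaled dot-product attention (the mechanism from "Attention Is All You Need") may help orient newcomers; the array shapes here are illustrative assumptions, not tied to any specific repo in the list:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention.

    q, k, v: arrays of shape (seq_len, d_model).
    Returns an array of shape (seq_len, d_model).
    """
    d_k = q.shape[-1]
    # Pairwise query-key similarities, scaled by sqrt(d_k)
    scores = q @ k.T / np.sqrt(d_k)            # (seq_len, seq_len)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted sum of the value vectors
    return weights @ v

# Toy self-attention example: 4 tokens with 8-dim embeddings
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

Production libraries listed above add multi-head projections, masking, and batching on top of this basic computation.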