An ultimately comprehensive paper list of Vision Transformer/Attention, including papers, codes, and related websites
Updated May 22, 2024
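Most of the repositories below build on the same core operation, scaled dot-product attention. A minimal NumPy sketch of that operation for context (the function name and toy shapes are illustrative, not from any listed repo):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Single-head attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                  # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # each row sums to 1
    return weights @ V                             # weighted sum of values

# Toy usage: 3 queries/keys/values of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```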
Flops counter for convolutional networks in the PyTorch framework
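For context on what such a counter measures, the FLOPs of a single convolutional layer can be estimated by hand. A rough back-of-the-envelope sketch (this is not the repo's API, just the standard counting convention of 2 FLOPs per multiply-add):

```python
def conv2d_flops(c_in, c_out, kernel, h_out, w_out, bias=True):
    """Rough FLOP count for one square-kernel Conv2d layer.

    Each output element needs c_in * kernel * kernel multiply-adds;
    a multiply-add counts as 2 FLOPs, plus one add per output for bias.
    """
    macs_per_output = c_in * kernel * kernel
    n_outputs = c_out * h_out * w_out
    flops = 2 * macs_per_output * n_outputs
    if bias:
        flops += n_outputs
    return flops

# First conv of a ResNet-50-style stem: 3 -> 64 channels,
# 7x7 kernel, 112x112 output map
print(conv2d_flops(3, 64, 7, 112, 112))  # 236830720
```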
Fast inference engine for Transformer models
[NeurIPS 2021] "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up", Yifan Jiang, Shiyu Chang, Zhangyang Wang
solo-learn: a library of self-supervised methods for visual representation learning powered by PyTorch Lightning
[VLDB'22] Anomaly Detection using Transformers, self-conditioning and adversarial training.
Based on the Pytorch-Transformers library by HuggingFace. To be used as a starting point for employing Transformer models in text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa, and XLM models for text classification.
Orchestrate swarms of agents from any framework, such as OpenAI, LangChain, etc., for business operation automation. Join our community: https://discord.gg/DbjBMJTSWD
💭 Aspect-Based-Sentiment-Analysis: Transformer & Explainable ML (TensorFlow)
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
Punctuation Restoration using Transformer Models for High- and Low-Resource Languages
The official code repo of "HTS-AT: A Hierarchical Token-Semantic Audio Transformer for Sound Classification and Detection"
Models to perform neural summarization (extractive and abstractive) using machine learning transformers and a tool to convert abstractive summarization datasets to the extractive task.
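Extractive summarization, in contrast to abstractive, selects existing sentences from the document rather than generating new text. A toy frequency-based scorer (illustrative only, not the method used by this repo) makes the distinction concrete:

```python
from collections import Counter

def extractive_summary(sentences, k=1):
    """Score each sentence by the average document-wide frequency of
    its words, then return the top-k sentences in original order."""
    all_words = [w.lower().strip(".,") for s in sentences for w in s.split()]
    freq = Counter(all_words)

    def score(s):
        tokens = [w.lower().strip(".,") for w in s.split()]
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = sorted(sentences, key=score, reverse=True)[:k]
    return [s for s in sentences if s in top]  # preserve document order

doc = [
    "Transformers dominate NLP benchmarks.",
    "The weather was pleasant.",
    "Transformers also reach strong NLP results in summarization.",
]
print(extractive_summary(doc, k=1))
# ['Transformers dominate NLP benchmarks.']
```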
How to use our public wav2vec2 dimensional emotion model
[BMVC 2022] You Only Need 90K Parameters to Adapt Light: A Lightweight Transformer for Image Enhancement and Exposure Correction. SOTA for low-light enhancement; runs in 0.004 seconds, so consider it for pre-processing.
The official code repo for "Zero-shot Audio Source Separation through Query-based Learning from Weakly-labeled Data", in AAAI 2022
Efficient Inference of Transformer models
A curated list of foundation models for vision and language tasks
Tk-Instruct is a Transformer model that is tuned to solve many NLP tasks by following instructions.
Stock price prediction using a Temporal Fusion Transformer