[CVPR 2022--Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Official Implementation of Energy Transformer in PyTorch for Mask Image Reconstruction
MetaFormer-based Global Contexts-aware Network for Efficient Semantic Segmentation (Accepted by WACV 2024)
This repository contains the official code for Energy Transformer---an efficient Energy-based Transformer variant for graph classification
Nonparametric Modern Hopfield Models
[ICLR 2022] "Unified Vision Transformer Compression" by Shixing Yu*, Tianlong Chen*, Jiayi Shen, Huan Yuan, Jianchao Tan, Sen Yang, Ji Liu, Zhangyang Wang
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
[MICCAI 2023] DAE-Former: Dual Attention-guided Efficient Transformer for Medical Image Segmentation
Gated Attention Unit (TensorFlow implementation)
[ICCV 2023] Efficient Video Action Detection with Token Dropout and Context Refinement
Implementation of the Transformer variant proposed in "Transformer Quality in Linear Time"
Demo code for CVPR2023 paper "Sparsifiner: Learning Sparse Instance-Dependent Attention for Efficient Vision Transformers"
[CVPR 2023] IMP: iterative matching and pose estimation with transformer-based recurrent module
[NeurIPS 2022 Spotlight] This is the official PyTorch implementation of "EcoFormer: Energy-Saving Attention with Linear Complexity"
Mask Transfiner for High-Quality Instance Segmentation, CVPR 2022
Official PyTorch implementation of our ECCV 2022 paper "Sliced Recursive Transformer"
Official PyTorch Implementation of Long-Short Transformer (NeurIPS 2021).
PyTorch implementation of LISA (Linear-Time Self-Attention with Codeword Histogram for Efficient Recommendation, WWW 2021)
Master's thesis with code investigating methods for adding long-context reasoning to low-resource languages without pre-training from scratch, by testing whether multilingual models can inherit these properties when converted into an efficient Transformer (e.g., the Longformer architecture).
Source code for the article on how to create a chatbot in Python: a chatbot that uses the Reformer, also known as the efficient Transformer, to generate dialogues between two bots.