
# DeiT-LT: Distillation Strikes Back for Vision Transformer Training on Long-Tailed Datasets

This repository contains the training and evaluation code, along with checkpoints, for the paper "DeiT-LT: Distillation Strikes Back for Vision Transformer Training on Long-Tailed Datasets", accepted at CVPR 2024.


DeiT-LT Teaser
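DeiT-LT trains the Vision Transformer student with distillation from a teacher network. As a rough illustrative sketch only (NumPy, function names are ours, not the repository's code), the DeiT recipe that DeiT-LT builds on uses *hard* distillation: a cross-entropy loss on the ground-truth labels for the CLS head, averaged with a cross-entropy loss on the teacher's argmax predictions for the distillation head:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, targets):
    # mean negative log-likelihood of integer class targets
    p = softmax(logits)
    n = logits.shape[0]
    return -np.log(p[np.arange(n), targets]).mean()

def hard_distillation_loss(cls_logits, dist_logits, labels, teacher_logits):
    """DeiT-style hard distillation (illustrative): the CLS head is
    supervised by the ground-truth labels, the distillation head by the
    teacher's hard (argmax) predictions; the two losses are averaged."""
    teacher_hard = teacher_logits.argmax(axis=-1)
    return 0.5 * cross_entropy(cls_logits, labels) \
         + 0.5 * cross_entropy(dist_logits, teacher_hard)
```

This is a simplified sketch of the loss structure, not the DeiT-LT training objective itself, which additionally addresses long-tailed class distributions.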

## Usage

1. Clone the repository.

   ```shell
   git clone https://github.com/val-iisc/DeiT-LT.git
   ```

2. Create the conda environment and activate it.

   ```shell
   conda env create -f environment.yml
   conda activate deitlt
   ```

3. The CIFAR-10 and CIFAR-100 datasets are downloaded automatically by the code. For ImageNet-LT and iNaturalist-2018, download the datasets from the given links.

4. Download the teacher models given in the Results table.

5. Run the training script for the desired dataset after adding the teacher checkpoint path to the script via the `--teacher-path` argument. For example:

   ```shell
   bash sh/train_c10_if100.sh
   bash sh/train_imagenetlt.sh
   ```

6. To evaluate a DeiT-LT checkpoint, run the eval script for the corresponding dataset. Ensure that the checkpoint path is provided to the script via the `--resume` argument. For example:

   ```shell
   bash sh/eval_c10.sh
   bash sh/eval_imagenetlt.sh
   ```

## Results

| Dataset | Imbalance Factor | Overall | Head | Mid | Tail | Teacher path | Student path |
|---|---|---|---|---|---|---|---|
| CIFAR 10-LT | 100 | 87.5 | 94.5 | 84.1 | 85.0 | Link | Link |
| CIFAR 10-LT | 50 | 89.8 | 94.9 | 87.0 | 88.6 | Link | Link |
| CIFAR 100-LT | 100 | 55.6 | 72.8 | 55.4 | 31.4 | Link | Link |
| CIFAR 100-LT | 50 | 60.5 | 74.8 | 60.3 | 43.1 | Link | Link |
| ImageNet-LT | - | 59.1 | 66.6 | 58.3 | 40.0 | Link | Link |
| iNaturalist-2018 | - | 75.1 | 70.3 | 75.2 | 76.2 | Link | Link |

## Acknowledgement

This codebase is heavily inspired by DeiT (ICML 2021). The concepts and methodologies adopted from DeiT have been instrumental in enabling us to push the boundaries of our research and development. We extend our sincerest thanks to the developers and contributors of DeiT.

## License

DeiT-LT is an open-source project released under the MIT license. The codebase is derived from that of DeiT (ICML 2021), which is released under the Apache 2.0 license.

## BibTeX

If you find this code or idea useful, please consider citing our work:

```bibtex
@inproceedings{rangwani2024deit-lt,
  author    = {Rangwani, Harsh and Mondal, Pradipto and Mishra, Mayank and Ramayee Asokan, Ashish and Babu, R Venkatesh},
  title     = {DeiT-LT: Distillation Strikes Back for Vision Transformer Training on Long-Tailed Datasets},
  booktitle = {CVPR},
  year      = {2024},
}
```
