Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class-Incremental Learning

Authors: Yibo Yang, Haobo Yuan, Xiangtai Li, Zhouchen Lin, Philip Torr, Dacheng Tao

Accepted to ICLR 2023 (top 25%), Kigali, Rwanda.

[PDF] [CODE]

News: We have made a substantial extension based on this work; see the new paper, and the code in this repo.

Environment

You do not need to set up the environment yourself; all you need is to start a Docker container. The Docker image has already been published online.

DATALOC={YOUR DATA LOCATION} LOGLOC={YOUR LOG LOCATION} bash tools/docker.sh
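
For example, assuming your datasets live under /data/fscil and you want the logs written to /data/fscil_logs (both paths are hypothetical; substitute your own), the invocation would be:

DATALOC=/data/fscil LOGLOC=/data/fscil_logs bash tools/docker.sh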

If you want to build the image yourself (otherwise skip this step), run:

docker build -t harbory/openmmlab:2206 --network=host .

Data Preparation

You do not need to prepare the CIFAR dataset, since it is handled automatically by torch.

For the other datasets, please refer to the hub (Link). Note that the Mini-ImageNet dataset exists in several versions; here we follow CEC, which is widely adopted in FSCIL. Please keep in mind that the usage of the datasets is governed by their corresponding agreements; the data shared here is for research purposes only.

Please put the datasets into the {YOUR DATA LOCATION} directory you provided above.
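
As a rough sketch only (the folder names below are hypothetical; the authoritative layout is whatever the dataset configs under configs/ expect), {YOUR DATA LOCATION} might end up looking like:

{YOUR DATA LOCATION}/
├── cifar/            # handled automatically by torch, no manual download needed
├── miniimagenet/     # CEC-style Mini-ImageNet images and split files
└── CUB_200_2011/     # CUB-200-2011 images and split files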

Getting Started

Let's run the code. 🏃‍♀️

[Update 🙋‍♀️] We have tested the training scripts after the release; please refer to the logs.

CIFAR

Run:

bash tools/dist_train.sh configs/cifar/resnet12_etf_bs512_200e_cifar.py 8 --work-dir /opt/logger/cifar_etf --seed 0 --deterministic && bash tools/run_fscil.sh configs/cifar/resnet12_etf_bs512_200e_cifar_eval.py /opt/logger/cifar_etf /opt/logger/cifar_etf/best.pth 8 --seed 0 --deterministic

| Session  | 0     | 1     | 2     | 3     | 4     | 5     | 6     | 7     | 8     |
| -------- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- |
| NC-FSCIL | 82.52 | 76.82 | 73.34 | 69.68 | 66.19 | 62.85 | 60.96 | 59.02 | 56.11 |

[Base Log] [Incremental Log]
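
If the base-session model has already been trained, the incremental evaluation stage (the second half of the command above) can also be rerun on its own against the saved checkpoint:

bash tools/run_fscil.sh configs/cifar/resnet12_etf_bs512_200e_cifar_eval.py /opt/logger/cifar_etf /opt/logger/cifar_etf/best.pth 8 --seed 0 --deterministic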

Mini-ImageNet

Run:

bash tools/dist_train.sh configs/mini_imagenet/resnet12_etf_bs512_500e_miniimagenet.py 8 --work-dir /opt/logger/m_imagenet_etf --seed 0 --deterministic && bash tools/run_fscil.sh configs/mini_imagenet/resnet12_etf_bs512_500e_miniimagenet_eval.py /opt/logger/m_imagenet_etf /opt/logger/m_imagenet_etf/best.pth 8 --seed 0 --deterministic

| Session  | 0     | 1     | 2     | 3     | 4     | 5     | 6     | 7     | 8     |
| -------- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- |
| NC-FSCIL | 84.02 | 76.80 | 72.00 | 67.83 | 66.35 | 64.04 | 61.46 | 59.54 | 58.31 |

[Base Log] [Incremental Log]

CUB

Run:

bash tools/dist_train.sh configs/cub/resnet18_etf_bs512_80e_cub.py 8 --work-dir /opt/logger/cub_etf --seed 0 --deterministic && bash tools/run_fscil.sh configs/cub/resnet18_etf_bs512_80e_cub_eval.py /opt/logger/cub_etf /opt/logger/cub_etf/best.pth 8 --seed 0 --deterministic

| Session  | 0     | 1     | 2     | 3     | 4     | 5     | 6     | 7     | 8     | 9     | 10    |
| -------- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- | ----- |
| NC-FSCIL | 80.45 | 75.98 | 72.30 | 70.28 | 68.17 | 65.16 | 64.43 | 63.25 | 60.66 | 60.01 | 59.44 |

[Base Log] [Incremental Log]

Citation

If you find the code useful in your research, please consider citing:

@inproceedings{yang2023neural,
  title = {Neural Collapse Inspired Feature-Classifier Alignment for Few-Shot Class-Incremental Learning},
  author = {Yang, Yibo and Yuan, Haobo and Li, Xiangtai and Lin, Zhouchen and Torr, Philip and Tao, Dacheng},
  booktitle = {ICLR},
  year = {2023},
}
