Multi-Granularity Regularized Re-Balancing for Class Incremental Learning

Paper: https://ieeexplore.ieee.org/document/9815145

Brief Introduction

Data imbalance between old and new classes is a key issue that leads to performance degradation of the model in incremental learning. In this study, we propose an assumption-agnostic method, Multi-Granularity Regularized re-Balancing (MGRB), to address this problem. Re-balancing methods are used to alleviate the influence of data imbalance; however, we empirically discover that they would under-fit new classes. To this end, we further design a novel multi-granularity regularization term that enables the model to consider the correlations of classes in addition to re-balancing the data. A class hierarchy is first constructed by ontology, grouping semantically or visually similar classes. The multi-granularity regularization then transforms the one-hot label vector into a continuous label distribution, which reflects the relations between the target class and other classes based on the constructed class hierarchy. Thus, the model can learn the inter-class relational information, which helps enhance the learning of both old and new classes. Experimental results on both public datasets and a real-world fault diagnosis dataset verify the effectiveness of the proposed method.

(Figure model_v9: structure of the baseline method (left) and overview of the proposed MGRB method (right))

The figure on the left shows the structure of the baseline method, and the figure on the right is the overview of our MGRB method. The proposed model has two significant parts compared with the baseline: (1) re-balancing modeling. We use the re-balancing strategies during training to alleviate the influence of data imbalance; and (2) multi-granularity regularization. A multi-granularity regularization term is designed to make the model consider class correlations. Through end-to-end learning, both old and new classes can be better learned.
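To make the two components concrete, the following minimal PyTorch sketch shows one way they could be combined in a single loss. It is illustrative only, not the repository's exact implementation: the inverse-frequency re-balancing weights, the tree-distance matrix `tree_dist`, the temperature `tau`, and the weighting factor `lambda_mg` are assumptions introduced here for illustration.

```python
import torch
import torch.nn.functional as F


def soft_labels(targets, tree_dist, tau=1.0):
    """Map hard targets to hierarchy-aware label distributions.

    tree_dist[i, j] is a distance between classes i and j in the class
    hierarchy (0 on the diagonal); closer classes receive more label mass.
    """
    d = tree_dist[targets]              # (batch, num_classes) distance rows
    return F.softmax(-d / tau, dim=1)   # smaller distance -> larger probability


def mgrb_style_loss(logits, targets, class_counts, tree_dist,
                    tau=1.0, lambda_mg=0.5):
    """Re-balanced cross-entropy plus a multi-granularity term (sketch only)."""
    # Re-balancing: weight each class by its inverse frequency
    # (the paper's actual re-balancing strategy may differ).
    weights = class_counts.sum() / (len(class_counts) * class_counts.float())
    ce = F.cross_entropy(logits, targets, weight=weights)

    # Multi-granularity regularization: pull predictions toward the
    # hierarchy-aware soft label distribution.
    q = soft_labels(targets, tree_dist, tau)
    log_p = F.log_softmax(logits, dim=1)
    mg = F.kl_div(log_p, q, reduction="batchmean")

    return ce + lambda_mg * mg


if __name__ == "__main__":
    # Toy shapes: logits (B, C), targets (B,), class_counts (C,), tree_dist (C, C)
    B, C = 4, 6
    logits = torch.randn(B, C)
    targets = torch.randint(0, C, (B,))
    class_counts = torch.tensor([500, 500, 500, 20, 20, 20])
    tree_dist = torch.randint(1, 4, (C, C)).float()
    tree_dist = (tree_dist + tree_dist.T) * (1 - torch.eye(C))  # symmetric, zero diagonal
    print(mgrb_style_loss(logits, targets, class_counts, tree_dist))
```

In the released code, the re-balancing weights and the regularization term are produced by `cal_RB.py` and `cal_MG.py`, respectively (see the file organization below); the sketch above only mirrors their roles.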

Environment

  • Python 3.7
  • PyTorch 1.8.1
  • CUDA 11.2

Requirements

  • See requirements.txt for environment.
  • The pre-trained word vector library can be found here. You can download it or download the library used in this paper here.
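As a rough illustration of how a semantic class hierarchy could be derived from such pre-trained word vectors, the sketch below loads GloVe-style vectors from a plain-text file and groups class names by agglomerative clustering. The file name, class list, distance metric, and cluster counts are assumptions for illustration, not the repository's exact procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster


def load_vectors(path, vocab):
    """Read GloVe-style 'word v1 v2 ...' lines for the words we need."""
    vecs = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            if parts[0] in vocab:
                vecs[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vecs


# Hypothetical class names and vector file; substitute your own.
class_names = ["apple", "pear", "tiger", "lion", "bus", "train"]
vectors = load_vectors("glove.6B.300d.txt", set(class_names))
X = np.stack([vectors[name] for name in class_names])

# Agglomerative clustering on cosine distance; cutting the dendrogram at
# different heights gives coarse-to-fine granularity levels of the hierarchy.
Z = linkage(X, method="average", metric="cosine")
coarse = fcluster(Z, t=2, criterion="maxclust")   # e.g. 2 coarse-grained groups
fine = fcluster(Z, t=4, criterion="maxclust")     # e.g. 4 finer-grained groups
print(dict(zip(class_names, zip(coarse, fine))))
```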

Results

The experimental results on CIFAR100 are as follows; for each setting, the last accuracy and the average accuracy (Avg acc) are reported in %:

| Methods | CIFAR100-10/10 (last / Avg acc) | CIFAR100-20/20 (last / Avg acc) | CIFAR100-50/5 (last / Avg acc) | CIFAR100-50/10 (last / Avg acc) |
| --- | --- | --- | --- | --- |
| LwF | 28.18 / 49.33 | 39.49 / 57.49 | 33.75 / 46.30 | 39.28 / 52.19 |
| iCaRL | 45.72 / 59.96 | 50.33 / 63.23 | 43.97 / 52.93 | 46.71 / 57.06 |
| LUCIR | 41.86 / 56.57 | 50.15 / 62.64 | 50.12 / 60.37 | 50.55 / 61.95 |
| Mnemonics | 42.42 / 58.53 | 48.95 / 63.18 | 50.79 / 60.43 | 53.58 / 63.12 |
| BiC | 47.30 / 57.56 | 40.90 / 48.33 | 41.63 / 54.11 | 50.88 / 58.35 |
| Ours-CNN(ont) | 45.81 / 61.02 | 56.03 / 66.45 | 52.59 / 61.50 | 58.20 / 65.45 |
| Ours-CNN(vis) | 47.72 / 62.38 | 57.00 / 68.27 | 52.69 / 61.51 | 58.63 / 65.69 |
| PODNet | 39.50 / 54.14 | 50.10 / 62.94 | 53.70 / 62.35 | 55.30 / 63.75 |
| Ours-PODNet | 40.70 / 54.61 | 50.70 / 63.44 | 53.90 / 62.85 | 55.10 / 64.32 |
| AANets-iCaRL | 45.32 / 61.40 | 51.75 / 65.40 | 47.91 / 59.79 | 50.24 / 61.71 |
| Ours-AANets | 45.63 / 63.02 | 49.42 / 65.72 | 47.33 / 59.93 | 50.26 / 62.19 |
| DER | 57.25 / 65.31 | 62.66 / 70.51 | 66.63 / 72.81 | 66.62 / 73.07 |
| Ours-DER | 58.03 / 65.42 | 62.94 / 70.74 | 66.61 / 73.06 | 66.82 / 73.27 |

File organization

The demo folder provides a demo version of the proposed multi-granularity regularization term, which can be easily added to PODNet, AANets, and DER (see the usage sketch after the file tree below).

The code folder provides the complete code for MGRB.

├── data                    # dataset and nodepathinfo
├── demo
│   ├── save                # data preprocessing file
│   ├── parser.py           # add parameters
│   ├── loss.py             # return loss
│   ├── cal_MG.py           # calculate the multi-granularity regularization term
│   ├── util.py
│   └── ...
└── code
    ├── save                # data preprocessing file
    ├── divide___.py        # data split
    ├── Load___.py          # data loader
    ├── etrain.py           # train and test
    ├── model.py            # network
    ├── cal_MG.py           # calculate the multi-granularity regularization term
    ├── cal_RB.py           # calculate the weight for class re-balancing loss
    └── util.py
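
As noted above, the demo version of the regularization term is meant to be dropped into existing methods such as PODNet, AANets, and DER. The sketch below shows how such a term might be added to a host method's training step; the import and the function `calculate_mg_term` with its signature are hypothetical, since the actual interface in `cal_MG.py` may differ.

```python
import torch.nn.functional as F

# Hypothetical import: the real module and function names in demo/ may differ.
from cal_MG import calculate_mg_term

LAMBDA_MG = 0.5  # weight of the regularization term (assumed value)


def training_step(model, images, targets, hierarchy, optimizer):
    """One optimization step of a host method with the MG term added."""
    logits = model(images)
    base_loss = F.cross_entropy(logits, targets)             # host method's own loss
    mg_loss = calculate_mg_term(logits, targets, hierarchy)  # multi-granularity term
    loss = base_loss + LAMBDA_MG * mg_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```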

Citation

If you use this paper/code in your research, please consider citing us:

H. Chen, Y. Wang and Q. Hu, "Multi-Granularity Regularized Re-Balancing for Class Incremental Learning," in IEEE Transactions on Knowledge and Data Engineering, 2022, doi: 10.1109/TKDE.2022.3188335.
@ARTICLE{9815145,
  author={Chen, Huitong and Wang, Yu and Hu, Qinghua},
  journal={IEEE Transactions on Knowledge and Data Engineering}, 
  title={Multi-Granularity Regularized Re-Balancing for Class Incremental Learning}, 
  year={2022},
  volume={},
  number={},
  pages={1-15},
  doi={10.1109/TKDE.2022.3188335}}
