
Deep Active Learning with PyTorch

An implementation of state-of-the-art deep active learning algorithms. This code builds on Jordan Ash's repository.

Dependencies

To run this code fully, you'll need PyTorch (we use version 1.4.0) and scikit-learn. We run our code with Python 3.7.

Algorithms Implemented

Deep Active Learning Strategies

| Sampling Strategy | Year | Done |
| --- | --- | --- |
| Random Sampling | - | x |
| ClusterMargin [1] | arXiv'21 | x |
| WAAL [2] | AISTATS'20 | x |
| BADGE [3] | ICLR'20 | x |
| Adversarial Sampling for Active Learning [4] | WACV'20 | x |
| Learning Loss for Active Learning [5] | CVPR'19 | x |
| Variational Adversarial Active Learning [6] | ICCV'19 | x |
| BatchBALD [7] | NIPS'19 | x |
| K-Means Sampling [8] | ICLR'18 | x |
| K-Centers Greedy [8] | ICLR'18 | x |
| Core-Set [8] | ICLR'18 | x |
| Adversarial - DeepFool [9] | arXiv'18 | x |
| Uncertainty Ensembles [10] | NIPS'17 | x |
| Uncertainty Sampling with Dropout Estimation [11] | ICML'17 | x |
| Bayesian Active Learning Disagreement [11] | ICML'17 | x |
| Least Confidence [12] | IJCNN'14 | x |
| Margin Sampling [12] | IJCNN'14 | x |
| Entropy Sampling [12] | IJCNN'14 | x |
| UncertainGCN Sampling [13] | CVPR'21 | x |
| CoreGCN Sampling [13] | CVPR'21 | x |
| Ensemble [14] | CVPR'18 | x |
| MCDAL [15] | Knowledge-Based Systems'19 | x |
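For orientation, here is a minimal sketch of the three classical uncertainty scores from [12] (least confidence, margin, entropy) as they could be computed with PyTorch. The function name, data loader, and query budget below are illustrative assumptions, not the interfaces used in this repository.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def uncertainty_scores(model, unlabeled_loader, device="cuda", mode="least_confidence"):
    """Score unlabeled samples; a higher score means a more informative sample."""
    model.eval()
    scores = []
    for x, _ in unlabeled_loader:
        probs = F.softmax(model(x.to(device)), dim=1)
        if mode == "least_confidence":
            s = 1.0 - probs.max(dim=1).values              # 1 - max class probability
        elif mode == "margin":
            top2 = probs.topk(2, dim=1).values
            s = -(top2[:, 0] - top2[:, 1])                 # small top-2 gap -> uncertain
        else:                                              # "entropy"
            s = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
        scores.append(s.cpu())
    return torch.cat(scores)

# Query the most uncertain samples given a (hypothetical) per-round budget:
# query_idx = uncertainty_scores(model, unlabeled_loader, mode="entropy").topk(1000).indices
```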

Deep Active Learning + Semi-Supervised Learning

| Sampling Strategy | Year | Done |
| --- | --- | --- |
| Consistency-SSLAL [16] | ECCV'20 | x |
| MixMatch-SSLAL [17] | arXiv | x |
| UDA [18] | NIPS'20 | In progress |
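As a rough illustration of the consistency-based idea behind [16], the acquisition score can be taken as the disagreement of the model's predictions across several random augmentations of the same sample. The `augment` function below is an assumed, user-supplied stochastic augmentation; this sketch is not the repository's implementation.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def consistency_scores(model, x_unlabeled, augment, n_views=5, device="cuda"):
    """Per-sample variance of softmax outputs over n_views random augmentations.
    High variance = inconsistent predictions = a good candidate to label."""
    model.eval()
    views = torch.stack([
        F.softmax(model(augment(x_unlabeled).to(device)), dim=1)
        for _ in range(n_views)
    ])                                   # shape: (n_views, batch, n_classes)
    return views.var(dim=0).sum(dim=1).cpu()
```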

Running an experiment

Requirements

First, please make sure Conda is installed. Then the environment can be set up with:

conda create -n DAL python=3.7
conda activate DAL
pip install -r requirements.txt

Example

python main.py --model ResNet18  --dataset cifar10 --strategy LeastConfidence

This runs an active learning experiment with ResNet18 on CIFAR-10, querying according to the LeastConfidence algorithm. The results are saved in the ./save directory.

You can also use run.sh to run experiments.
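For readers new to the setup, the sketch below shows the round-based loop that such an experiment follows (train on the labeled pool, score the unlabeled pool, query a batch, repeat). It uses scikit-learn and synthetic data purely for illustration and is not the code in main.py.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, n_informative=10, random_state=0)
rng = np.random.default_rng(0)
labeled = set(rng.choice(len(X), size=100, replace=False).tolist())  # initial labeled pool
budget, n_rounds = 100, 5

for rnd in range(n_rounds):
    idx = sorted(labeled)
    clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])      # retrain on labeled pool

    unlabeled = np.array(sorted(set(range(len(X))) - labeled))
    probs = clf.predict_proba(X[unlabeled])
    scores = 1.0 - probs.max(axis=1)                                 # least-confidence score

    picked = unlabeled[np.argsort(-scores)[:budget]]                 # query the most uncertain
    labeled.update(picked.tolist())                                  # the oracle "labels" them

    print(f"round {rnd}: labeled={len(labeled)}, acc={clf.score(X, y):.3f}")
```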

Self-supervised features of the data

You can download the features/feature_model from here

Contact

If you have any questions or suggestions, or would like to contribute to this repo, please feel free to contact Yu Li (yuli@cse.cuhk.edu.hk), Muxi Chen (mxchen21@cse.cuhk.edu.hk), or Prof. Qiang Xu (qxu@cse.cuhk.edu.hk).

References

[1] (ClusterMargin, arXiv'21) Batch Active Learning at Scale

[2] (WAAL, AISTATS'20) Deep Active Learning: Unified and Principled Method for Query and Training paper code

[3] (BADGE, ICLR'20) Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds paper code

[4] (ASAL, WACV'20) Adversarial Sampling for Active Learning paper

[5] (CVPR'19) Learning Loss for Active Learning paper code

[6] (VAAL, ICCV'19) Variational Adversarial Active Learning paper code

[7] (BatchBALD, NIPS'19) BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning paper code

[8] (CORE-SET, ICLR'18) Active Learning for Convolutional Neural Networks: A Core-Set Approach paper code

[9] (DFAL, arXiv'18) Adversarial Active Learning for Deep Networks: a Margin Based Approach

[10] (NIPS'17) Simple and Scalable Predictive Uncertainty Estimation using Deep Ensembles paper code

[11] (DBAL, ICML'17) Deep Bayesian Active Learning with Image Data paper code

[12] (Least Confidence/Margin/Entropy, IJCNN'14) A New Active Labeling Method for Deep Learning

[13] (UncertainGCN, CoreGCN, CVPR'21) Sequential Graph Convolutional Network for Active Learning paper code

[14] (Ensemble, CVPR'18) The power of ensembles for active learning in image classification paper

[15] (Knowledge-based Systems'19) Multi-criteria active deep learning for image classification paper code

[16] (ECCV'20) Consistency-based semi-supervised active learning: Towards minimizing labeling cost paper

[17] (Google, arXiv) Combining MixMatch and Active Learning for Better Accuracy with Fewer Labels

[18] (Google, NIPS'20) Unsupervised Data Augmentation for Consistency Training
