LINe: Out-of-Distribution Detection by Leveraging Important Neurons

🦢 - Paper

This is the official source code for LINe: Out-of-Distribution Detection by Leveraging Important Neurons, published at the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2023.

Last update: 2023-06-13

We updated README.md to clarify the usage instructions.

Usage

1. Dataset Preparation for Large-scale Experiment

In-distribution dataset

Please download ImageNet-1k and place the training data and validation data in ./datasets/ILSVRC-2012/train and ./datasets/ILSVRC-2012/val, respectively.
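
A minimal sketch of the expected layout (the class sub-folder shown is only an illustration; the actual WordNet-ID folders come from the ImageNet archives):

mkdir -p ./datasets/ILSVRC-2012/train ./datasets/ILSVRC-2012/val
# expected structure after extraction, e.g.:
#   ./datasets/ILSVRC-2012/train/n01440764/*.JPEG
#   ./datasets/ILSVRC-2012/val/n01440764/*.JPEG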

Out-of-distribution dataset

We have curated 4 OOD datasets from iNaturalist, SUN, Places, and Textures, and de-duplicated concepts that overlap with ImageNet-1k.

For iNaturalist, SUN, and Places, we have sampled 10,000 images from the selected concepts for each dataset, which can be downloaded via the following links:

wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/iNaturalist.tar.gz
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/SUN.tar.gz
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/Places.tar.gz

For Textures, we use the entire dataset, which can be downloaded from their original website.

Please put all downloaded OOD datasets into ./datasets/.
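
For example, a minimal extraction sketch, assuming the three archives above were downloaded into ./datasets/ (the Textures/DTD download URL is an assumption based on the dataset's official page, not part of this repo):

cd datasets
tar -xvzf iNaturalist.tar.gz
tar -xvzf SUN.tar.gz
tar -xvzf Places.tar.gz
wget https://www.robots.ox.ac.uk/~vgg/data/dtd/download/dtd-r1.0.1.tar.gz  # Textures (DTD); URL is an assumption
tar -xvzf dtd-r1.0.1.tar.gz
cd ..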

2. Dataset Preparation for CIFAR Experiment

In-distribution dataset

The in-distribution CIFAR data will be downloaded automatically when the code is first run.

Out-of-distribution dataset

We provide links and instructions to download each dataset:

  • SVHN: download it and place it in the folder of ./datasets/ood_datasets/svhn. Then run python select_svhn_data.py to generate the test subset (see the sketch after the LSUN-C example below).
  • Textures: download it and place it in the folder of ./datasets/ood_datasets/dtd.
  • Places365: download it and place it in the folder of ./datasets/ood_datasets/places365/test_subset. We randomly sample 10,000 images from the original test dataset.
  • LSUN-C: download it and place it in the folder of ./datasets/ood_datasets/LSUN.
  • LSUN-R: download it and place it in the folder of ./datasets/ood_datasets/LSUN_resize.
  • iSUN: download it and place it in the folder of ./datasets/ood_datasets/iSUN.

For example, run the following commands in the root directory to download LSUN-C:

cd datasets/ood_datasets
wget https://www.dropbox.com/s/fhtsw1m3qxlwj6h/LSUN.tar.gz
tar -xvzf LSUN.tar.gz
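
A similar sketch for SVHN (the ufldl.stanford.edu URL is the standard SVHN host; it is an assumption here, not taken from this repo):

cd datasets/ood_datasets
mkdir -p svhn && cd svhn
wget http://ufldl.stanford.edu/housenumbers/test_32x32.mat  # assumed official SVHN cropped test split
cd ../../..  # back to the repository root (assumed run location for the script below)
python select_svhn_data.py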

3. Pre-trained Model Preparation

For CIFAR, the model we used in the paper is already in the checkpoints folder.

For ImageNet, the model we used in the paper is the pre-trained ResNet-50 provided by PyTorch. The weights will be downloaded automatically upon running.
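
Optionally, a one-line sketch to pre-fetch the weights (assumes torchvision is installed; newer torchvision versions use weights="IMAGENET1K_V1" instead of pretrained=True):

python -c "import torchvision; torchvision.models.resnet50(pretrained=True)"  # caches the PyTorch ResNet-50 weights locally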

Preliminaries

The code is tested under Ubuntu Linux 20.04 with Python 3.8, and requires some additional packages to be installed.

Precompute

LINe needs a precomputation step to calculate the Shapley value approximation.

Run ./precompute.py.

Demo

1. Demo code for Large-scale Experiment

Run ./demo-imagenet.sh.

2. Demo code for CIFAR Experiment

Run ./demo-cifar.sh.

This codebase is adapted from DICE (Sun et al., ECCV 2022): https://github.com/deeplearning-wisc/dice
