Lottery Jackpots Exist in Pre-trained Models

Requirements

  • Python >= 3.7.4
  • PyTorch >= 1.6.1
  • Torchvision >= 0.4.1
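
A quick way to confirm that the environment meets these floors is to print the installed versions, e.g. with a short check like this sketch:

    # Minimal environment check against the version floors listed above.
    import sys

    import torch
    import torchvision

    print("Python:", sys.version.split()[0])          # expect >= 3.7.4
    print("PyTorch:", torch.__version__)              # expect >= 1.6.1
    print("torchvision:", torchvision.__version__)    # expect >= 0.4.1
    print("CUDA available:", torch.cuda.is_available())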

Reproduce the Experiment Results

  1. Download the pre-trained models from this link and place them in the pre-train folder.

  2. Select a configuration file in configs to reproduce the experiment results reported in the paper. For example, to find a lottery jackpot with 30 epochs for pruning 90% of the parameters of ResNet-32 on CIFAR-10, run:

    python cifar.py --config configs/resnet32_cifar10/90sparsity30epoch.yaml --gpus 0

    To find a lottery jackpot with 30 epochs for pruning 90% of the parameters of ResNet-50 on ImageNet, run:

    python imagenet.py --config configs/resnet50_imagenet/90sparsity30epoch.yaml --gpus 0

    To further tune the weights of a searched lottery jackpot for 10 epochs when pruning 90% of the parameters of ResNet-50 on ImageNet, run:

    python imagenet-t.py --config configs/resnet50_imagenet/90sparsity30s10t.yaml --gpus 0

    Note that data_path in the yaml file should be changed to point to your dataset directory (see the sketch below).
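
If you prefer to script that edit, the minimal sketch below rewrites the config before launching a run; it assumes data_path is a top-level key in the YAML file, which is an assumption about the config schema rather than something the repository guarantees:

    # Requires PyYAML. "data_path" as a top-level key is an assumption;
    # adjust the key path if the actual YAML layout differs.
    import yaml

    cfg_path = "configs/resnet32_cifar10/90sparsity30epoch.yaml"
    with open(cfg_path) as f:
        cfg = yaml.safe_load(f)

    cfg["data_path"] = "/path/to/your/dataset"  # point at your local dataset root

    with open(cfg_path, "w") as f:
        yaml.safe_dump(cfg, f)  # note: rewriting the file drops any YAML comments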

Evaluate Our Pruned Models

We provide the configurations, training logs, and pruned models reported in the paper. They can be downloaded via the links in the following table, where (S) denotes epochs spent searching the lottery jackpot and (T) denotes epochs of additional weight tuning:

| Model | Dataset | Sparsity | Epoch | Top-1 Acc. | Link |
| --- | --- | --- | --- | --- | --- |
| VGGNet-19 | CIFAR-10 | 90% | 30(S) | 93.88% | link |
| VGGNet-19 | CIFAR-10 | 90% | 160(S) | 93.94% | link |
| VGGNet-19 | CIFAR-10 | 95% | 30(S) | 93.49% | link |
| VGGNet-19 | CIFAR-10 | 95% | 160(S) | 93.74% | link |
| VGGNet-19 | CIFAR-100 | 90% | 30(S) | 72.59% | link |
| VGGNet-19 | CIFAR-100 | 90% | 160(S) | 74.61% | link |
| VGGNet-19 | CIFAR-100 | 95% | 30(S) | 71.76% | link |
| VGGNet-19 | CIFAR-100 | 95% | 160(S) | 73.35% | link |
| ResNet-32 | CIFAR-10 | 90% | 30(S) | 93.70% | link |
| ResNet-32 | CIFAR-10 | 90% | 160(S) | 94.39% | link |
| ResNet-32 | CIFAR-10 | 95% | 30(S) | 92.90% | link |
| ResNet-32 | CIFAR-10 | 95% | 160(S) | 93.41% | link |
| ResNet-32 | CIFAR-100 | 90% | 30(S) | 72.22% | link |
| ResNet-32 | CIFAR-100 | 90% | 160(S) | 73.43% | link |
| ResNet-32 | CIFAR-100 | 95% | 30(S) | 69.38% | link |
| ResNet-32 | CIFAR-100 | 95% | 160(S) | 70.31% | link |
| ResNet-50 | ImageNet | 80% | 30(S) | 75.19% | link |
| ResNet-50 | ImageNet | 80% | 30(S)+10(T) | 76.66% | link |
| ResNet-50 | ImageNet | 90% | 30(S) | 72.43% | link |
| ResNet-50 | ImageNet | 90% | 30(S)+10(T) | 74.62% | link |

To test our pruned models, download them and place them in the ckpt folder.

  1. Select a configuration file in configs to test the pruned models; a checkpoint sanity-check sketch follows this step. For example, to evaluate a lottery jackpot for pruning ResNet-32 on CIFAR-10, run:

    python evaluate.py --config configs/resnet32_cifar10/evaluate.yaml --gpus 0

    To evaluate a lottery jackpot for pruning ResNet-50 on ImageNet, run:

    python evaluate.py --config configs/resnet50_imagenet/evaluate.yaml --gpus 0
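
After downloading a checkpoint, a rough sanity check is to count zeroed weights and compare against the advertised sparsity. The sketch below assumes the file is (or wraps, under a state_dict key) a standard PyTorch state dict with pruned weights stored as zeros; the filename is a placeholder, and the repository's actual saving format may differ:

    # Rough sparsity check: fraction of zeroed entries in conv/linear weight tensors.
    import torch

    ckpt = torch.load("ckpt/pruned_model.pt", map_location="cpu")  # placeholder path
    state = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt

    total, zeros = 0, 0
    for name, tensor in state.items():
        if torch.is_tensor(tensor) and "weight" in name and tensor.dim() > 1:
            total += tensor.numel()               # conv / linear weights only
            zeros += (tensor == 0).sum().item()

    print(f"Overall weight sparsity: {zeros / total:.2%}")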
