
Prioritized Subnet Sampling for Resource-Adaptive Supernet Training (paper)

Tips

If you have any problems, please contact the first author (email: bhchen@stu.xmu.edu.cn).

Dependencies

  • Python 3.8
  • PyTorch 1.7
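
A minimal environment-setup sketch (assuming conda and pip; the exact patch versions and CUDA build are assumptions, not pinned by the repo — pick the build matching your system):

# Create a Python 3.8 environment (conda is an assumption; any virtualenv works).
conda create -n pss-net python=3.8 -y
conda activate pss-net

# Install PyTorch 1.7 with a matching torchvision (choose a CPU or CUDA build as appropriate).
pip install torch==1.7.1 torchvision==0.8.2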

Supernet Training

  1. Prepare your ImageNet dataset
  2. Run a training script

The training scripts are located in the ./scripts/train directory.

./scripts/train/
├── pss-mbv1-f.sh	# train PSS-MBV1 under FLOPs Constraint
├── pss-mbv1-c.sh	# train PSS-MBV1 under CPU Latency Constraint
├── pss-mbv1-g.sh	# train PSS-MBV1 under GPU Latency Constraint
├── pss-mbv1-m.sh	# train PSS-MBV1 under FLOPs / CPU Latency / GPU Latency Constraints
├── pss-mbv2-f.sh	# train PSS-MBV2 under FLOPs Constraint
├── pss-mbv2-c.sh	# train PSS-MBV2 under CPU Latency Constraint
├── pss-mbv2-g.sh	# train PSS-MBV2 under GPU Latency Constraint
└── pss-mbv2-m.sh	# train PSS-MBV2 under FLOPs / CPU Latency / GPU Latency Constraints
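
For example, to train PSS-MBV1 under the FLOPs constraint (a usage sketch: the scripts are assumed to read the ImageNet path from within, so check and edit the script before launching):

# Train PSS-MBV1 under the FLOPs constraint.
# Point the ImageNet path inside the script at your dataset location first.
bash ./scripts/train/pss-mbv1-f.sh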

Evaluate Our Results

  1. Prepare your ImageNet dataset
  2. Download a PSS-Net checkpoint from the link in the table below
  3. Run an evaluation script

The evaluation scripts are located in the ./scripts/test directory.

./scripts/test/
├── pss-mbv1-f.sh	# evaluate PSS-MBV1 under FLOPs Constraint
├── pss-mbv1-c.sh	# evaluate PSS-MBV1 under CPU Latency Constraint
├── pss-mbv1-g.sh	# evaluate PSS-MBV1 under GPU Latency Constraint
├── pss-mbv1-m.sh	# evaluate PSS-MBV1 under FLOPs / CPU Latency / GPU Latency Constraints
├── pss-mbv2-f.sh	# evaluate PSS-MBV2 under FLOPs Constraint
├── pss-mbv2-c.sh	# evaluate PSS-MBV2 under CPU Latency Constraint
├── pss-mbv2-g.sh	# evaluate PSS-MBV2 under GPU Latency Constraint
└── pss-mbv2-m.sh	# evaluate PSS-MBV2 under FLOPs / CPU Latency / GPU Latency Constraints
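
For example, to evaluate PSS-MBV1 under the FLOPs constraint (a usage sketch: the ImageNet and checkpoint paths inside the script are assumptions — edit them to match your dataset and the downloaded weights before running):

# Evaluate PSS-MBV1 under the FLOPs constraint.
# Set the ImageNet path and the downloaded checkpoint path inside the script first.
bash ./scripts/test/pss-mbv1-f.sh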

ImageNet Results

Our Trained Supernets and Typical Subnets Reported in the Paper

Each cell gives a subnet's resource cost under the constraint and its ImageNet top-1 accuracy.

| Supernet (link) | Constraint Type | Subnet 1 | Subnet 2 | Subnet 3 | Subnet 4 | Subnet 5 | Subnet 6 | Subnet 7 | Subnet 8 | Subnet 9 | Subnet 10 | ... |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| PSS-MBV1-F | FLOPs | 555M / 74.2% | 496M / 73.8% | 443M / 73.6% | 375M / 73.2% | 343M / 72.7% | 310M / 72.4% | 306M / 72.3% | 273M / 72.1% | 126M / 68.1% | 107M / 67.9% | ... |
| PSS-MBV1-G | GPU Latency | 238us / 74.4% | 220us / 73.92% | 201us / 73.5% | 172us / 73.3% | 148us / 72.6% | 128us / 71.9% | 111us / 70.9% | 94us / 69.8% | 76us / 68.1% | 67us / 67.9% | ... |
| PSS-MBV1-C | CPU Latency | 32ms / 74.2% | 30ms / 73.9% | 28ms / 73.6% | 26ms / 73.5% | 23ms / 73.2% | 20ms / 72.5% | 18ms / 71.5% | 16ms / 70.2% | 13ms / 68.3% | 11ms / 68.0% | ... |
| PSS-MBV1-M | FLOPs | 562M / 74.5% | 511M / 74.1% | 458M / 73.6% | 419M / 73.3% | 360M / 73.1% | 309M / 73.5% | 273M / 72.1% | 203M / 70.6% | 118M / 68.1% | 107M / 68.0% | ... |
| | GPU Latency | 236us / 74.5% | 221us / 74.1% | 197us / 73.4% | 166us / 73.1% | 141us / 72.4% | 120us / 71.5% | 100us / 70.0% | 88us / 68.8% | 71us / 68.1% | 69us / 68.0% | ... |
| | CPU Latency | 30ms / 74.5% | 28ms / 73.8% | 26ms / 73.4% | 23ms / 73.1% | 21ms / 72.5% | 18ms / 71.5% | 16ms / 70.8% | 14ms / 69.6% | 13ms / 68.6% | 11ms / 68.0% | ... |
| PSS-MBV2-F | FLOPs | 301M / 73.4% | 294M / 73.3% | 277M / 72.8% | 241M / 72.4% | 203M / 72.1% | 163M / 70.9% | 154M / 70.7% | 134M / 70.0% | 84M / 66.8% | 69M / 66.4% | ... |
| PSS-MBV2-G | GPU Latency | 260us / 73.4% | 252us / 72.8% | 234us / 72.7% | 225us / 72.6% | 197us / 72.5% | 179us / 72.0% | 165us / 71.0% | 145us / 70.4% | 126us / 68.7% | 101us / 66.5% | ... |
| PSS-MBV2-C | CPU Latency | 42ms / 73.3% | 30ms / 73.2% | 27ms / 73.0% | 25ms / 72.5% | 23ms / 72.2% | 21ms / 71.0% | 19ms / 70.3% | 18ms / 69.5% | 17ms / 69.0% | 16ms / 66.7% | ... |
| PSS-MBV2-M | FLOPs | 301M / 73.4% | 295M / 73.4% | 249M / 72.7% | 192M / 71.9% | 163M / 70.9% | 154M / 70.7% | 132M / 70.0% | 104M / 68.7% | 83M / 66.7% | 69M / 66.4% | ... |
| | GPU Latency | 260us / 73.4% | 226us / 72.9% | 202us / 72.5% | 197us / 72.4% | 174us / 71.9% | 150us / 70.7% | 139us / 70.0% | 121us / 67.5% | 102us / 66.7% | 101us / 66.4% | ... |
| | CPU Latency | 33ms / 73.4% | 32ms / 73.2% | 31ms / 73.1% | 29ms / 72.9% | 27ms / 72.4% | 24ms / 72.2% | 22ms / 70.7% | 20ms / 69.4% | 18ms / 68.4% | 16ms / 66.6% | ... |

We list only the results of selected subnets from each supernet. To get the results of all subnets, check the training logs, or download the supernet weights via the links in the first column of the table and evaluate them.

[Figure: main results plot (main_page-0001)]

Run the following command to reproduce the figure above:

python ./scripts/plot_main_result.py
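
The plotting script presumably uses matplotlib (an assumption; it is not listed under Dependencies), so it may need to be installed first:

# Assumed prerequisite of the plotting script; not listed under Dependencies.
pip install matplotlib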
