
Local Search for NAS

Note: this repository has been merged with other NAS projects into naszilla/naszilla and is now deprecated and unmaintained. Please use naszilla/naszilla instead, which has more functionality.

Local Search is State of the Art for Neural Architecture Search Benchmarks
Colin White, Sam Nolen, and Yash Savani.
arXiv:2005.02960.

We study the simplest versions of local search, showing that local search achieves state-of-the-art results on NASBench-101 (size 10^6) and NASBench-201 (size 10^4). We also show that local search fails on the DARTS search space (size 10^18). This suggests that existing NAS benchmarks may be too small to adequately evaluate NAS algorithms. See our paper for a theoretical study which characterizes the performance of local search on graph optimization problems, backed by simulation results.
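The simplest version of local search studied here is greedy hill climbing: evaluate all neighbors of the current architecture and move to the best one until no neighbor improves. The sketch below is illustrative only; the bit-string search space and loss function are toy stand-ins for the NAS benchmarks, not the repo's actual code.

```python
import random

def local_search(init, neighbors, loss, max_evals=10_000):
    """Greedy local search: move to the best neighbor until none
    improves on the current loss, i.e. until a local minimum."""
    current, evals = init, 0
    while evals < max_evals:
        nbrs = neighbors(current)
        evals += len(nbrs)
        best = min(nbrs, key=loss)
        if loss(best) >= loss(current):
            break  # local minimum reached
        current = best
    return current

# Toy stand-in for a NAS search space: length-8 bit strings whose
# neighbors differ in exactly one bit (one "edit" to the architecture).
neighbors = lambda x: [x[:i] + (1 - x[i],) + x[i + 1:] for i in range(len(x))]
loss = lambda x: len(x) - sum(x)  # fewer ones -> higher loss

random.seed(0)
start = tuple(random.randint(0, 1) for _ in range(8))
print(local_search(start, neighbors, loss))  # -> (1, 1, 1, 1, 1, 1, 1, 1)
```

Because this toy loss improves with every single-bit flip toward all ones, greedy local search always reaches the global optimum here; on real NAS search spaces it can stop at a local minimum instead.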


In the left figure, each point is an architecture from NAS-Bench-201 trained on CIFAR10, and each edge denotes the LS function. We plot the trees of the nine architectures with the lowest test losses. The right figure is similar, but the architectures are assigned validation losses at random. We see that local search is much more likely to converge to an architecture with low loss on structured data (CIFAR10) than on unstructured (random) data.
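The structured-versus-random contrast above can be reproduced at toy scale. In the sketch below (an illustrative assumption, not the paper's experimental setup), local search runs over length-10 bit strings under a smooth loss and under i.i.d. random losses, and we measure how often it reaches the global optimum from a random start.

```python
import random
from itertools import product

def local_search(x, loss):
    """Greedy local search over bit strings; neighbors differ in one bit."""
    while True:
        nbrs = [x[:i] + (1 - x[i],) + x[i + 1:] for i in range(len(x))]
        best = min(nbrs, key=loss)
        if loss(best) >= loss(x):
            return x  # local minimum
        x = best

n = 10
random.seed(0)
nodes = list(product((0, 1), repeat=n))

# "Structured" losses: smooth in the number of ones (global optimum: all ones).
structured = lambda x: n - sum(x)
# "Unstructured" losses: assigned to each node independently at random.
rand_table = {x: random.random() for x in nodes}
unstructured = rand_table.__getitem__
rand_opt = min(nodes, key=unstructured)

def hit_rate(loss, optimum, trials=200):
    hits = 0
    for _ in range(trials):
        start = tuple(random.randint(0, 1) for _ in range(n))
        hits += local_search(start, loss) == optimum
    return hits / trials

print(hit_rate(structured, (1,) * n))    # 1.0: every start reaches the optimum
print(hit_rate(unstructured, rand_opt))  # much lower: many local minima
```

Under random losses a constant fraction of nodes are local minima (in expectation 1/(d+1) of them for degree d), so most runs get stuck, mirroring the right figure.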

Requirements

This repo is our fork of naszilla/bananas. The requirements are as follows.

  • jupyter
  • tensorflow == 1.14.0
  • nasbench (follow the installation instructions here)
  • nas-bench-201 (follow the installation instructions here)
  • pytorch == 1.2.0, torchvision == 0.4.0 (used for experiments on the DARTS search space)
  • pybnn (used only for the DNGO baseline algorithm; installation instructions here)

If you run experiments on DARTS, you will also need the naszilla fork of the darts repo.

Run an experiment on nas-bench-101 or nas-bench-201

To run an experiment on nas-bench-101, run

python run_experiments_sequential.py

To run with nas-bench-201, add the flag --search_space nasbench_201_cifar10 to the above command, replacing cifar10 with cifar100 or imagenet to select the dataset.

Run an experiment on DARTS

To run an experiment on DARTS, run

bash darts/run_experiments.sh

  • ls_cifar10
  • ls_cifar100
  • ls_imagenet
  • ls_baselines_101
  • real_synth_data
  • uniform_preimages

Citation

Please cite our paper if you use code from this repo:

@article{white2020local,
  title={Local Search is State of the Art for Neural Architecture Search Benchmarks},
  author={White, Colin and Nolen, Sam and Savani, Yash},
  journal={arXiv preprint arXiv:2005.02960},
  year={2020}
}