
Few-shot NAS on NASBench-201

Few-shot NAS Greatly Improves Different NAS Algorithms on NASBench-201

How to Use Few-shot NAS to Reproduce above Results

Environment Requirements

python >= 3.6, numpy >= 1.9.1, torch >= 1.5.0, hpbandster, json
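A quick way to sanity-check the interpreter requirement before running anything; this is a minimal sketch, and the numpy/torch requirements can be checked the same way via their `__version__` attributes (not shown here, since those packages may not be installed yet):

```python
# Minimal environment sanity check for the requirements listed above.
# Only the interpreter version is verified here; numpy and torch can be
# checked analogously via numpy.__version__ / torch.__version__.
import sys
import json  # json ships with the standard library, no install needed

assert sys.version_info >= (3, 6), "python >= 3.6 required"
print("interpreter OK")
```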

Download the dataset

The full NASBench-201 dataset (4.7 GB) can be found here.

Gradient-based NAS Algorithms

  • DARTS: Differentiable Architecture Search
  • PCDARTS: Partial Channel Connections for Memory-Efficient Architecture Search
  • ENAS: Efficient Neural Architecture Search via Parameter Sharing
  • SETN: One-Shot Neural Architecture Search via Self-Evaluated Template Network

Vanilla NAS Algorithms

  • REA: Regularized Evolution for Image Classifier Architecture Search
  • RL: Learning Transferable Architectures for Scalable Image Recognition
  • BOHB: Robust and Efficient Hyperparameter Optimization at Scale
  • TPE: Algorithms for Hyper-Parameter Optimization

For one-shot and few-shot vanilla NAS algorithms, the search is guided by individual architecture performance as estimated by the supernet. We therefore provide the estimated accuracies of all 15,625 architectures in NasBench201, approximated by both the one-shot supernet and the few-shot supernets. These files are located in ./supernet_info. That folder also contains a file named 'nasbench201' with the ground-truth accuracies of the architectures in NasBench201.
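As a rough illustration of how supernet-estimated accuracies guide a vanilla search, the sketch below ranks a toy set of architectures by their estimate and then looks up their real accuracies. The dict-based format and the architecture indices are assumptions for illustration, not the actual layout of the files in ./supernet_info:

```python
# Hypothetical sketch of consuming supernet-estimated accuracies.
# The toy dicts below stand in for data loaded from ./supernet_info;
# the real file format may differ.
few_shot_est = {0: 0.71, 1: 0.68, 2: 0.74, 3: 0.65}  # arch index -> estimated accuracy
true_acc     = {0: 0.72, 1: 0.66, 2: 0.75, 3: 0.64}  # ground truth, as in 'nasbench201'

# A search algorithm uses the estimate as a proxy: pick the top-2 by estimate...
top2 = sorted(few_shot_est, key=few_shot_est.get, reverse=True)[:2]
print(top2)  # -> [2, 0]

# ...then report their real accuracies for evaluation.
print([true_acc[a] for a in top2])  # -> [0.75, 0.72]
```

The better the supernet's estimates correlate with the ground truth, the more often this proxy ranking selects genuinely strong architectures, which is the gap few-shot supernets aim to close.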

If you would like to train and evaluate the supernet(s) yourself, please follow the instructions here.