Auto Deep Learning (AutoDL)



Auto Deep Learning by DXY (AutoDL-Projects) is an open-source, lightweight, but useful project for researchers. In this project, Xuanyi Dong has implemented several neural architecture search (NAS) and hyper-parameter optimization (HPO) algorithms. He hopes to build it into an easy-to-use AutoDL toolkit in the future.

Who should consider using AutoDL-Projects

  • Beginners who want to try different AutoDL algorithms for study
  • Engineers who want to investigate whether AutoDL works on their projects
  • Researchers who want to easily implement and experiment with new AutoDL algorithms

Why should we use AutoDL-Projects

  • Simplest library dependencies: each example relies purely on PyTorch or TensorFlow (except for some basic libraries in Anaconda)
  • All algorithms are in the same codebase. If you implement new algorithms, it is easy to fairly compare with many other baselines.
  • I will actively support this project, because all my future AutoDL research will be built upon this project.

AutoDL-Projects Capabilities

At the moment, this project provides the following algorithms and scripts to run them. Please see the details in the link provided in the description column.

| Type  | Algorithms                                                                    | Description       |
|-------|-------------------------------------------------------------------------------|-------------------|
| NAS   | Network Pruning via Transformable Architecture Search                         | NIPS-2019-TAS.md  |
| NAS   | Searching for A Robust Neural Architecture in Four GPU Hours                  | CVPR-2019-GDAS.md |
| NAS   | One-Shot Neural Architecture Search via Self-Evaluated Template Network       | ICCV-2019-SETN.py |
| NAS   | NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search | NAS-Bench-201.md  |
| NAS   | ENAS / DARTS / REA / REINFORCE / BOHB                                         | NAS-Bench-201.md  |
| HPO   | coming soon                                                                   | coming soon       |
| Basic | Deep Learning-based Image Classification                                      | BASELINE.md       |

History of this repo

At first, this repo was GDAS, which was used to reproduce the results in Searching for A Robust Neural Architecture in Four GPU Hours. After that, more functions and NAS algorithms were continually added. Once it supported more than five algorithms, it was renamed from GDAS to NAS-Project. Now, since both HPO and NAS are supported in this repo, it has been renamed from NAS-Project to AutoDL-Projects.

Requirements and Preparation

Please install Python>=3.6 and PyTorch>=1.3.0. (This project may also run on lower versions of Python and PyTorch, but it may have bugs.) Some visualization code requires OpenCV.
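As a quick sanity check, something like the following confirms the interpreter and PyTorch versions. The version thresholds come from the requirements above; everything else is illustrative:

```python
# Sanity check for the requirements above (illustrative only).
import sys

# Python>=3.6 is required.
assert sys.version_info >= (3, 6), "Python>=3.6 is required"

# PyTorch>=1.3.0 is required; report if it is missing instead of crashing.
try:
    import torch
    print("PyTorch version:", torch.__version__)
except ImportError:
    print("PyTorch not found; install torch>=1.3.0")
```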

CIFAR and ImageNet should be downloaded and extracted into $TORCH_HOME. Some methods use knowledge distillation (KD), which requires pre-trained models; please download them from Google Drive (or train them yourself) and save them into .latent-data.
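A short script like the one below can report where the data is expected to live. The fallback path ~/.torch is an assumption based on common $TORCH_HOME conventions, not something this repo specifies:

```python
# Report where the datasets are expected to live (the default path is an assumption).
import os

# Resolve $TORCH_HOME, falling back to a conventional default when unset.
torch_home = os.environ.get("TORCH_HOME", os.path.expanduser("~/.torch"))
print("TORCH_HOME resolves to:", torch_home)
print("directory exists:", os.path.isdir(torch_home))
```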

Citation

If you find that this project helps your research, please consider citing some of the following papers:

@inproceedings{dong2020nasbench201,
  title     = {NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search},
  author    = {Dong, Xuanyi and Yang, Yi},
  booktitle = {International Conference on Learning Representations (ICLR)},
  url       = {https://openreview.net/forum?id=HJxyZkBKDr},
  year      = {2020}
}
@inproceedings{dong2019tas,
  title     = {Network Pruning via Transformable Architecture Search},
  author    = {Dong, Xuanyi and Yang, Yi},
  booktitle = {Neural Information Processing Systems (NeurIPS)},
  year      = {2019}
}
@inproceedings{dong2019one,
  title     = {One-Shot Neural Architecture Search via Self-Evaluated Template Network},
  author    = {Dong, Xuanyi and Yang, Yi},
  booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
  pages     = {3681--3690},
  year      = {2019}
}
@inproceedings{dong2019search,
  title     = {Searching for A Robust Neural Architecture in Four GPU Hours},
  author    = {Dong, Xuanyi and Yang, Yi},
  booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  pages     = {1761--1770},
  year      = {2019}
}
  • Awesome-NAS : A curated list of neural architecture search and related resources.
  • AutoML Freiburg-Hannover : A website maintained by Frank Hutter's team, containing many AutoML resources.

License

The entire codebase is under the MIT license.