DeepHyper: Scalable Neural Architecture and Hyperparameter Search for Deep Neural Networks
DeepHyper is a scalable automated machine learning (AutoML) package for developing deep neural networks for scientific applications. It comprises two components:
Neural Architecture Search (NAS): fully-automated search for high-performing deep neural network architectures
Hyperparameter Search (HPS): optimization of the hyperparameters of a given reference model
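To make the hyperparameter-search component concrete, the snippet below is a pure-Python sketch of what such a search does: sample configurations from a space and keep the best-scoring one. It is not DeepHyper's actual API; the names `space`, `objective`, and `random_search`, and the toy scoring function, are assumptions for illustration only.

```python
import math
import random

# Hypothetical search space for two hyperparameters of a reference model.
space = {
    "lr": (1e-4, 1e-1),            # learning rate range (sampled log-uniformly)
    "units": list(range(8, 129)),  # hidden-layer width
}

def objective(config):
    # Stand-in for training the reference model and returning a validation
    # score; here a toy function peaking near lr=1e-2, units=64.
    return -abs(math.log10(config["lr"]) + 2) - abs(config["units"] - 64) / 64

def random_search(space, objective, n_trials=200, seed=0):
    # Simplest possible hyperparameter search: sample, evaluate, keep the best.
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        config = {
            "lr": 10 ** rng.uniform(-4, -1),
            "units": rng.choice(space["units"]),
        }
        score = objective(config)
        if best is None or score > best[0]:
            best = (score, config)
    return best

best_score, best_config = random_search(space, objective)
```

DeepHyper replaces each piece of this sketch with a scalable counterpart: a declarative problem definition for `space`, a user-supplied run function for `objective`, and parallel search algorithms in place of the sequential loop.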
DeepHyper provides an infrastructure that targets experimental research in NAS and HPS methods, scalability, and portability across diverse supercomputers. It comprises three modules:
Benchmarks Package: Tools for defining NAS and HPS problems, as well as a curated set of sample benchmark problems for judging the efficacy of novel search algorithms.
Evaluator Interface: A simple interface for NAS and HPS codes to dispatch model evaluation tasks. Implementations range from subprocess for laptop experiments to ray and balsam for large-scale runs on HPC systems.
Search Methods: Built-in search algorithms for NAS and HPS. By extending the generic Search class, one can easily add new NAS or HPS methods to DeepHyper.
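The division of labor among the three modules can be sketched as follows. This is a minimal illustration, not DeepHyper's implementation: the class bodies, method names such as `ask`, and the use of a thread pool (where DeepHyper's real backends use subprocess, ray, or balsam workers) are all assumptions made for the sake of a self-contained example.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def run(config):
    # Stand-in "model training": score one hyperparameter configuration.
    return -(config["units"] - 64) ** 2

class Evaluator:
    """Toy evaluator: dispatches evaluation tasks to a worker pool
    (a stand-in for subprocess/ray/balsam-style backends)."""
    def __init__(self, run_function, num_workers=4):
        self.run_function = run_function
        self.num_workers = num_workers

    def evaluate(self, configs):
        with ThreadPoolExecutor(max_workers=self.num_workers) as pool:
            return list(pool.map(self.run_function, configs))

class Search:
    """Generic search skeleton: subclasses only implement ask()."""
    def __init__(self, evaluator, seed=0):
        self.evaluator = evaluator
        self.rng = random.Random(seed)
        self.history = []

    def ask(self, n):
        raise NotImplementedError

    def run(self, n_iter=10, batch_size=4):
        for _ in range(n_iter):
            configs = self.ask(batch_size)          # propose candidates
            scores = self.evaluator.evaluate(configs)  # dispatch in parallel
            self.history.extend(zip(configs, scores))
        return max(self.history, key=lambda pair: pair[1])

class RandomSearch(Search):
    def ask(self, n):
        return [{"units": self.rng.randint(8, 128)} for _ in range(n)]

best_config, best_score = RandomSearch(Evaluator(run)).run()
```

The key design point this mirrors is the separation of concerns: the problem definition supplies the search space, the evaluator hides how and where evaluations run, and a new search algorithm only has to decide which configurations to try next.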
DeepHyper installation requires Python 3.7.