deephyper.search.nas#

Sub-package for neural architecture search algorithms.

Warning

All search algorithms are MAXIMIZING the objective function. If you want to MINIMIZE the objective function, you have to return the negative of your objective.
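A minimal sketch of this convention, using a hypothetical `run` function and a toy stand-in for model training (both names are illustrative, not part of the DeepHyper API):

```python
def train_and_evaluate(config):
    """Stand-in for real model training; returns a validation loss
    (lower is better). Here a toy quadratic minimized at x == 2."""
    return (config["x"] - 2) ** 2

def run(config):
    """Objective passed to a search. The search MAXIMIZES the returned
    value, so we return the NEGATIVE of the loss in order to minimize it."""
    return -train_and_evaluate(config)
```

With this negation, a configuration with a smaller loss yields a larger objective value, so the maximizing search drives the loss down.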

Classes

AMBSMixed

Asynchronous Model-Based Search based on the Scikit-Optimize optimizer.

AgEBO

Aging evolution with Bayesian Optimization.

NeuralArchitectureSearch

Base class for neural architecture search algorithms.

Random

Random neural architecture search.

RegularizedEvolution

Regularized evolution neural architecture search.

RegularizedEvolutionMixed

Extension of Regularized Evolution neural architecture search to joint hyperparameter and neural architecture search.
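The aging-evolution loop behind RegularizedEvolution can be sketched generically. This is a simplified illustration of the algorithm (Real et al., 2019), not DeepHyper's implementation: `fitness`, `random_arch`, and `mutate` are placeholder callables supplied by the user.

```python
import random

def regularized_evolution(fitness, random_arch, mutate,
                          population_size=10, sample_size=3, cycles=50):
    """Aging evolution: each cycle, a tournament picks a parent from a
    random sample, its mutated child joins the population, and the
    OLDEST individual is removed (regardless of fitness)."""
    population = []  # FIFO queue ordered by age
    history = []     # every individual ever evaluated
    for _ in range(population_size):
        arch = random_arch()
        population.append((arch, fitness(arch)))
        history.append(population[-1])
    for _ in range(cycles):
        sample = random.sample(population, sample_size)
        parent = max(sample, key=lambda p: p[1])   # tournament selection
        child = mutate(parent[0])
        population.append((child, fitness(child)))
        history.append(population[-1])
        population.pop(0)  # aging: discard the oldest, not the worst
    return max(history, key=lambda p: p[1])
```

For example, maximizing the number of ones in an 8-bit string (with `mutate` flipping one random bit) converges quickly toward all-ones. Removing the oldest rather than the worst individual is what "regularizes" the evolution: no architecture survives indefinitely, so good genes must keep re-proving themselves through their offspring.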