deephyper.hpo
Hyperparameter optimization and neural architecture search subpackage.
Warning
All optimization algorithms MAXIMIZE the objective function. If you want to MINIMIZE instead, return the negative of your objective.
Classes
CBO: Centralized Bayesian Optimization Search.

ExperimentalDesignSearch: Centralized Experimental Design Search.

HpProblem: Class to define a hyperparameter problem.

MPIDistributedBO: Distributed Bayesian Optimization Search using MPI to launch parallel search instances.

RandomSearch: Random search algorithm, used as an example of the API for implementing new search algorithms.

RegularizedEvolution: Regularized evolution algorithm.

Search: Abstract class representing a search algorithm.
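To sketch the search loop these classes implement, here is a minimal, self-contained random search (not DeepHyper's actual RandomSearch; the function name, the `space` dict of bounds, and the quadratic objective are all assumptions for illustration). It samples configurations uniformly and keeps the one with the MAXIMUM objective, matching the subpackage's maximization convention.

```python
import random


def random_search(run, space, max_evals=20, seed=42):
    """Toy random search: sample each hyperparameter uniformly from
    its (low, high) bounds in `space` and keep the best (maximum
    objective) configuration seen so far."""
    rng = random.Random(seed)
    best_params, best_obj = None, float("-inf")
    for _ in range(max_evals):
        params = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        obj = run(params)
        if obj > best_obj:  # maximization convention
            best_params, best_obj = params, obj
    return best_params, best_obj


# Maximize -(x - 5)^2, i.e. minimize (x - 5)^2: best x drifts toward 5.
params, obj = random_search(lambda p: -(p["x"] - 5.0) ** 2, {"x": (0.0, 10.0)})
print(params, obj)
```

The real search classes follow the same contract (sample, evaluate the run function, keep the maximizer) but add Bayesian models, evolution, or MPI-based parallelism on top of it.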