Search Methods

The search module provides a modular way to implement new search algorithms and contains two sub-modules: deephyper.search.hps for hyperparameter search and deephyper.search.nas for neural architecture search. The Search class is abstract and has several subclasses, such as deephyper.search.ambs and deephyper.search.ga.

class deephyper.search.search.Search(problem: str, run: str, evaluator: str, max_evals: int = 1000000, seed: int = None, num_nodes_master: int = 1, num_workers: int = None, **kwargs)

Abstract representation of a black box optimization search.

A search comprises three main objects: a problem, a run function and an evaluator:

  • The problem defines the optimization problem, for example by providing the search domain. (You can find many kinds of problems in deephyper.benchmark.)

  • The run function executes the black-box function/model and returns the objective value to be optimized.

  • The evaluator abstracts the runtime environment (local machine, supercomputer, etc.) in which run functions are executed.
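
For concreteness, the following is a minimal sketch of a problem and a run function for hyperparameter search. It assumes the HpProblem class from deephyper.benchmark and a single hyperparameter named x; the exact API may differ between DeepHyper versions.

    from deephyper.benchmark import HpProblem  # assumed location of HpProblem

    # The problem: one continuous hyperparameter "x" in [-10, 10].
    Problem = HpProblem()
    Problem.add_dim("x", (-10.0, 10.0))
    Problem.add_starting_point(x=0.0)

    # The run function: receives a dict of hyperparameter values and
    # returns the objective value to maximize.
    def run(point):
        x = point["x"]
        return -x ** 2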

Parameters
  • problem (str) – Module path to the Problem instance you want to use for the search (e.g. deephyper.benchmark.hps.polynome2.Problem).

  • run (str) – Module path to the run function you want to use for the search (e.g. deephyper.benchmark.hps.polynome2.run).

  • evaluator (str) – One of ‘balsam’, ‘ray’, ‘subprocess’, ‘processPool’, ‘threadPool’; selects the runtime environment in which run functions are executed.

  • max_evals (int) – The maximum number of evaluations to run. The exact behavior of this parameter can vary between searches.
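
As a usage sketch, the parameters above can be passed to a concrete subclass such as AMBS. The import path and the main() entry point below are assumptions based on older DeepHyper releases; check the installed version for the exact module layout.

    from deephyper.search.hps.ambs import AMBS  # assumed module path for the AMBS subclass

    # Instantiate the search with module paths to the problem and run function.
    search = AMBS(
        problem="deephyper.benchmark.hps.polynome2.Problem",
        run="deephyper.benchmark.hps.polynome2.run",
        evaluator="subprocess",  # local evaluation; 'balsam' or 'ray' for distributed runs
        max_evals=100,           # stop after roughly 100 evaluations
    )
    search.main()  # launch the search loop (entry-point name assumed)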