deephyper.search.nas package

Subpackages

Submodules

deephyper.search.nas.ambs module

class deephyper.search.nas.ambs.AMBNeuralArchitectureSearch(problem, run, evaluator, surrogate_model='RF', liar_strategy='cl_max', acq_func='gp_hedge', n_jobs=-1, **kwargs)[source]

Bases: deephyper.search.nas.NeuralArchitectureSearch

Asynchronous Model-Based Search.

Parameters
  • problem (str) – python attribute import of the NaProblem instance (e.g. mypackage.mymodule.myproblem).

  • run (str) – python attribute import of the run function (e.g. mypackage.mymodule.myrunfunction).

  • evaluator (str) – the name of the evaluator to use.

  • surrogate_model (str, optional) – Choices are [“RF”, “ET”, “GBRT”, “DUMMY”, “GP”]. RF is Random Forest, ET is Extra Trees, GBRT is Gradient Boosting Regression Trees, DUMMY is random, GP is Gaussian process. Defaults to “RF”.

  • liar_strategy (str, optional) – [“cl_max”, “cl_min”, “cl_mean”]. Defaults to “cl_max”.

  • acq_func (str, optional) – Acquisition function, choices are [“gp_hedge”, “LCB”, “EI”, “PI”]. Defaults to “gp_hedge”.

  • n_jobs (int, optional) – Number of parallel jobs used to fit the surrogate model (learner). Defaults to -1, meaning as many jobs as logical cores.

main()[source]
deephyper.search.nas.ambs.on_exit(signum, stack)[source]
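The liar_strategy controls how pending (submitted but not yet evaluated) points are handled when proposing new points in parallel: each pending point is temporarily assigned a "lie" derived from the objectives observed so far. A minimal sketch in plain Python of what the three strategies compute (the function name is illustrative, not DeepHyper's API):

```python
def constant_liar(observed_objectives, strategy="cl_max"):
    """Return the objective value temporarily assigned to pending points.

    observed_objectives: objective values of finished evaluations.
    strategy: one of "cl_max", "cl_min", "cl_mean".
    """
    if strategy == "cl_max":
        return max(observed_objectives)
    if strategy == "cl_min":
        return min(observed_objectives)
    if strategy == "cl_mean":
        return sum(observed_objectives) / len(observed_objectives)
    raise ValueError(f"unknown liar strategy: {strategy}")

# Example: three finished evaluations, one pending point to fill in.
scores = [0.2, 0.5, 0.8]
print(constant_liar(scores, "cl_max"))   # 0.8
print(constant_liar(scores, "cl_mean"))  # mean of the observed scores
```

The lie lets the surrogate be refit and queried again before the pending evaluations finish, which is what makes the search asynchronous.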

deephyper.search.nas.full_random module

class deephyper.search.nas.random.Random(problem, run, evaluator, **kwargs)[source]

Bases: deephyper.search.nas.NeuralArchitectureSearch

Search class to run a full random neural architecture search. The search fills every available node as soon as it is detected. The master job uses only 1 MPI rank.

Parameters
  • problem (str) – Module path to the Problem instance you want to use for the search (e.g. deephyper.benchmark.nas.linearReg.Problem).

  • run (str) – Module path to the run function you want to use for the search (e.g. deephyper.nas.run.quick).

  • evaluator (str) – value in [‘balsam’, ‘subprocess’, ‘processPool’, ‘threadPool’].

main()[source]
deephyper.search.nas.random.random() → x in the interval [0, 1).
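Random search simply draws each architecture uniformly from the search space. A minimal sketch, assuming each architecture is encoded as one integer operation choice per node (the encoding and names here are illustrative, not DeepHyper's actual API):

```python
import random

def sample_random_arch(num_nodes: int, num_choices_per_node: int) -> list:
    """Sample a random architecture: one uniform operation choice per node.

    This mirrors the idea of 'filling every available node' with a
    uniformly random choice as soon as it is detected.
    """
    return [random.randrange(num_choices_per_node) for _ in range(num_nodes)]

arch = sample_random_arch(num_nodes=5, num_choices_per_node=4)
print(arch)  # e.g. [2, 0, 3, 1, 1]
```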

deephyper.search.nas.regevo module

class deephyper.search.nas.regevo.RegularizedEvolution(problem, run, evaluator, population_size=100, sample_size=10, **kwargs)[source]

Bases: deephyper.search.nas.NeuralArchitectureSearch

Regularized evolution.

https://arxiv.org/abs/1802.01548

Parameters
  • problem (str) – Module path to the Problem instance you want to use for the search (e.g. deephyper.benchmark.nas.linearReg.Problem).

  • run (str) – Module path to the run function you want to use for the search (e.g. deephyper.nas.run.quick).

  • evaluator (str) – value in [‘balsam’, ‘subprocess’, ‘processPool’, ‘threadPool’].

  • population_size (int, optional) – the number of individuals to keep in the population. Defaults to 100.

  • sample_size (int, optional) – the number of individuals that should participate in each tournament. Defaults to 10.

copy_mutate_arch(parent_arch: list) → dict[source]

# ! Time performance is critical because this method is called sequentially

Parameters

parent_arch (list(int)) – architecture of the parent individual, encoded as a list of integer choices.

Returns

a mutated copy of the parent architecture.

Return type

dict

gen_random_batch(size: int) → list[source]
main()[source]
random_search_space() → list[source]
select_parent(sample: list) → list[source]
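The aging-evolution loop from the paper above (Real et al., 2019) can be sketched in plain Python: keep a FIFO population, pick each parent as the fittest of a random sample (tournament selection, see select_parent), mutate a copy of it (see copy_mutate_arch), and discard the oldest individual. This is an illustrative sketch, not DeepHyper's implementation; the callables and encoding are assumptions.

```python
import random
from collections import deque

def mutate(arch):
    # Flip one randomly chosen bit of the architecture encoding.
    child = list(arch)
    i = random.randrange(len(child))
    child[i] ^= 1
    return child

def regularized_evolution(evaluate, random_arch, mutate,
                          population_size=100, sample_size=10, cycles=1000):
    """Regularized (aging) evolution, simplified.

    evaluate(arch) -> fitness, random_arch() -> arch, mutate(arch) -> arch
    are user-supplied; individuals are (arch, fitness) pairs.
    """
    population = deque()
    history = []
    # Seed the population with random architectures.
    while len(population) < population_size:
        arch = random_arch()
        ind = (arch, evaluate(arch))
        population.append(ind)
        history.append(ind)
    # Evolve: tournament selection + mutation, then aging removal.
    for _ in range(cycles):
        sample = random.sample(list(population), sample_size)
        parent = max(sample, key=lambda ind: ind[1])  # tournament winner
        child_arch = mutate(parent[0])
        child = (child_arch, evaluate(child_arch))
        population.append(child)
        population.popleft()  # regularization: remove the oldest, not the worst
        history.append(child)
    return max(history, key=lambda ind: ind[1])

# Toy search space: maximize the sum of an 8-bit binary vector.
best = regularized_evolution(
    evaluate=sum,
    random_arch=lambda: [random.randint(0, 1) for _ in range(8)],
    mutate=mutate,
    population_size=20, sample_size=5, cycles=200,
)
print(best[1])  # best fitness found; at most 8 for this toy objective
```

Removing the oldest individual rather than the worst is what "regularized" refers to: every individual eventually dies, so good traits must be rediscovered through retraining, which favors architectures that are robustly good.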

deephyper.search.nas.rl module

Module contents