deephyper.skopt

Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. It implements several methods for sequential model-based optimization. skopt is reusable in many contexts and accessible.

This fork of skopt 0.9.8 has been modified for compatibility with DeepHyper. It should not be called directly, since its parameters are set from within the deephyper.search.Search class. Numerous options have been added to support DeepHyper-specific features, along with new subdirectories containing the code that implements them.

Note that DeepHyper users should not need to interact directly with the interfaces in this module. All options are set from the Search class. The exceptions are the utility functions and the additional multiobjective documentation provided in deephyper.skopt.moo, which may be of external interest.

This module is included for two purposes:

  1. to let optimization developers understand DeepHyper’s optimization techniques, and

  2. to provide additional documentation, because certain optimization-related features (such as the multiobjective features in deephyper.skopt.moo) are too nuanced to be fully documented in the deephyper.search.Search class’s docstrings and are better understood through their documentation pages here.

Also note that although DeepHyper generally deals in maximization problems, this module performs minimization. Each search class automatically converts the problem accordingly before calling into this module.
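As an illustration of that sign convention, here is a minimal sketch (not DeepHyper’s actual internals): a DeepHyper-style objective that should be maximized is simply negated before it reaches the minimizers in this module.

    def run(config):
        # DeepHyper-style objective: larger is better.
        return -(config["x"] - 2.0) ** 2

    def skopt_objective(x):
        # What the deephyper.skopt layer effectively minimizes:
        # the negated DeepHyper objective.
        return -run({"x": x[0]})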

In general, the deephyper.skopt.optimizer.optimizer.Optimizer class defines the basic functionality, which is used by the functions in deephyper.skopt.optimizer. From within DeepHyper’s deephyper.search.Search, the surrogate_model (str type) argument selects which of these functions is called. Other notable modules include deephyper.skopt.acquisition and deephyper.skopt.sampler, whose options are selected by similar str arguments in deephyper.search.Search.
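For developers who want to see the ask/tell loop that the search classes drive, here is a minimal sketch. It assumes the fork preserves the upstream skopt Optimizer interface; base_estimator, acq_func, and random_state are standard skopt parameters rather than DeepHyper-specific options.

    from deephyper.skopt import Optimizer

    # Minimize a one-dimensional quadratic with a random-forest surrogate.
    opt = Optimizer(
        dimensions=[(-5.0, 5.0)],  # one continuous variable
        base_estimator="RF",       # surrogate model (exposed as surrogate_model in Search)
        acq_func="LCB",            # lower-confidence-bound acquisition
        random_state=42,
    )

    for _ in range(10):
        x = opt.ask()              # next point suggested by the surrogate/acquisition pair
        y = (x[0] - 1.0) ** 2      # evaluate the (minimization) objective
        opt.tell(x, y)             # report the observation back to the optimizer

    print(min(opt.yi))             # best observed value so far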

The multiobjective functionality (controlled by deephyper.search.Search’s moo_scalarization_strategy argument) is implemented in the deephyper.skopt.moo module and used by deephyper.skopt.optimizer.optimizer.Optimizer.
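For example, a multiobjective search might be configured as follows. This is a minimal sketch: the exact run-function signature and the set of accepted strategy names depend on the DeepHyper version, and "Chebyshev" is shown only as an assumed value.

    from deephyper.evaluator import Evaluator
    from deephyper.problem import HpProblem
    from deephyper.search.hps import CBO

    problem = HpProblem()
    problem.add_hyperparameter((0.0, 1.0), "x")

    def run(config):
        x = config["x"]
        # Two objectives to maximize; they are scalarized through deephyper.skopt.moo.
        return x, 1.0 - x

    evaluator = Evaluator.create(run, method="serial")
    search = CBO(
        problem,
        evaluator,
        moo_scalarization_strategy="Chebyshev",  # assumed strategy name; see deephyper.skopt.moo
    )
    results = search.search(max_evals=25)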

Functions

dummy_minimize

Random search by uniform sampling within the given bounds.

dump

Store a skopt optimization result into a file.

expected_minimum

Compute the minimum over the predictions of the last surrogate model.

forest_minimize

Sequential optimization using decision trees.

gbrt_minimize

Sequential optimization using gradient boosted trees.

gp_minimize

Bayesian optimization using Gaussian Processes (see the sketch after this list).

load

Reconstruct a skopt optimization result from a file persisted with deephyper.skopt.dump.
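For reference, a minimal sketch of the functional interface, assuming the fork keeps the upstream skopt signatures of gp_minimize, dump, and load:

    from deephyper.skopt import dump, gp_minimize, load

    # Minimize a one-dimensional quadratic with a Gaussian-process surrogate.
    res = gp_minimize(
        func=lambda x: (x[0] - 0.3) ** 2,  # objective to minimize
        dimensions=[(-1.0, 1.0)],          # search bounds
        n_calls=20,
        random_state=0,
    )
    print(res.x, res.fun)                  # best point and best observed value

    dump(res, "result.pkl")                # persist the result ...
    res = load("result.pkl")               # ... and reconstruct it later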

Classes

BayesSearchCV

Bayesian optimization over hyperparameters.

Optimizer

Run the Bayesian optimization loop in DeepHyper.

Space

Initialize a search space from given specifications (see the sketch below).
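For example, a minimal sketch assuming the fork keeps the upstream skopt Space interface:

    from deephyper.skopt import Space

    # Mixed continuous / integer / categorical search space.
    space = Space([(-5.0, 5.0), (1, 10), ("relu", "tanh")])
    print(space.rvs(n_samples=3, random_state=0))  # draw a few random points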

Submodules

deephyper.skopt.acquisition

deephyper.skopt.benchmarks

A collection of benchmark problems.

deephyper.skopt.callbacks

Monitor and influence the optimization procedure via callbacks.

deephyper.skopt.learning

Machine learning extensions for model-based optimization.

deephyper.skopt.moo

DeepHyper's multiobjective features.

deephyper.skopt.optimizer

deephyper.skopt.sampler

Utilities for generating initial sequences.

deephyper.skopt.searchcv

deephyper.skopt.space

Utilities to define a search space.

deephyper.skopt.utils