deephyper.skopt#
Scikit-Optimize, or skopt, is a simple and efficient library to minimize (very) expensive and noisy black-box functions. It implements several methods for sequential model-based optimization. skopt is reusable in many contexts and accessible.
This fork of skopt 0.9.8 has been modified for compatibility with DeepHyper. It should not be called directly, since its parameters are set from within the deephyper.search.Search class.
Note that numerous additional options have been added to support additional features of DeepHyper, and additional subdirectories have been added containing the code that supports these features.
DeepHyper users should not need to interact directly with the interfaces in this module; all options are set from the Search class. The one exception is the utility functions and the additional multiobjective documentation provided in deephyper.skopt.moo, which may be of external interest.
This module is included for two purposes:
- to allow optimization developers to understand DeepHyper's optimization techniques, and
- to provide additional documentation, because certain optimization-related features (such as the multiobjective features in deephyper.skopt.moo) are too nuanced to be fully documented in the deephyper.search.Search class's docstrings and are better understood through their documentation pages here.
Also note that although DeepHyper generally deals in maximization problems, this module is for minimization. Each search class should automatically convert the problems appropriately before referencing this module.
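For illustration, here is a minimal sketch of that sign convention; the objective and the as_minimization helper below are hypothetical examples, not part of the DeepHyper or skopt APIs.

# Hypothetical illustration of the maximize-to-minimize conversion; the
# names here are examples only, not DeepHyper internals.

def objective_to_maximize(x: float) -> float:
    """Toy objective a DeepHyper user would maximize (peak at x = 0.3)."""
    return -((x - 0.3) ** 2)

def as_minimization(f):
    """Wrap a maximization objective for a minimizer such as skopt."""
    return lambda x: -f(x)

wrapped = as_minimization(objective_to_maximize)
# Minimizing the wrapped objective recovers the same optimum (x = 0.3):
assert wrapped(0.3) < wrapped(0.0)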
In general, the deephyper.skopt.optimizer.optimizer.Optimizer class defines the basic functionality, which is used by the functions in deephyper.skopt.optimizer. From within deephyper.search.Search, the surrogate_model (str) argument selects which of these functions is called.
Other notable modules include deephyper.skopt.acquisition and deephyper.skopt.sampler, which are selected by similar str arguments in deephyper.search.Search.
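As a hedged sketch (not a definitive recipe), the snippet below shows how these string arguments are typically passed through a Search subclass. It assumes the CBO search from deephyper.search.hps together with the "RF" and "UCB" options; exact import paths, argument names, and whether a run function can be passed directly may vary between DeepHyper versions.

# Assumed sketch: selecting the skopt surrogate and acquisition function
# through a Search subclass rather than touching deephyper.skopt directly.
from deephyper.problem import HpProblem
from deephyper.search.hps import CBO

# Define a one-dimensional hyperparameter search space.
problem = HpProblem()
problem.add_hyperparameter((0.0, 10.0), "x")

def run(job):
    # DeepHyper maximizes the returned value; older versions pass a plain
    # dict of parameters instead of a job object.
    return -((job.parameters["x"] - 3.0) ** 2)

# surrogate_model and acq_func are plain strings; they select the surrogate
# ("RF" -> random forest) and acquisition function ("UCB") used internally
# by deephyper.skopt.
search = CBO(
    problem,
    run,
    surrogate_model="RF",
    acq_func="UCB",
    random_state=42,
)
results = search.search(max_evals=20)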
The multiobjective functionality (defined by deephyper.search.Search's moo_scalarization_strategy argument) is implemented in the deephyper.skopt.moo module and used by deephyper.skopt.optimizer.optimizer.Optimizer.
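A hedged sketch of how that argument is typically exposed follows; it assumes a CBO search whose run function returns a tuple of objectives and the "Chebyshev" strategy name, both of which may differ between DeepHyper versions.

# Assumed sketch: enabling the multiobjective scalarization path.
from deephyper.problem import HpProblem
from deephyper.search.hps import CBO

problem = HpProblem()
problem.add_hyperparameter((0.0, 1.0), "x")

def run(job):
    x = job.parameters["x"]
    # Two competing objectives; DeepHyper maximizes each returned value.
    return -(x**2), -((x - 1.0) ** 2)

# The strategy string is resolved via deephyper.skopt.moo before the
# Optimizer scalarizes the objectives into a single value to minimize.
search = CBO(
    problem,
    run,
    moo_scalarization_strategy="Chebyshev",
    random_state=42,
)
results = search.search(max_evals=20)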
Functions
dummy_minimize: Random search by uniform sampling within the given bounds.
dump: Store an skopt optimization result into a file.
expected_minimum: Compute the minimum over the predictions of the last surrogate model.
forest_minimize: Sequential optimization using decision trees.
gbrt_minimize: Sequential optimization using gradient boosted trees.
gp_minimize: Bayesian optimization using Gaussian Processes.
load: Reconstruct a skopt optimization result from a file persisted with deephyper.skopt.dump.
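For optimization developers studying this module, a hedged sketch of calling the forked interface directly follows (normal DeepHyper usage goes through the Search class instead). It assumes the fork preserves the gp_minimize signature from skopt 0.9 and exports it at the deephyper.skopt level.

# Assumed sketch: driving the forked optimizer directly, for study only.
from deephyper.skopt import gp_minimize

def f(x):
    # Note the minimization convention of this module.
    return (x[0] - 2.0) ** 2

res = gp_minimize(
    f,               # objective to minimize
    [(-5.0, 5.0)],   # bounds of the single decision variable
    n_calls=15,
    random_state=0,
)
print(res.x, res.fun)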
Classes
BayesSearchCV: Bayesian optimization over hyperparameters.
Optimizer: Run the Bayesian optimization loop in DeepHyper.
Space: Initialize a search space from given specifications.
Modules
benchmarks: A collection of benchmark problems.
callbacks: Monitor and influence the optimization procedure via callbacks.
learning: Machine learning extensions for model-based optimization.
moo: DeepHyper's multiobjective features.
sampler: Utilities for generating initial sequences.
space: Utilities to define a search space.