deephyper.search.hps.AMBS
class deephyper.search.hps.AMBS(problem, evaluator, random_state: Optional[int] = None, log_dir: str = '.', verbose: int = 0, surrogate_model: str = 'RF', acq_func: str = 'UCB', acq_optimizer: str = 'auto', kappa: float = 1.96, xi: float = 0.001, n_points: int = 10000, filter_duplicated: bool = True, update_prior: bool = False, multi_point_strategy: str = 'cl_max', n_jobs: int = 1, n_initial_points=10, initial_points=None, sync_communication: bool = False, filter_failures: str = 'mean', **kwargs)

Bases: deephyper.search.hps._cbo.CBO
AMBS is now deprecated and will be removed in a future release; use CBO (Centralized Bayesian Optimization) instead.
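Since AMBS is deprecated in favor of CBO, a migration is usually a one-line class swap. The following is a minimal sketch, not a definitive recipe: the toy run function, the hyperparameter name "x", and the serial evaluator method are illustrative choices, while HpProblem, Evaluator.create, and CBO come from the DeepHyper API of this release.

```python
from deephyper.problem import HpProblem
from deephyper.evaluator import Evaluator
from deephyper.search.hps import CBO  # drop-in replacement for AMBS

# Illustrative objective: DeepHyper maximizes the returned value.
def run(config):
    return -config["x"] ** 2

problem = HpProblem()
problem.add_hyperparameter((-10.0, 10.0), "x")

# A serial evaluator is the simplest choice for a local run.
evaluator = Evaluator.create(run, method="serial")

# Same keyword arguments as AMBS (surrogate_model, acq_func, ...).
search = CBO(problem, evaluator, surrogate_model="RF", acq_func="UCB")
results = search.search(max_evals=20)  # pandas DataFrame of evaluations
```

The constructor signature above matches the AMBS signature shown in this page, so existing keyword arguments carry over unchanged.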
Methods

- dump_context: Dumps the context in the log folder.
- fit_generative_model: Learn the distribution of hyperparameters for the top-(1-q)x100% configurations and sample from this distribution.
- fit_search_space: Apply prior-guided transfer learning based on a DataFrame of results.
- fit_surrogate: Fit the surrogate model of the search from a checkpointed DataFrame.
- search: Execute the search algorithm.
- terminate: Terminate the search.
- to_json: Returns a JSON version of the search object.
dump_context()

Dumps the context in the log folder.
fit_generative_model(df, q=0.9, n_iter_optimize=0, n_samples=100)

Learn the distribution of hyperparameters for the top-(1-q)x100% configurations and sample from this distribution. It can be used for transfer learning.

Example Usage:

>>> search = CBO(problem, evaluator)
>>> search.fit_generative_model("results.csv")

Parameters

- df (str|DataFrame) – a DataFrame or path to a CSV file from a previous search.
- q (float, optional) – the quantile defining the set of top configurations used to bias the search. Defaults to 0.90, which selects the top-10% configurations from df.
- n_iter_optimize (int, optional) – the number of iterations used to optimize the generative model which samples the data for the search. Defaults to 0, i.e., no optimization of the generative model.
- n_samples (int, optional) – the number of samples used to score the generative model.

Returns

(score, model), where score is a metric measuring the quality of the learned generative model and model is the generative model itself.

Return type

tuple
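The top-(1-q)x100% selection described above can be illustrated without DeepHyper. The sketch below assumes maximization and uses purely hypothetical names (top_quantile, configs, objectives); it only demonstrates the quantile cut, not the generative-model fitting itself.

```python
def top_quantile(configs, objectives, q=0.9):
    """Return the configurations whose objective lies in the top
    (1-q)*100% of all results (higher objective is better)."""
    # Rank configurations by objective value, ascending.
    ranked = sorted(zip(objectives, configs), key=lambda t: t[0])
    # Keep only the top (1-q) fraction, i.e. everything past the q-quantile.
    cut = int(q * len(ranked))
    return [config for _, config in ranked[cut:]]

# Ten illustrative results: with q=0.9, only the single best one survives.
configs = [{"lr": 10 ** -i} for i in range(10)]
objectives = [0.1 * i for i in range(10)]  # higher is better
top = top_quantile(configs, objectives, q=0.9)
```

With q=0.90 and ten results, the cut keeps exactly one configuration, matching the "top-10%" behavior described for the default q.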
fit_search_space(df, fac_numerical=0.125, fac_categorical=10)

Apply prior-guided transfer learning based on a DataFrame of results.

Example Usage:

>>> search = CBO(problem, evaluator)
>>> search.fit_search_space("results.csv")

Parameters

- df (str|DataFrame) – a checkpoint from a previous search.
- fac_numerical (float) – the factor used to compute the sigma of a truncated normal distribution based on sigma = max(1.0, (upper - lower) * fac_numerical). A large factor increases exploration while a small factor increases exploitation around the best configuration from the df parameter.
- fac_categorical (float) – the weight given to a categorical feature that is part of the best configuration. A large weight > 1 increases exploitation while a weight close to 1 increases exploration.
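The sigma formula quoted for fac_numerical is simple enough to sketch directly. The function name and the bound values below are illustrative only; the formula itself is the one stated in the parameter description.

```python
def truncnorm_sigma(lower, upper, fac_numerical=0.125):
    """Sigma of the truncated normal prior centered on the best value,
    per the rule sigma = max(1.0, (upper - lower) * fac_numerical)."""
    return max(1.0, (upper - lower) * fac_numerical)

# A wide numerical range yields a proportionally wide prior...
wide = truncnorm_sigma(0.0, 100.0)   # (100 - 0) * 0.125 = 12.5
# ...while a narrow range is clamped to the floor of 1.0.
narrow = truncnorm_sigma(0.0, 2.0)   # max(1.0, 0.25) = 1.0
```

The max(1.0, ...) floor prevents the prior from collapsing on very narrow ranges, which is why a small fac_numerical biases sampling toward the best configuration only when the range is wide enough.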
fit_surrogate(df)

Fit the surrogate model of the search from a checkpointed DataFrame.

Example Usage:

>>> search = CBO(problem, evaluator)
>>> search.fit_surrogate("results.csv")

Parameters

- df (str|DataFrame) – a checkpoint from a previous search.
search(max_evals: int = -1, timeout: Optional[int] = None)

Execute the search algorithm.

Parameters

- max_evals (int, optional) – the maximum number of evaluations to perform before stopping the search. Defaults to -1, which imposes no limit.
- timeout (int, optional) – the time budget (in seconds) of the search before stopping. Defaults to None, which imposes no time budget.

Returns

a pandas DataFrame containing the evaluations performed, or None if the search could not evaluate any configuration.

Return type

DataFrame
terminate()

Terminate the search.

Raises

- SearchTerminationError – raised when the search is terminated with SIGALARM
to_json()

Returns a JSON version of the search object.