.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "examples/plot_notify_failures_hyperparameter_search.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_examples_plot_notify_failures_hyperparameter_search.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_examples_plot_notify_failures_hyperparameter_search.py:


Notify Failures in Hyperparameter Optimization
==============================================

**Author(s)**: Romain Egele.

This example demonstrates how to handle failures of the objective during
hyperparameter search. In many cases, such as software auto-tuning (where we
minimize the run-time of a software application), some configurations can
trigger run-time errors, so no scalar objective is returned. A default choice
is to return the worst-case objective, if it is known, directly from the
``run``-function. Other possibilities are to ignore these configurations or to
replace their objective with the running mean or running minimum of the
objectives collected so far.

To illustrate such a use case, we define an artificial ``run``-function that
fails whenever its input parameter ``y`` is greater than 0.5. To notify a
failure, return a string value prefixed with ``"F"``, such as:

.. GENERATED FROM PYTHON SOURCE LINES 10-19

.. code-block:: default

    def run(config: dict) -> float:
        if config["y"] > 0.5:
            return "F_postfix"
        else:
            return config["x"]

.. GENERATED FROM PYTHON SOURCE LINES 20-21

Then, we define the corresponding hyperparameter problem, where ``x`` is the
value to maximize and ``y`` is a value that impacts the appearance of
failures.

.. GENERATED FROM PYTHON SOURCE LINES 21-30

.. code-block:: default

    from deephyper.problem import HpProblem

    problem = HpProblem()
    problem.add_hyperparameter([1, 2, 4, 8, 16, 32], "x")
    problem.add_hyperparameter((0.0, 1.0), "y")

    print(problem)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Configuration space object:
      Hyperparameters:
        x, Type: Ordinal, Sequence: {1, 2, 4, 8, 16, 32}, Default: 1
        y, Type: UniformFloat, Range: [0.0, 1.0], Default: 0.5

.. GENERATED FROM PYTHON SOURCE LINES 31-32

Then, we define a centralized Bayesian optimization (CBO) search (i.e., a
master-worker architecture), which uses a Random-Forest regressor as its
default surrogate model. We compare the ``ignore`` strategy, which filters out
failed configurations, the ``mean`` strategy, which replaces a failure with
the running mean of the collected objectives, and the ``min`` strategy, which
replaces a failure with the running minimum of the collected objectives.

.. GENERATED FROM PYTHON SOURCE LINES 32-53

.. code-block:: default

    from deephyper.search.hps import CBO
    from deephyper.evaluator import Evaluator
    from deephyper.evaluator.callback import TqdmCallback

    results = {}
    max_evals = 30
    for failure_strategy in ["ignore", "mean", "min"]:
        print(f"Executing failure strategy: {failure_strategy}")
        evaluator = Evaluator.create(
            run, method="serial", method_kwargs={"callbacks": [TqdmCallback()]}
        )
        search = CBO(
            problem,
            evaluator,
            filter_failures=failure_strategy,
            log_dir=f"search_{failure_strategy}",
            random_state=42,
        )
        results[failure_strategy] = search.search(max_evals)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Executing failure strategy: ignore
    Executing failure strategy: mean
    Executing failure strategy: min
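Finally, the three strategies can be compared by plotting the collected
results. The following is a minimal sketch, not the generated example's own
plotting code: it assumes only that each DataFrame returned by
``search.search`` contains an ``objective`` column in which failed evaluations
keep their ``"F"``-prefixed string value (as returned by the ``run``-function
above). Failures are drawn at ``y = 0`` for visibility.

.. code-block:: default

    import matplotlib.pyplot as plt
    import numpy as np

    # Assumption: each DataFrame in `results` stores failures as strings
    # prefixed with "F" and successes as numbers, in an "objective" column.
    fig, axes = plt.subplots(1, 3, figsize=(12, 4), sharey=True)

    for ax, (strategy, df) in zip(axes, results.items()):
        objective = df["objective"].astype(str)
        failed = objective.str.startswith("F").to_numpy()

        x = np.arange(len(df))
        y_success = objective[~failed].astype(float)

        # Successful evaluations at their objective value, failures at y=0.
        ax.scatter(x[~failed], y_success, s=10, label="success")
        ax.scatter(x[failed], np.zeros(failed.sum()), marker="v",
                   color="red", label="failure")
        ax.set_title(strategy)
        ax.set_xlabel("Iteration")

    axes[0].set_ylabel("Objective")
    axes[0].legend()
    plt.show()

Plotted this way, it is easy to see how many evaluations each strategy wastes
on the failing region ``y > 0.5`` and how quickly the objective improves.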
.. only:: html

    .. container:: sphx-glr-footer
        :class: sphx-glr-footer-example

        .. container:: sphx-glr-download sphx-glr-download-python

            :download:`Download Python source code: plot_notify_failures_hyperparameter_search.py <plot_notify_failures_hyperparameter_search.py>`

        .. container:: sphx-glr-download sphx-glr-download-jupyter

            :download:`Download Jupyter notebook: plot_notify_failures_hyperparameter_search.ipynb <plot_notify_failures_hyperparameter_search.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_