DeepHyper: Scalable Neural Architecture and Hyperparameter Search for Deep Neural Networks

DeepHyper is a distributed machine learning (AutoML) package for automating the development of deep neural networks for scientific applications. It can run on a single laptop as well as on thousands of nodes.
It comprises several tools, such as:
- Optimization of hyper-parameters for a given black-box function.
- Neural architecture search to discover high-performing deep neural networks with variable operations and connections.
- Automated machine learning, to easily experiment with many learning algorithms from Scikit-Learn.
DeepHyper provides an infrastructure that targets experimental research in NAS and HPS methods, scalability, and portability across diverse supercomputers. It comprises three main modules:
- deephyper.problem: Tools for defining neural architecture and hyper-parameter search problems.
- deephyper.evaluator: A simple interface to dispatch model evaluation tasks. Implementations range from process for laptop experiments to ray for large-scale runs on HPC systems (see the sketch after this list).
- deephyper.search: Search methods for NAS and HPS. By extending the generic Search class, one can easily add new NAS or HPS methods to DeepHyper.
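For instance, the same run function can be dispatched through different backends simply by changing the method argument of Evaluator.create. A minimal sketch, assuming the run function from the Quick Start below; the "ray" method corresponds to the RayEvaluator listed in the API reference, and its method_kwargs (e.g., cluster address) are backend-specific:
from deephyper.evaluator import Evaluator

# Laptop-scale: run evaluations in 2 local worker processes.
evaluator = Evaluator.create(run, method="process", method_kwargs={"num_workers": 2})

# HPC-scale: same interface, dispatched through Ray workers
# (assumes a Ray cluster is reachable from this process).
# evaluator = Evaluator.create(run, method="ray")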
DeepHyper installation requires Python >= 3.7.
What is DeepHyper
DeepHyper is a powerful Python package for automating machine learning tasks, particularly focused on optimizing hyperparameters, searching for optimal neural architectures, and quantifying uncertainty through the use of deep ensembles. With DeepHyper, users can easily perform these tasks on a single machine or distributed across multiple machines, making it suitable for environments ranging from laptops to supercomputers. Whether you are a beginner looking to optimize your machine learning models or an experienced data scientist looking to streamline your workflow, DeepHyper has something to offer. So why wait? Start using DeepHyper today and take your machine learning skills to the next level!
Install instructions
Install with pip
# For the most basic set of features (hyperparameter search)
pip install deephyper
# For the default set of features including:
# - hyperparameter search with transfer-learning
# - neural architecture search
# - deep ensembles
# - Ray-based distributed computing
# - Learning-curve extrapolation for multi-fidelity hyperparameter search
pip install "deephyper[default]"
More details about the installation process can be found at DeepHyper Installations.
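After installing, a quick sanity check from the command line confirms the package is importable (assuming the release exposes a __version__ attribute, as DeepHyper releases do):
python -c "import deephyper; print(deephyper.__version__)"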
Quick Start
The black-box function named run is defined taking a job (a RunningJob) as input; its job.parameters dictionary contains the different variables to optimize. Then the run function is bound to an Evaluator in charge of distributing the computation of multiple evaluations. Finally, a Bayesian search named CBO is created and executed to find the values of job.parameters which maximize the return value of run(job).
def run(job):
# The suggested parameters are accessible in job.parameters (dict)
x = job.parameters["x"]
b = job.parameters["b"]
if job.parameters["function"] == "linear":
y = x + b
elif job.parameters["function"] == "cubic":
y = x**3 + b
# Maximization!
return y
# This if statement is necessary: otherwise the script would enter an infinite loop
# when the 'run' function is loaded from a new process
if __name__ == "__main__":
from deephyper.problem import HpProblem
from deephyper.search.hps import CBO
from deephyper.evaluator import Evaluator
# define the variable you want to optimize
problem = HpProblem()
problem.add_hyperparameter((-10.0, 10.0), "x") # real parameter
problem.add_hyperparameter((0, 10), "b") # discrete parameter
problem.add_hyperparameter(["linear", "cubic"], "function") # categorical parameter
# define the evaluator to distribute the computation
evaluator = Evaluator.create(
run,
method="process",
method_kwargs={
"num_workers": 2,
},
)
# define your search and execute it
search = CBO(problem, evaluator, random_state=42)
results = search.search(max_evals=100)
print(results)
This outputs the following results, where the best parameters are function == "cubic", x == 9.99, and b == 10.
p:b p:function p:x objective job_id m:timestamp_submit m:timestamp_gather
0 7 linear 8.831019 15.831019 1 0.064874 1.430992
1 4 linear 9.788889 13.788889 0 0.064862 1.453012
2 0 cubic 2.144989 9.869049 2 1.452692 1.468436
3 9 linear -9.236860 -0.236860 3 1.468123 1.483654
4 2 cubic -9.783865 -934.550818 4 1.483340 1.588162
.. ... ... ... ... ... ... ...
95 6 cubic 9.862098 965.197192 95 13.538506 13.671872
96 10 cubic 9.997512 1009.253866 96 13.671596 13.884530
97 6 cubic 9.965615 995.719961 97 13.884188 14.020144
98 5 cubic 9.998324 1004.497422 98 14.019737 14.154467
99 9 cubic 9.995800 1007.740379 99 14.154169 14.289366
The code defines a function run that takes a RunningJob job as input and returns the maximized objective y. The if block at the end of the code defines a black-box optimization process using the CBO (Centralized Bayesian Optimization) algorithm from the deephyper library.
The optimization process is defined as follows:
- A hyperparameter optimization problem is created using the HpProblem class from deephyper. In this case, the problem has three variables. The x hyperparameter is a real variable in a range from -10.0 to 10.0. The b hyperparameter is a discrete variable in a range from 0 to 10. The function hyperparameter is a categorical variable with two possible values.
- An evaluator is created using the Evaluator.create method. The evaluator is used to evaluate the run function with different configurations of hyperparameters suggested for the optimization problem. It uses the process method to distribute the evaluations across multiple worker processes, in this case 2 worker processes.
- A search object is created using the CBO class with the problem and evaluator defined earlier. The CBO algorithm is a derivative-free optimization method that uses a Bayesian optimization approach to explore the hyperparameter space.
- The optimization process is executed by calling the search.search method, which evaluates the run function with different hyperparameter configurations until a maximum number of evaluations (100 in this case) is reached.
- The results of the optimization process, including the optimal configuration of the hyperparameters and the corresponding objective value, are printed to the console (a sketch of post-processing these results follows this list).
Table of Contents
Get Started
API Reference
- Core
- deephyper.core.cli
- deephyper.core.exceptions
- deephyper.core.exceptions.DeephyperError
- deephyper.core.exceptions.DeephyperRuntimeError
- deephyper.core.exceptions.MissingRequirementError
- deephyper.core.exceptions.RunFunctionError
- deephyper.core.exceptions.SearchTerminationError
- deephyper.core.exceptions.loading
- deephyper.core.exceptions.nas
- deephyper.core.exceptions.nas.DeephyperError
- deephyper.core.exceptions.nas.NASError
- deephyper.core.exceptions.nas.space
- deephyper.core.exceptions.nas.space.InputShapeOfWrongType
- deephyper.core.exceptions.nas.space.NASError
- deephyper.core.exceptions.nas.space.NodeAlreadyAdded
- deephyper.core.exceptions.nas.space.StructureHasACycle
- deephyper.core.exceptions.nas.space.WrongOutputShape
- deephyper.core.exceptions.nas.space.WrongSequenceToSetOperations
- deephyper.core.exceptions.problem
- deephyper.core.exceptions.problem.DeephyperError
- deephyper.core.exceptions.problem.NaProblemError
- deephyper.core.exceptions.problem.ProblemLoadDataIsNotCallable
- deephyper.core.exceptions.problem.ProblemPreprocessingIsNotCallable
- deephyper.core.exceptions.problem.SearchSpaceBuilderIsNotCallable
- deephyper.core.exceptions.problem.SearchSpaceBuilderMissingDefaultParameter
- deephyper.core.exceptions.problem.SearchSpaceBuilderMissingParameter
- deephyper.core.exceptions.problem.SpaceDimNameOfWrongType
- deephyper.core.exceptions.problem.WrongProblemObjective
- deephyper.core.parser
- deephyper.core.utils
- Ensemble
- Evaluator
- deephyper.evaluator.distributed
- deephyper.evaluator.parse_subprocess_result
- deephyper.evaluator.profile
- deephyper.evaluator.queued
- deephyper.evaluator.to_json
- deephyper.evaluator.Evaluator
- deephyper.evaluator.Job
- deephyper.evaluator.MPICommEvaluator
- deephyper.evaluator.ProcessPoolEvaluator
- deephyper.evaluator.RayEvaluator
- deephyper.evaluator.RunningJob
- deephyper.evaluator.SerialEvaluator
- deephyper.evaluator.ThreadPoolEvaluator
- deephyper.evaluator.callback
- deephyper.evaluator.storage
- Keras
- deephyper.keras.callbacks
- deephyper.keras.callbacks.import_callback
- deephyper.keras.callbacks.CSVExtendedLogger
- deephyper.keras.callbacks.LearningRateScheduleCallback
- deephyper.keras.callbacks.LearningRateWarmupCallback
- deephyper.keras.callbacks.StopIfUnfeasible
- deephyper.keras.callbacks.TimeStopping
- deephyper.keras.callbacks.csv_extended_logger
- deephyper.keras.callbacks.learning_rate_warmup
- deephyper.keras.callbacks.stop_if_unfeasible
- deephyper.keras.callbacks.stop_on_timeout
- deephyper.keras.callbacks.time_stopping
- deephyper.keras.callbacks.utils
- deephyper.keras.layers
- deephyper.keras.layers.AttentionCOS
- deephyper.keras.layers.AttentionConst
- deephyper.keras.layers.AttentionGAT
- deephyper.keras.layers.AttentionGCN
- deephyper.keras.layers.AttentionGenLinear
- deephyper.keras.layers.AttentionLinear
- deephyper.keras.layers.AttentionSymGAT
- deephyper.keras.layers.GlobalAttentionPool
- deephyper.keras.layers.GlobalAttentionSumPool
- deephyper.keras.layers.GlobalAvgPool
- deephyper.keras.layers.GlobalMaxPool
- deephyper.keras.layers.GlobalSumPool
- deephyper.keras.layers.MessagePasserNNM
- deephyper.keras.layers.MessagePassing
- deephyper.keras.layers.Padding
- deephyper.keras.layers.SparseMPNN
- deephyper.keras.layers.UpdateFuncGRU
- deephyper.keras.layers.UpdateFuncMLP
- deephyper.keras.utils
- NAS
- deephyper.nas.KSearchSpace
- deephyper.nas.NxSearchSpace
- deephyper.nas.losses
- deephyper.nas.lr_scheduler
- deephyper.nas.metrics
- deephyper.nas.metrics.acc
- deephyper.nas.metrics.load_attr
- deephyper.nas.metrics.mae
- deephyper.nas.metrics.mse
- deephyper.nas.metrics.r2
- deephyper.nas.metrics.rmse
- deephyper.nas.metrics.selectMetric
- deephyper.nas.metrics.sparse_perplexity
- deephyper.nas.metrics.tfp_mae
- deephyper.nas.metrics.tfp_mse
- deephyper.nas.metrics.tfp_r2
- deephyper.nas.metrics.tfp_rmse
- deephyper.nas.metrics.to_tfp
- deephyper.nas.metrics.OrderedDict
- deephyper.nas.node
- deephyper.nas.operation
- deephyper.nas.operation.operation
- deephyper.nas.operation.AddByPadding
- deephyper.nas.operation.AddByProjecting
- deephyper.nas.operation.Concatenate
- deephyper.nas.operation.Connect
- deephyper.nas.operation.Identity
- deephyper.nas.operation.Operation
- deephyper.nas.operation.Tensor
- deephyper.nas.operation.Zero
- deephyper.nas.preprocessing
- deephyper.nas.run
- deephyper.nas.spacelib
- deephyper.nas.trainer
- Problem
- deephyper.problem.Categorical
- deephyper.problem.Float
- deephyper.problem.Integer
- deephyper.problem.AndConjunction
- deephyper.problem.Beta
- deephyper.problem.BetaFloatHyperparameter
- deephyper.problem.BetaIntegerHyperparameter
- deephyper.problem.CategoricalHyperparameter
- deephyper.problem.Configuration
- deephyper.problem.ConfigurationSpace
- deephyper.problem.Constant
- deephyper.problem.Distribution
- deephyper.problem.EqualsCondition
- deephyper.problem.ForbiddenAndConjunction
- deephyper.problem.ForbiddenEqualsClause
- deephyper.problem.ForbiddenEqualsRelation
- deephyper.problem.ForbiddenGreaterThanRelation
- deephyper.problem.ForbiddenInClause
- deephyper.problem.ForbiddenLessThanRelation
- deephyper.problem.GreaterThanCondition
- deephyper.problem.HpProblem
- deephyper.problem.InCondition
- deephyper.problem.LessThanCondition
- deephyper.problem.NaProblem
- deephyper.problem.Normal
- deephyper.problem.NormalFloatHyperparameter
- deephyper.problem.NormalIntegerHyperparameter
- deephyper.problem.NotEqualsCondition
- deephyper.problem.OrConjunction
- deephyper.problem.OrdinalHyperparameter
- deephyper.problem.UnParametrizedHyperparameter
- deephyper.problem.Uniform
- deephyper.problem.UniformFloatHyperparameter
- deephyper.problem.UniformIntegerHyperparameter
- Search
- Sklearn
- Stopper