deephyper.evaluator.ProcessPoolEvaluator
class deephyper.evaluator.ProcessPoolEvaluator(run_function: Callable, num_workers: int = 1, callbacks: Optional[list] = None, run_function_kwargs: Optional[dict] = None, storage: Optional[deephyper.evaluator.storage._storage.Storage] = None, search_id: Optional[Hashable] = None)

Bases: deephyper.evaluator._evaluator.Evaluator
This evaluator uses the ProcessPoolExecutor as backend.

Parameters
- run_function (callable) – function to be executed by the Evaluator.
- num_workers (int, optional) – number of parallel processes used to compute the run_function. Defaults to 1.
- callbacks (list, optional) – a list of callbacks to trigger custom actions at the creation or completion of jobs. Defaults to None.
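To illustrate what this backend does, the sketch below uses the standard-library ProcessPoolExecutor directly: each configuration dict is evaluated by a run function in one of `num_workers` worker processes. This is a minimal stand-in for the mechanism ProcessPoolEvaluator wraps, not the DeepHyper API itself; the run function and configurations are made up.

```python
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def run(config):
    # A run_function receives one configuration dict and returns an objective.
    return -(config["x"] - 3) ** 2

configs = [{"x": x} for x in range(5)]

# ProcessPoolEvaluator dispatches each configuration to one of
# `num_workers` worker processes, much like this executor does.
pool = ProcessPoolExecutor(max_workers=2,
                           mp_context=multiprocessing.get_context("fork"))
objectives = list(pool.map(run, configs))
pool.shutdown()
```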
Methods

- convert_for_csv – Convert an input value to an accepted format to be saved as a value of a CSV file (e.g., a list becomes its str representation).
- create – Create an evaluator with a specific backend and configuration.
- decode – Decode the key following a JSON format to return a dict.
- dump_evals – Dump evaluations to a CSV file.
- execute – Execute the received job.
- gather – Collect the completed tasks from the evaluator in batches of one or more.
- set_timeout – Set a timeout for the Evaluator.
- submit – Send configurations to be evaluated by available workers.
- to_json – Return a JSON version of the evaluator.
Attributes
FAIL_RETURN_VALUE
NEST_ASYNCIO_PATCHED
PYTHON_EXE
convert_for_csv(val)

Convert an input value to an accepted format to be saved as a value of a CSV file (e.g., a list becomes its str representation).

Parameters
- val (Any) – the input value to convert.

Returns
The converted value.

Return type
Any
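The conversion rule above can be sketched with a hypothetical re-implementation (not DeepHyper's code): container values such as lists are stored via their str representation, while scalars pass through unchanged.

```python
def convert_for_csv(val):
    # Hypothetical sketch of the documented behavior: containers become
    # their str representation, scalars are kept as-is.
    if isinstance(val, (list, tuple, dict)):
        return str(val)
    return val
```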
static create(run_function, method='serial', method_kwargs={})

Create an evaluator with a specific backend and configuration.

Parameters
- run_function (function) – the function to execute in parallel.
- method (str, optional) – the backend to use, one of ["serial", "thread", "process", "ray", "mpicomm"]. Defaults to "serial".
- method_kwargs (dict, optional) – configuration dictionary of the corresponding backend. Keys correspond to the keyword arguments of the corresponding implementation. Defaults to {}.

Raises
ValueError – if the method is not acceptable.

Returns
The Evaluator with the corresponding backend and configuration.

Return type
Evaluator
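The dispatch contract of create can be sketched as follows. This is a hypothetical illustration of the documented behavior, not DeepHyper's implementation: an unknown method name raises ValueError, otherwise the matching backend is selected.

```python
def create_evaluator(run_function, method="serial", method_kwargs=None):
    # Hypothetical sketch: resolve a backend name, as create() is documented
    # to do. In DeepHyper, "process" would select ProcessPoolEvaluator.
    method_kwargs = dict(method_kwargs or {})
    backends = ["serial", "thread", "process", "ray", "mpicomm"]
    if method not in backends:
        raise ValueError(f"method {method!r} is not acceptable, choose in {backends}")
    # Here we just return the resolved choice for illustration.
    return method, method_kwargs
```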
decode(key)

Decode the key following a JSON format to return a dict.
dump_evals(saved_keys=None, log_dir: str = '.', filename='results.csv')

Dump evaluations to a CSV file.

Parameters
- saved_keys (list|callable) – if None, the whole job.config will be added as a row of the CSV file. If a list, only the filtered keys will be added as a row of the CSV file. If a callable, its output dictionary will be added as a row of the CSV file.
- log_dir (str) – directory where to dump the CSV file.
- filename (str) – name of the file where to write the data.
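The three saved_keys modes can be illustrated with a small stand-alone helper (hypothetical, writing to a string rather than to log_dir/filename, and taking plain config dicts rather than Job objects):

```python
import csv
import io

def rows_to_csv(job_configs, saved_keys=None):
    # Hypothetical helper mimicking the saved_keys modes of dump_evals.
    rows = []
    for config in job_configs:
        if saved_keys is None:
            rows.append(dict(config))                        # whole job.config
        elif callable(saved_keys):
            rows.append(saved_keys(config))                  # callable's output dict
        else:
            rows.append({k: config[k] for k in saved_keys})  # filtered keys
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```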
async execute(job: deephyper.evaluator._job.Job) → deephyper.evaluator._job.Job

Execute the received job. To be implemented with a specific backend.

Parameters
- job (Job) – the Job to be executed.
gather(type, size=1)

Collect the completed tasks from the evaluator in batches of one or more.

Parameters
- type (str) – Options:
  - "ALL": block until all jobs submitted to the evaluator are completed.
  - "BATCH": specify a minimum batch size of jobs to collect from the evaluator. The method will block until at least size evaluations are completed.
- size (int, optional) – the minimum batch size that we want to collect from the evaluator. Defaults to 1.

Raises
Exception – raised when a gather operation other than "ALL" or "BATCH" is provided.

Returns
A batch of completed jobs that is at minimum the given size.

Return type
List[Job]
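The submit/gather contract can be mimicked with the standard-library executor (a sketch, not the DeepHyper API): submit sends configurations to available workers; gather("BATCH", size) blocks until at least `size` jobs have completed, while gather("ALL") blocks until every submitted job has completed.

```python
import multiprocessing
from concurrent.futures import FIRST_COMPLETED, ProcessPoolExecutor, wait

def run(config):
    return config["x"] * 2

pool = ProcessPoolExecutor(max_workers=2,
                           mp_context=multiprocessing.get_context("fork"))

# submit(configs): send configurations to available workers.
pending = {pool.submit(run, {"x": x}) for x in range(4)}

# gather("BATCH", size=2): block until at least 2 evaluations completed.
done = set()
while len(done) < 2:
    finished, pending = wait(pending, return_when=FIRST_COMPLETED)
    done |= finished

# gather("ALL"): block until all remaining jobs are completed.
rest, pending = wait(pending)
results = sorted(f.result() for f in done | rest)
pool.shutdown()
```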
set_timeout(timeout)

Set a timeout for the Evaluator. It will create tasks with a "time budget" and will kill the task if this budget is exhausted.
submit(configs: List[Dict])

Send configurations to be evaluated by available workers.

Parameters
- configs (List[Dict]) – a list of dicts which will be passed to the run function to be executed.
to_json()

Returns a JSON version of the evaluator.