:py:mod:`cotengra.hyperoptimizers.hyper_baytune`
================================================

.. py:module:: cotengra.hyperoptimizers.hyper_baytune

.. autoapi-nested-parse::

   Hyper optimization using baytune.


Module Contents
---------------

Functions
~~~~~~~~~

.. autoapisummary::

   cotengra.hyperoptimizers.hyper_baytune.convert_param_to_baytune
   cotengra.hyperoptimizers.hyper_baytune.baytune_init_optimizers
   cotengra.hyperoptimizers.hyper_baytune.baytune_get_setting
   cotengra.hyperoptimizers.hyper_baytune.baytune_report_result


Attributes
~~~~~~~~~~

.. autoapisummary::

   cotengra.hyperoptimizers.hyper_baytune.BTB_TYPE_TO_HYPERPARAM


.. py:data:: BTB_TYPE_TO_HYPERPARAM

.. py:function:: convert_param_to_baytune(param)

   Convert a search subspace to ``baytune`` form.


.. py:function:: baytune_init_optimizers(self, methods, space, sampler='GP', method_sampler='UCB1', sampler_opts=None)

   Set up the baytune optimizer(s).

   :param space: The search space.
   :type space: dict[str, dict[str, dict]]
   :param sampler: Which ``btb`` parameter fitter to use - default ``'GP'``
       means gaussian process. Other options include ``'Uniform'`` and
       ``'GPEi'``. See
       https://hdi-project.github.io/BTB/api/btb.tuning.tuners.html.
   :type sampler: str, optional
   :param method_sampler: Which ``btb`` selector to use - default ``'UCB1'``.
       See https://hdi-project.github.io/BTB/api/btb.selection.html.
   :type method_sampler: str, optional
   :param sampler_opts: Options to supply to ``btb``.
   :type sampler_opts: dict, optional


.. py:function:: baytune_get_setting(self)

   Get a setting to trial from one of the baytune optimizers.


.. py:function:: baytune_report_result(self, setting, trial, score)

   Report the result of a trial to the baytune optimizers.
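The kind of conversion performed by ``convert_param_to_baytune`` can be
illustrated with a small self-contained sketch. This is an illustrative
approximation only, not the actual implementation: the real
``BTB_TYPE_TO_HYPERPARAM`` maps to ``btb`` hyperparameter classes and the
real function returns ``btb`` objects, whereas the sketch below uses plain
strings so it runs without ``btb`` installed. The parameter-dict keys
(``'type'``, ``'min'``, ``'max'``, ``'options'``) follow cotengra's hyper
search-space convention.

.. code-block:: python

    # Illustrative sketch: strings stand in for the btb hyperparameter
    # classes (e.g. btb.tuning.hyperparams.IntHyperParam) that the real
    # BTB_TYPE_TO_HYPERPARAM mapping would supply.
    BTB_TYPE_TO_HYPERPARAM_SKETCH = {
        "BOOL": "BooleanHyperParam",
        "INT": "IntHyperParam",
        "FLOAT": "FloatHyperParam",
        "STRING": "CategoricalHyperParam",
    }


    def convert_param_to_baytune_sketch(param):
        """Map one search-subspace entry to a (class-name, kwargs) pair."""
        typ = param["type"]
        hp = BTB_TYPE_TO_HYPERPARAM_SKETCH[typ]
        if typ == "STRING":
            # categorical parameters carry an explicit list of choices
            kwargs = {"choices": param["options"]}
        elif typ == "BOOL":
            # boolean parameters need no extra range information
            kwargs = {}
        else:
            # numeric parameters (INT / FLOAT) carry a range
            kwargs = {"min": param["min"], "max": param["max"]}
        return hp, kwargs


    print(convert_param_to_baytune_sketch({"type": "INT", "min": 2, "max": 64}))
    # ('IntHyperParam', {'min': 2, 'max': 64})

In the real module each method's subspace is converted this way so that
``baytune_init_optimizers`` can build one ``btb`` tuner per method, with the
chosen selector (default ``'UCB1'``) deciding which method to trial next.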