:py:mod:`cotengra.presets`
==========================

.. py:module:: cotengra.presets

.. autoapi-nested-parse::

   Preset configured optimizers.


Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   cotengra.presets.AutoOptimizer
   cotengra.presets.AutoHQOptimizer


Functions
~~~~~~~~~

.. autoapisummary::

   cotengra.presets.estimate_optimal_hardness


Attributes
~~~~~~~~~~

.. autoapisummary::

   cotengra.presets.auto_optimize
   cotengra.presets.auto_hq_optimize
   cotengra.presets.greedy_optimize
   cotengra.presets.optimal_optimize
   cotengra.presets.optimal_outer_optimize


.. py:function:: estimate_optimal_hardness(inputs)

   Provides a very rough estimate of how long it would take to find the
   optimal contraction order for a given set of inputs. The runtime is
   *very* approximately exponential in this number:

   .. math::

      T \propto \exp{\left( n^2 k^{0.5} \right)}

   where :math:`n` is the number of tensors and :math:`k` is the average
   degree of the hypergraph.


.. py:class:: AutoOptimizer(optimal_cutoff=250, minimize='combo', cache=True, **hyperoptimizer_kwargs)

   Bases: :py:obj:`cotengra.oe.PathOptimizer`

   An optimizer that automatically chooses between optimal and
   hyper-optimization, designed for everyday use.

   .. py:method:: _get_optimizer_hyper_threadsafe()

   .. py:method:: search(inputs, output, size_dict, **kwargs)

   .. py:method:: __call__(inputs, output, size_dict, **kwargs)


.. py:class:: AutoHQOptimizer(**kwargs)

   Bases: :py:obj:`AutoOptimizer`

   An optimizer that automatically chooses between optimal and
   hyper-optimization, designed for everyday use on harder contractions,
   or those that will be repeated many times and thus warrant a more
   extensive search.


.. py:data:: auto_optimize

.. py:data:: auto_hq_optimize

.. py:data:: greedy_optimize

.. py:data:: optimal_optimize

.. py:data:: optimal_outer_optimize
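To get a feel for the scaling that ``estimate_optimal_hardness`` describes, the following is a minimal standalone sketch of the score :math:`n^2 k^{0.5}` from the formula above. The helper name ``estimated_hardness`` is purely illustrative and is not the library's actual implementation:

```python
import math

def estimated_hardness(n, k):
    """Rough hardness score: runtime for exhaustive optimal search is
    very approximately exp(n**2 * k**0.5), where n is the number of
    tensors and k is the average degree of the hypergraph."""
    return n**2 * math.sqrt(k)

# Growing from 10 to 20 tensors quadruples the *exponent*, so the time
# to find the provably optimal contraction order blows up extremely fast.
print(estimated_hardness(10, 4.0))  # 200.0
print(estimated_hardness(20, 4.0))  # 800.0
```

This kind of scaling is why ``AutoOptimizer`` only dispatches to an exhaustive optimal search below a cutoff (``optimal_cutoff``), and falls back to hyper-optimization otherwise.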