:py:mod:`cotengra.interface`
============================

.. py:module:: cotengra.interface

.. autoapi-nested-parse::

   High-level interface functions to cotengra.


Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   cotengra.interface.Variadic
   cotengra.interface.Via
   cotengra.interface.WithBackend


Functions
~~~~~~~~~

.. autoapisummary::

   cotengra.interface.register_preset
   cotengra.interface.preset_to_optimizer
   cotengra.interface.can_hash_optimize
   cotengra.interface.identity
   cotengra.interface.hash_prepare_optimize
   cotengra.interface.hash_contraction
   cotengra.interface.normalize_input
   cotengra.interface._find_path_explicit_path
   cotengra.interface._find_path_optimizer
   cotengra.interface._find_path_preset
   cotengra.interface._find_path_tree
   cotengra.interface.find_path
   cotengra.interface.array_contract_path
   cotengra.interface._find_tree_explicit
   cotengra.interface._find_tree_optimizer_search
   cotengra.interface._find_tree_optimizer_basic
   cotengra.interface._find_tree_preset
   cotengra.interface._find_tree_tree
   cotengra.interface.find_tree
   cotengra.interface.array_contract_tree
   cotengra.interface._array_contract_expression_with_constants
   cotengra.interface._build_expression
   cotengra.interface.array_contract_expression
   cotengra.interface.array_contract
   cotengra.interface.einsum_tree
   cotengra.interface.einsum_expression
   cotengra.interface.einsum


Attributes
~~~~~~~~~~

.. autoapisummary::

   cotengra.interface._PRESETS
   cotengra.interface._COMPRESSED_PRESETS
   cotengra.interface._HASH_OPTIMIZE_PREPARERS
   cotengra.interface._find_path_handlers
   cotengra.interface._PATH_CACHE
   cotengra.interface._find_tree_handlers
   cotengra.interface._CONTRACT_EXPR_CACHE


.. py:data:: _PRESETS

.. py:data:: _COMPRESSED_PRESETS

.. py:function:: register_preset(preset, optimizer, register_opt_einsum='auto', compressed=False)

   Register a preset optimizer.

.. py:function:: preset_to_optimizer(preset)

.. py:function:: can_hash_optimize(cls)

   Check if the type of `optimize` supplied can be hashed.

.. py:function:: identity(x)

.. py:data:: _HASH_OPTIMIZE_PREPARERS

.. py:function:: hash_prepare_optimize(optimize)

   Transform an `optimize` object into a hashable form.

.. py:function:: hash_contraction(inputs, output, size_dict, optimize, **kwargs)

   Compute a hash key for the specified contraction.

.. py:function:: normalize_input(inputs, output=None, size_dict=None, shapes=None, canonicalize=True)

   Parse a contraction definition, optionally canonicalizing the indices
   (mapping them into symbols beginning with ``'a', 'b', 'c', ...``),
   computing the output if not specified, and computing the ``size_dict``
   from ``shapes`` if not supplied.

.. py:data:: _find_path_handlers

.. py:function:: _find_path_explicit_path(inputs, output, size_dict, optimize)

.. py:function:: _find_path_optimizer(inputs, output, size_dict, optimize, **kwargs)

.. py:function:: _find_path_preset(inputs, output, size_dict, optimize, **kwargs)

.. py:function:: _find_path_tree(inputs, output, size_dict, optimize, **kwargs)

.. py:function:: find_path(inputs, output, size_dict, optimize='auto', **kwargs)

   Directly find a contraction path for a given set of inputs and output.

   :param inputs: The input terms.
   :type inputs: Sequence[Sequence[str]]
   :param output: The output term.
   :type output: Sequence[str]
   :param size_dict: The size of each index.
   :type size_dict: dict[str, int]
   :param optimize: The optimization strategy to use. This can be:

      * A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      * A ``PathOptimizer`` instance.
      * An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      * An explicit ``ContractionTree`` instance.

   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :returns: **path** -- The contraction path.
   :rtype: tuple[tuple[int]]
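As a minimal sketch of calling ``find_path`` directly (the exact path
returned will of course depend on the strategy chosen):

.. code-block:: python

    from cotengra.interface import find_path

    inputs = [("a", "b"), ("b", "c"), ("c", "d")]
    output = ("a", "d")
    size_dict = {"a": 2, "b": 3, "c": 4, "d": 5}

    # a sequence of pairwise contractions, e.g. ((0, 1), (0, 1))
    path = find_path(inputs, output, size_dict, optimize="greedy")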
.. py:data:: _PATH_CACHE

.. py:function:: array_contract_path(inputs, output=None, size_dict=None, shapes=None, optimize='auto', canonicalize=True, cache=True)

   Find only the contraction path for the given contraction, with fast
   dispatch of ``optimize``, which can be a preset, path, tree, cotengra
   optimizer or opt_einsum optimizer. The raw path is a more compact
   representation of the core tree structure but contains less information
   on its own, for example sliced indices are not included.

   :param inputs: The input terms.
   :type inputs: Sequence[Sequence[Hashable]]
   :param output: The output term.
   :type output: Sequence[Hashable], optional
   :param size_dict: The size of each index; if given, ``shapes`` is ignored.
   :type size_dict: dict[Hashable, int], optional
   :param shapes: The shape of each input array. Needed if ``size_dict`` is not supplied.
   :type shapes: Sequence[tuple[int]], optional
   :param optimize: The optimization strategy to use. This can be:

      - A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      - A ``PathOptimizer`` instance.
      - An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      - An explicit ``ContractionTree`` instance.

   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :param canonicalize: If ``True``, canonicalize the inputs and output so that the
      indices are relabelled ``'a', 'b', 'c', ...``, etc. in the order they
      appear.
   :type canonicalize: bool, optional
   :param cache: If ``True``, cache the path for the contraction, so that if the
      same pathfinding is performed multiple times the overhead is negated.
      Only for hashable ``optimize`` objects.
   :type cache: bool, optional
   :returns: **path** -- The contraction path, interpreted as follows: the input
      tensors are assumed to be stored in a list, i.e. indexed by
      ``range(N)``. Each contraction in the path is a set of indices, the
      tensors at these locations should be *popped* from the list and then
      the result of the contraction *appended*.
   :rtype: tuple[tuple[int]]

.. py:function:: _find_tree_explicit(inputs, output, size_dict, optimize)

.. py:function:: _find_tree_optimizer_search(inputs, output, size_dict, optimize, **kwargs)

.. py:function:: _find_tree_optimizer_basic(inputs, output, size_dict, optimize, **kwargs)

.. py:function:: _find_tree_preset(inputs, output, size_dict, optimize, **kwargs)

.. py:function:: _find_tree_tree(inputs, output, size_dict, optimize, **kwargs)

.. py:data:: _find_tree_handlers

.. py:function:: find_tree(inputs, output, size_dict, optimize='auto', **kwargs)

   Find a contraction tree for the specific contraction, with fast dispatch
   of ``optimize``, which can be a preset, path, tree, cotengra optimizer or
   opt_einsum optimizer.

   :param inputs: The input terms.
   :type inputs: Sequence[Sequence[str]]
   :param output: The output term.
   :type output: Sequence[str]
   :param size_dict: The size of each index.
   :type size_dict: dict[str, int]
   :param optimize: The optimization strategy to use. This can be:

      - A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      - A ``PathOptimizer`` instance.
      - An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      - An explicit ``ContractionTree`` instance.

   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :returns: **tree**
   :rtype: ContractionTree
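For illustration, a sketch of finding a tree and inspecting it before
contracting (``contraction_cost`` and ``contraction_width`` are standard
``ContractionTree`` queries):

.. code-block:: python

    from cotengra.interface import find_tree

    inputs = [("a", "b"), ("b", "c"), ("c", "d")]
    output = ("a", "d")
    size_dict = {"a": 2, "b": 3, "c": 4, "d": 5}

    tree = find_tree(inputs, output, size_dict, optimize="greedy")

    tree.contraction_cost()   # total scalar operations
    tree.contraction_width()  # log2 of the largest intermediate tensor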
.. py:function:: array_contract_tree(inputs, output=None, size_dict=None, shapes=None, optimize='auto', canonicalize=True, sort_contraction_indices=False)

   Get the ``ContractionTree`` for the tensor contraction specified by
   ``inputs``, ``output`` and ``size_dict``, with optimization strategy
   given by ``optimize``. The tree can be used to inspect and also perform
   the contraction.

   :param inputs: The input terms.
   :type inputs: Sequence[Sequence[Hashable]]
   :param output: The output term.
   :type output: Sequence[Hashable], optional
   :param size_dict: The size of each index; if given, ``shapes`` is ignored.
   :type size_dict: dict[Hashable, int], optional
   :param shapes: The shape of each input array. Needed if ``size_dict`` is not supplied.
   :type shapes: Sequence[tuple[int]], optional
   :param optimize: The optimization strategy to use. This can be:

      - A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      - A ``PathOptimizer`` instance.
      - An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      - An explicit ``ContractionTree`` instance.

   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :param canonicalize: If ``True``, canonicalize the inputs and output so that the
      indices are relabelled ``'a', 'b', 'c', ...``, etc. in the order they
      appear.
   :type canonicalize: bool, optional
   :param sort_contraction_indices: If ``True``, call ``tree.sort_contraction_indices()``.
   :type sort_contraction_indices: bool, optional
   :rtype: ContractionTree

   .. seealso:: :obj:`array_contract`, :obj:`array_contract_expression`, :obj:`einsum_tree`

.. py:class:: Variadic(fn, **kwargs)

   Wrapper to make a non-variadic function (i.e. with signature
   ``f(arrays)``) variadic (i.e. with signature ``f(*arrays)``).

   .. py:attribute:: __slots__
      :value: ('fn', 'kwargs')

   .. py:method:: __call__(*arrays, **kwargs)

.. py:class:: Via(fn, convert_in, convert_out)

   Wrapper that applies one function to the input arrays and another to the
   output array. For example, moving the tensors from CPU to GPU and back.

   .. py:attribute:: __slots__
      :value: ('fn', 'convert_in', 'convert_out')

   .. py:method:: __call__(*arrays, **kwargs)

.. py:class:: WithBackend(fn)

   Wrapper to make any autoray-written function take a ``backend`` kwarg, by
   simply using `autoray.backend_like`.

   .. py:attribute:: __slots__
      :value: ('fn',)

   .. py:method:: __call__(*args, backend=None, **kwargs)
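A small sketch of how these wrappers compose, assuming ``Via`` applies
``convert_in`` to each input array individually (as the CPU/GPU example
suggests):

.. code-block:: python

    import numpy as np
    from cotengra.interface import Variadic, Via

    # np.stack has signature f(arrays): wrap it to be callable as f(*arrays)
    stack = Variadic(np.stack)
    stack(np.ones(2), np.zeros(2))  # -> array of shape (2, 2)

    # convert each input on the way in and the result on the way out
    f = Via(stack, np.asarray, np.ascontiguousarray)
    f([1.0, 2.0], [3.0, 4.0])  # plain lists are converted to arrays first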
.. py:function:: _array_contract_expression_with_constants(inputs, output, size_dict, constants, optimize='auto', implementation=None, prefer_einsum=False, autojit=False, via=None, sort_contraction_indices=False, cache=True)

.. py:function:: _build_expression(inputs, output=None, size_dict=None, optimize='auto', implementation=None, prefer_einsum=False, autojit=False, via=None, sort_contraction_indices=False)

.. py:data:: _CONTRACT_EXPR_CACHE

.. py:function:: array_contract_expression(inputs, output=None, size_dict=None, shapes=None, optimize='auto', constants=None, canonicalize=True, cache=True, **kwargs)

   Get a callable 'expression' that will contract tensors with indices and
   shapes described by ``inputs`` and ``size_dict`` to ``output``. The
   ``optimize`` kwarg can be a path, optimizer or also a contraction tree.
   In the latter case sliced indices, for example, will be used if present.
   The same is true if ``optimize`` is an optimizer that can directly
   produce ``ContractionTree`` instances (i.e. has a ``.search()`` method).

   :param inputs: The input terms.
   :type inputs: Sequence[Sequence[Hashable]]
   :param output: The output term.
   :type output: Sequence[Hashable]
   :param size_dict: The size of each index.
   :type size_dict: dict[Hashable, int]
   :param optimize: The optimization strategy to use. This can be:

      - A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      - A ``PathOptimizer`` instance.
      - An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      - An explicit ``ContractionTree`` instance.

      If the optimizer provides sliced indices they will be used.
   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :param constants: A mapping of constant input positions to constant arrays. If
      given, the final expression will take only the remaining non-constant
      tensors as inputs. Note this is a different format to the
      ``constants`` kwarg of :func:`einsum_expression` since it also
      provides the constant arrays.
   :type constants: dict[int, array_like], optional
   :param implementation: What library to use to actually perform the contractions.
      Options are:

      - None: let cotengra choose.
      - "autoray": dispatch with autoray, using the ``tensordot`` and
        ``einsum`` implementation of the backend.
      - "cotengra": use the ``tensordot`` and ``einsum`` implementation of
        cotengra, which is based on batch matrix multiplication. This is
        faster for some backends like numpy, and also enables libraries
        which don't yet provide ``tensordot`` and ``einsum`` to be used.
      - "cuquantum": use the cuquantum library to perform the whole
        contraction (not just individual contractions).
      - tuple[callable, callable]: manually supply the ``tensordot`` and
        ``einsum`` implementations to use.

   :type implementation: str or tuple[callable, callable], optional
   :param autojit: If ``True``, use :func:`autoray.autojit` to compile the
      contraction function.
   :type autojit: bool, optional
   :param via: If given, the first function will be applied to the input arrays
      and the second to the output array. For example, moving the tensors
      from CPU to GPU and back.
   :type via: tuple[callable, callable], optional
   :param sort_contraction_indices: If ``True``, call
      ``tree.sort_contraction_indices()`` before constructing the
      contraction function.
   :type sort_contraction_indices: bool, optional
   :param cache: If ``True``, cache the contraction expression. This negates the
      overhead of pathfinding and building the expression when a contraction
      is performed multiple times. Only for hashable ``optimize`` objects.
   :type cache: bool, optional
   :returns: **expr** -- A callable, signature ``expr(*arrays)`` that will
      contract ``arrays`` with shapes ``shapes``.
   :rtype: callable

   .. seealso:: :obj:`einsum_expression`, :obj:`array_contract`, :obj:`array_contract_tree`
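For example, a minimal sketch of building an expression once and reusing it
(using only the documented arguments):

.. code-block:: python

    import numpy as np
    import cotengra as ctg

    # build the reusable contraction function once...
    expr = ctg.array_contract_expression(
        inputs=[("a", "b"), ("b", "c")],
        output=("a", "c"),
        size_dict={"a": 2, "b": 3, "c": 4},
        optimize="greedy",
    )

    # ...then call it on any arrays with matching shapes
    x, y = np.random.rand(2, 3), np.random.rand(3, 4)
    z = expr(x, y)  # equivalent to x @ y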
.. py:function:: array_contract(arrays, inputs, output=None, optimize='auto', cache_expression=True, backend=None, **kwargs)

   Perform the tensor contraction specified by ``inputs``, ``output`` and
   ``size_dict``, using the strategy given by ``optimize``. By default the
   path finding and expression building is cached, so that if a matching
   contraction is performed multiple times the overhead is negated.

   :param arrays: The arrays to contract.
   :type arrays: Sequence[array_like]
   :param inputs: The input terms.
   :type inputs: Sequence[Sequence[Hashable]]
   :param output: The output term.
   :type output: Sequence[Hashable]
   :param optimize: The optimization strategy to use. This can be:

      - A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      - A ``PathOptimizer`` instance.
      - An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      - An explicit ``ContractionTree`` instance.

      If the optimizer provides sliced indices they will be used.
   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :param cache_expression: If ``True``, cache the expression used to contract the
      arrays. This negates the overhead of pathfinding and building the
      expression when a contraction is performed multiple times. Only for
      hashable ``optimize`` objects.
   :type cache_expression: bool, optional
   :param backend: If given, the explicit backend to use for the contraction; by
      default the backend is dispatched automatically.
   :type backend: str, optional
   :param kwargs: Passed to :func:`~cotengra.interface.array_contract_expression`.
   :rtype: array_like

   .. seealso:: :obj:`array_contract_expression`, :obj:`array_contract_tree`, :obj:`einsum`

.. py:function:: einsum_tree(*args, optimize='auto', canonicalize=False, sort_contraction_indices=False)

   Get the `ContractionTree` for the einsum equation ``eq`` and optimization
   strategy ``optimize``. The tree can be used to inspect and also perform
   the contraction.

   :param eq: The equation to use for contraction, for example ``'ab,bc->ac'``.
   :type eq: str
   :param shapes: The shape of each input array.
   :type shapes: Sequence[tuple[int]]
   :param optimize: The optimization strategy to use. This can be:

      - A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      - A ``PathOptimizer`` instance.
      - An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      - An explicit ``ContractionTree`` instance.

   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :param canonicalize: If ``True``, canonicalize the inputs and output so that the
      indices are relabelled ``'a', 'b', 'c', ...``, etc. in the order they
      appear.
   :type canonicalize: bool, optional
   :param sort_contraction_indices: If ``True``, call ``tree.sort_contraction_indices()``.
   :type sort_contraction_indices: bool, optional
   :rtype: ContractionTree

   .. seealso:: :obj:`einsum`, :obj:`einsum_expression`, :obj:`array_contract_tree`
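A sketch, assuming the opt_einsum-style calling convention of the equation
followed by the input shapes:

.. code-block:: python

    import cotengra as ctg

    tree = ctg.einsum_tree(
        "ab,bc,cd->ad",
        (2, 3), (3, 4), (4, 5),
        optimize="greedy",
    )

    tree.contraction_cost()  # inspect before performing the contraction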
.. py:function:: einsum_expression(*args, optimize='auto', constants=None, cache=True, **kwargs)

   Get a callable 'expression' that will contract tensors with shapes
   ``shapes`` according to equation ``eq``. The ``optimize`` kwarg can be a
   path, optimizer or also a contraction tree. In the latter case sliced
   indices, for example, will be used if present. The same is true if
   ``optimize`` is an optimizer that can directly produce
   ``ContractionTree`` instances (i.e. has a ``.search()`` method).

   :param eq: The equation to use for contraction, for example ``'ab,bc->ac'``.
      The output will be automatically computed if not supplied, but
      Ellipses (`'...'`) are not supported.
   :type eq: str
   :param shapes: The shapes of the tensors to contract, or the constant tensor
      itself if marked as constant in ``constants``.
   :type shapes: Sequence[tuple[int]]
   :param optimize: The optimization strategy to use. This can be:

      - A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      - A ``PathOptimizer`` instance.
      - An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      - An explicit ``ContractionTree`` instance.

      If the optimizer provides sliced indices they will be used.
   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :param constants: The indices of tensors to treat as constant, the final
      expression will take the remaining non-constant tensors as inputs.
      Note this is a different format to the ``constants`` kwarg of
      :func:`array_contract_expression` since the actual constant arrays are
      inserted into ``shapes``.
   :type constants: Sequence of int, optional
   :param implementation: What library to use to actually perform the contractions.
      Options are:

      - None: let cotengra choose.
      - "autoray": dispatch with autoray, using the ``tensordot`` and
        ``einsum`` implementation of the backend.
      - "cotengra": use the ``tensordot`` and ``einsum`` implementation of
        cotengra, which is based on batch matrix multiplication. This is
        faster for some backends like numpy, and also enables libraries
        which don't yet provide ``tensordot`` and ``einsum`` to be used.
      - "cuquantum": use the cuquantum library to perform the whole
        contraction (not just individual contractions).
      - tuple[callable, callable]: manually supply the ``tensordot`` and
        ``einsum`` implementations to use.

   :type implementation: str or tuple[callable, callable], optional
   :param autojit: If ``True``, use :func:`autoray.autojit` to compile the
      contraction function.
   :type autojit: bool, optional
   :param via: If given, the first function will be applied to the input arrays
      and the second to the output array. For example, moving the tensors
      from CPU to GPU and back.
   :type via: tuple[callable, callable], optional
   :param sort_contraction_indices: If ``True``, call
      ``tree.sort_contraction_indices()`` before constructing the
      contraction function.
   :type sort_contraction_indices: bool, optional
   :param cache: If ``True``, cache the contraction expression. This negates the
      overhead of pathfinding and building the expression when a contraction
      is performed multiple times. Only for hashable ``optimize`` objects.
   :type cache: bool, optional
   :returns: **expr** -- A callable, signature ``expr(*arrays)`` that will
      contract ``arrays`` with shapes matching ``shapes``.
   :rtype: callable

   .. seealso:: :obj:`einsum`, :obj:`einsum_tree`, :obj:`array_contract_expression`

.. py:function:: einsum(*args, optimize='auto', cache_expression=True, backend=None, **kwargs)

   Perform an einsum contraction using `cotengra`, with the strategy given
   by ``optimize``. By default the path finding and expression building is
   cached, so that if a matching contraction is performed multiple times the
   overhead is negated.

   :param eq: The equation to use for contraction, for example ``'ab,bc->ac'``.
   :type eq: str
   :param arrays: The arrays to contract.
   :type arrays: Sequence[array_like]
   :param optimize: The optimization strategy to use. This can be:

      - A string preset, e.g. ``'auto'``, ``'greedy'``, ``'optimal'``.
      - A ``PathOptimizer`` instance.
      - An explicit path, e.g. ``[(0, 1), (2, 3), ...]``.
      - An explicit ``ContractionTree`` instance.

      If the optimizer provides sliced indices they will be used.
   :type optimize: str, path_like, PathOptimizer, or ContractionTree
   :param cache_expression: If ``True``, cache the expression used to contract the
      arrays. This negates the overhead of pathfinding and building the
      expression when a contraction is performed multiple times. Only for
      hashable ``optimize`` objects.
   :type cache_expression: bool, optional
   :param backend: If given, the explicit backend to use for the contraction; by
      default the backend is dispatched automatically.
   :type backend: str, optional
   :param kwargs: Passed to :func:`~cotengra.interface.array_contract_expression`.
   :rtype: array_like

   .. seealso:: :obj:`einsum_expression`, :obj:`einsum_tree`, :obj:`array_contract`
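Finally, a minimal sketch of the drop-in ``einsum`` call:

.. code-block:: python

    import numpy as np
    import cotengra as ctg

    x = np.random.rand(2, 3)
    y = np.random.rand(3, 4)

    # path finding and expression building are cached by default
    z = ctg.einsum("ab,bc->ac", x, y, optimize="greedy")
    assert np.allclose(z, x @ y)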