cotengra.reusable

Classes

ReusableOptimizer

Mixin class for optimizers that can be reused, caching the paths and other relevant information for reconstructing the full tree.

Functions

sortedtuple(x)

make_hashable(x)

Make x hashable by recursively turning lists into tuples and dicts into sorted tuples of key-value pairs.

hash_contraction_a(inputs, output, size_dict)

hash_contraction_b(inputs, output, size_dict)

hash_contraction(inputs, output, size_dict[, method])

Compute a hash for a particular contraction geometry.

Module Contents

cotengra.reusable.sortedtuple(x)[source]
cotengra.reusable.make_hashable(x)[source]

Make x hashable by recursively turning lists into tuples and dicts into sorted tuples of key-value pairs.
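
For instance, this allows an otherwise unhashable nested specification to serve as a cache key. A minimal illustration (the exact form of the returned object is an implementation detail):

    from cotengra.reusable import make_hashable

    x = {"b": [1, 2, {"c": 3}], "a": (4, 5)}
    key = make_hashable(x)
    hash(key)  # succeeds, so `key` can index a dict-based cache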

cotengra.reusable.hash_contraction_a(inputs, output, size_dict)[source]
cotengra.reusable.hash_contraction_b(inputs, output, size_dict)[source]
cotengra.reusable.hash_contraction(inputs, output, size_dict, method='a')[source]

Compute a hash for a particular contraction geometry.
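
An illustrative call with an opt_einsum style specification (sequences of index labels plus a mapping of labels to sizes); as noted above, the default method 'a' is faster but sensitive to how the indices happen to be labelled:

    from cotengra.reusable import hash_contraction

    inputs = [("a", "b"), ("b", "c"), ("c", "d")]
    output = ("a", "d")
    size_dict = {"a": 2, "b": 3, "c": 3, "d": 2}

    h = hash_contraction(inputs, output, size_dict, method="a")
    # the same geometry always hashes to the same value, which
    # ReusableOptimizer uses as its cache key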

class cotengra.reusable.ReusableOptimizer(*, directory=None, overwrite=False, hash_method='a', cache_only=False, directory_split='auto', **opt_kwargs)[source]

Bases: cotengra.oe.PathOptimizer

Mixin class for optimizers that can be reused, caching the paths and other relevant information for reconstructing the full tree.

The following methods should be implemented in the subclass (see the sketch below):

_get_path_relevant_opts(self)
_get_suboptimizer(self)
_deconstruct_tree(self, opt, tree)
_reconstruct_tree(self, inputs, output, size_dict, con)
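
A hypothetical sketch of such a subclass wrapping some existing optimizer (everything beyond the four method names above, including the return formats shown, is an assumption rather than part of the API):

    from cotengra import ContractionTree
    from cotengra.reusable import ReusableOptimizer

    class ReusableMyOptimizer(ReusableOptimizer):
        """Hypothetical reusable wrapper around some ``MyOptimizer``."""

        def _get_path_relevant_opts(self):
            # only the options that can change which path is found,
            # in a hashable form (used to key the persistent cache)
            return [("optlib", "my-optimizer")]

        def _get_suboptimizer(self):
            # the class (or factory) instantiated with **opt_kwargs
            return MyOptimizer  # assumed to exist elsewhere

        def _deconstruct_tree(self, opt, tree):
            # the minimal serializable information needed to rebuild the tree
            return {"path": tree.get_path()}

        def _reconstruct_tree(self, inputs, output, size_dict, con):
            # rebuild the full ContractionTree from the cached information
            return ContractionTree.from_path(
                inputs, output, size_dict, path=con["path"]
            )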

Parameters:
  • directory (None, True, or str, optional) – If specified, use this directory as a persistent cache. If True, auto-generate a directory in the current working directory based on the options most likely to affect the path (see ReusableHyperOptimizer._get_path_relevant_opts).

  • overwrite (bool or 'improved', optional) – If True, the optimizer will always run, overwriting old results in the cache. This can be used to update paths without deleting the whole cache. If 'improved' then only overwrite if the new path is better.

  • hash_method ({'a', 'b', ...}, optional) – The method used to hash the contraction tree. The default, 'a', is faster hashwise but doesn’t recognize when indices are permuted.

  • cache_only (bool, optional) – If True, the optimizer will only use the cache, and will raise KeyError if a contraction is not found.

  • directory_split ("auto" or bool, optional) – If specified, the hash will be split into two parts: the first part is used as a subdirectory and the second as the filename. This is useful for avoiding a very large flat directory. If "auto", it will check the current cache, if any, and guess from that.

  • opt_kwargs – Supplied to self._get_suboptimizer(self).
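
A hedged usage sketch: the parameters and call signatures follow those documented here, while the cache directory name and the forwarded max_repeats value are arbitrary examples:

    import cotengra as ctg

    opt = ctg.ReusableHyperOptimizer(
        directory="ctg_cache",   # persistent on-disk cache (name is arbitrary)
        overwrite="improved",    # only replace a cached path if the new one is better
        max_repeats=32,          # example kwarg forwarded to the suboptimizer
    )

    inputs = [("a", "b"), ("b", "c"), ("c", "d")]
    output = ("a", "d")
    size_dict = {"a": 2, "b": 3, "c": 3, "d": 2}

    path = opt(inputs, output, size_dict)         # opt_einsum style path
    tree = opt.search(inputs, output, size_dict)  # or the full ContractionTree
    # a second call with the same geometry is served from the cache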

_suboptimizers
_suboptimizer_kwargs
_cache
overwrite = False
_hash_method = 'a'
cache_only = False
directory_split = 'auto'
property last_opt
abstractmethod _get_path_relevant_opts()[source]

We only want to hash on options that affect the contraction, not things like progbar.

auto_hash_path_relevant_opts()[source]

Automatically hash the path relevant options used to create the optimizer.

hash_query(inputs, output, size_dict)[source]

Hash the contraction specification, returning a tuple of the hash and whether the contraction is already present in the cache.

property minimize
update_from_tree(tree, overwrite='improved')[source]

Explicitly add the contraction that tree represents into the cache. For example, if you have manually improved it via reconfiguring. If overwrite=False and the contraction is already present then do nothing. If overwrite='improved' then only overwrite if the new path is better. If overwrite=True then always overwrite. (See the workflow sketch after the parameters below.)

Parameters:
  • tree (ContractionTree) – The tree to add to the cache.

  • overwrite (bool or "improved", optional) – If True, always overwrite; if False, only overwrite if the contraction is missing; if 'improved', only overwrite if the new path is better (the default). Note that the comparison of scores is based on the default objective of the tree.
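
A brief sketch of the intended workflow, assuming opt and the contraction specification from the usage example above (subtree_reconfigure_ is just one way to manually improve a tree):

    tree = opt.search(inputs, output, size_dict)  # cached or newly found tree
    tree.subtree_reconfigure_()                   # manual improvement, e.g. reconfiguring
    opt.update_from_tree(tree)                    # stored only if better ('improved' default)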

_run_optimizer(inputs, output, size_dict)[source]
_maybe_run_optimizer(inputs, output, size_dict)[source]
__call__(inputs, output, size_dict, memory_limit=None)[source]
abstractmethod _get_suboptimizer()[source]
abstractmethod _deconstruct_tree(opt, tree)[source]
abstractmethod _reconstruct_tree(inputs, output, size_dict, con)[source]
search(inputs, output, size_dict)[source]
cleanup()[source]