Improving Tensor Contraction Paths through a Refined Standard Greedy Algorithm with Enhanced Cost Function

A team of researchers has developed a new method for improving tensor contraction paths (CPs), which are used to solve problems across numerous areas of research, including machine learning, graph problems, quantum circuits, and model counting. Their technique improves upon the standard greedy algorithm (SGA), incorporating an enhanced cost function that covers a larger range of problems.

Calculating CPs helps minimize computational cost, which directly affects how long a computation takes to complete. Previous methods for finding efficient CPs have used simulated annealing and genetic algorithms, graph decomposition via Line-Graph (LG) and Factor-Tree (FT) methods, and a combination of reinforcement learning and graph neural networks. However, these approaches often struggled with larger networks or higher-rank tensors.
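To see why the contraction path matters, consider a small chain of three matrices. The shapes below are illustrative choices (not from the paper), picked so that the two possible pairwise orders differ sharply in cost even though they produce the same result:

```python
import numpy as np

# Shapes chosen so the two contraction orders differ sharply in cost.
A = np.random.rand(10, 1000)   # indices i x j
B = np.random.rand(1000, 10)   # indices j x k
C = np.random.rand(10, 1000)   # indices k x l

# Multiply-add counts for each pairwise order of A @ B @ C:
flops_AB_first = 10 * 1000 * 10 + 10 * 10 * 1000      # (A @ B) @ C
flops_BC_first = 1000 * 10 * 1000 + 10 * 1000 * 1000  # A @ (B @ C)

print(flops_AB_first)  # 200_000
print(flops_BC_first)  # 20_000_000 -- 100x more expensive
```

Both orders yield the same final tensor, but contracting A with B first avoids ever materializing a large intermediate. Finding a good order for hundreds of tensors is what a contraction path optimizer automates.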

The new method proposed by the researchers uses a modified SGA paired with an enhanced cost function. The typical SGA estimates the cost of each pairwise contraction based only on the sizes of the input and output tensors. The new method instead uses additional information to determine these costs, providing different cost functions to accommodate a wider range of problems. As a result, the method outperforms other greedy implementations, and in some instances even hypergraph partitioning combined with a greedy approach.

To efficiently find CPs for a large number of tensors, the researchers built on the SGA in opt_einsum. This process occurs in three phases: the computation of Hadamard products, which involves the elementwise multiplication of tensors with the same set of indices; contraction of the remaining tensors by selecting the lowest-cost pair at each step; and computation of outer products by choosing the pair with the smallest combined input sizes.
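The greedy path finding described above can be tried directly from numpy, whose `einsum_path` optimizer is derived from opt_einsum. The shapes here are illustrative; the returned path lists the pairwise contractions chosen at each greedy step:

```python
import numpy as np

# A small tensor network: three tensors sharing indices j and k.
A = np.random.rand(8, 32)
B = np.random.rand(32, 64)
C = np.random.rand(64, 8)

# einsum_path returns the contraction order (a list of pairwise steps)
# plus a human-readable report of the estimated FLOP savings.
path, info = np.einsum_path('ij,jk,kl->il', A, B, C, optimize='greedy')
print(path)   # e.g. ['einsum_path', (1, 2), (0, 1)]
print(info)

# The path can then be reused to execute the contraction in that order.
result = np.einsum('ij,jk,kl->il', A, B, C, optimize=path)
```

Precomputing a path once and reusing it is the standard workflow when the same contraction is evaluated many times, since path finding itself has a cost.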

The modified algorithm takes cost functions as parameters. This differs from the SGA, which uses only one fixed cost function. In the new method, multiple cost functions are evaluated, and the best-performing function is then selected for computing future CPs.

A series of tests was performed with the new method and its multiple-cost-function approach. Results favored the new algorithm, which computed efficient CPs for a wide range of problems and outperformed both the standard and random greedy algorithms from opt_einsum as well as the hypergraph partitioning method.

The proposed method marks a promising step toward efficient tensor contraction computation. By modifying the standard greedy algorithm and using a multi-cost-function approach, researchers can create efficient CPs in less time and tackle more complex problems.
