Paper Title

Compiling Structured Tensor Algebra

Authors

Mahdi Ghorbani, Mathieu Huot, Shideh Hashemian, Amir Shaikhha

Abstract

Tensor algebra is essential for data-intensive workloads in various computational domains. Computational scientists face a trade-off between the specialization degree provided by dense tensor algebra and the algorithmic efficiency that leverages the structure provided by sparse tensors. This paper presents StructTensor, a framework that symbolically computes structure at compilation time. This is enabled by Structured Tensor Unified Representation (STUR), an intermediate language that can capture tensor computations as well as their sparsity and redundancy structures. Through a mathematical view of lossless tensor computations, we show that our symbolic structure computation and the related optimizations are sound. Finally, for different tensor computation workloads and structures, we experimentally show how capturing the symbolic structure can result in outperforming state-of-the-art frameworks for both dense and sparse tensor algebra.
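To illustrate the core idea in miniature (this is a hedged sketch, not StructTensor's actual STUR representation or compilation pipeline): if a tensor's structure is known symbolically before execution, the computation can be specialized to skip work that a dense kernel would perform. The example below contrasts a dense O(n²) matrix-vector product with a version that exploits the compile-time knowledge that the matrix is diagonal, reducing the work to O(n).

```python
def dense_matvec(A, x):
    # Baseline: O(n^2) product that ignores any structure in A.
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

def diagonal_matvec(diag, x):
    # Structure-aware version: when A is known symbolically to be
    # diagonal, only the n diagonal entries contribute -- O(n) work.
    return [d * v for d, v in zip(diag, x)]

# A 4x4 diagonal matrix, materialized densely for the baseline.
diag = [1.0, 2.0, 3.0, 4.0]
A = [[diag[i] if i == j else 0.0 for j in range(4)] for i in range(4)]
x = [1.0, 1.0, 1.0, 1.0]

print(dense_matvec(A, x) == diagonal_matvec(diag, x))  # True
```

The paper's contribution is to derive and exploit such structural knowledge (sparsity and redundancy, e.g. diagonal, symmetric, or repeated blocks) automatically and symbolically at compile time, rather than hand-writing a specialized kernel as done here.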
