Paper Title


From graphs to DAGs: a low-complexity model and a scalable algorithm

Authors

Shuyu Dong, Michèle Sebag

Abstract


Learning directed acyclic graphs (DAGs) has long been a critical challenge at the core of probabilistic and causal modeling. The NoTears approach of (Zheng et al., 2018), through a differentiable function involving the matrix exponential trace $\mathrm{tr}(\exp(\cdot))$, opened a way to learn DAGs via continuous optimization, though with an $O(d^3)$ complexity in the number $d$ of nodes. This paper presents a low-complexity model, called LoRAM for Low-Rank Additive Model, which combines low-rank matrix factorization with a sparsification mechanism for the continuous optimization of DAGs. The main contribution of the approach lies in an efficient gradient approximation method that leverages the low-rank property of the model, and in its straightforward application to the computation of projections from graph matrices onto the DAG matrix space. The proposed method achieves a reduction from cubic to quadratic complexity while handling the same DAG characteristic function as NoTears, and scales easily up to thousands of nodes for the projection problem. The experiments show that LoRAM achieves efficiency gains of orders of magnitude over the state of the art, at the expense of a very moderate accuracy loss in the considered range of sparse matrices, and with low sensitivity to the rank chosen for the model's low-rank component.
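The acyclicity function discussed in the abstract can be made concrete with a short sketch. The snippet below is an illustrative implementation (not the authors' code): it evaluates the NoTears characteristic function $h(W) = \mathrm{tr}(\exp(W \circ W)) - d$, whose zeros are exactly the adjacency matrices of DAGs, and builds a LoRAM-style matrix as a rank-$r$ product $XY^\top$ restricted to a sparse support. The function and variable names (`notears_h`, `loram_matrix`, `mask`) are assumptions for illustration.

```python
import numpy as np
from scipy.linalg import expm

def notears_h(W):
    """NoTears acyclicity measure: h(W) = tr(exp(W ∘ W)) - d.
    h(W) = 0 iff W is the adjacency matrix of a DAG; computing the
    matrix exponential densely costs O(d^3)."""
    d = W.shape[0]
    return np.trace(expm(W * W)) - d

def loram_matrix(X, Y, mask):
    """LoRAM-style candidate: a low-rank product X Y^T (rank r << d)
    composed with a sparsification mask restricting the support."""
    return mask * (X @ Y.T)

rng = np.random.default_rng(0)
d, r = 50, 5
X = rng.normal(scale=0.1, size=(d, r))
Y = rng.normal(scale=0.1, size=(d, r))
# A strictly lower-triangular support encodes a topological order,
# so the resulting matrix is nilpotent and h vanishes.
mask = np.tril(np.ones((d, d)), k=-1)
W = loram_matrix(X, Y, mask)
```

For this strictly triangular `W`, all powers of $W \circ W$ have zero trace, so `notears_h(W)` is (numerically) zero, illustrating why $h$ can serve as a smooth DAG constraint in continuous optimization.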
