Paper Title

EIGNN: Efficient Infinite-Depth Graph Neural Networks

Paper Authors

Juncheng Liu, Kenji Kawaguchi, Bryan Hooi, Yiwei Wang, Xiaokui Xiao

Paper Abstract

Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications. However, with their inherently finite aggregation layers, existing GNN models may not be able to effectively capture long-range dependencies in the underlying graphs. Motivated by this limitation, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN), to efficiently capture very long-range dependencies. We theoretically derive a closed-form solution of EIGNN which makes training an infinite-depth GNN model tractable. We then further show that we can achieve more efficient computation for training EIGNN by using eigendecomposition. The empirical results of comprehensive experiments on synthetic and real-world datasets show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance. Furthermore, we show that our model is also more robust against both noise and adversarial perturbations on node features.
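To make the abstract's claims concrete, below is a minimal NumPy sketch (an illustrative reconstruction under assumed shapes and names, not the paper's code) of how a closed-form solution makes infinite-depth propagation tractable. It solves the fixed point Z = gamma * G @ Z @ S + X two ways: by naive unrolling, and in closed form via eigendecomposition, which decouples the equation into entrywise divisions.

```python
import numpy as np

# A minimal sketch of the infinite-depth fixed point behind EIGNN-style models,
# NOT the authors' implementation. G, S, X, gamma and the normalizations below
# are illustrative assumptions: G is a symmetric contraction on the feature
# dimension, S a symmetric normalized adjacency, and gamma lies in (0, 1).

def solve_closed_form(G, S, X, gamma):
    """Solve Z = gamma * G @ Z @ S + X exactly via eigendecomposition.

    In the eigenbases G = Q diag(lam) Q^T and S = U diag(theta) U^T, the
    equation decouples entrywise:
        Z_t[i, j] = X_t[i, j] / (1 - gamma * lam[i] * theta[j]).
    """
    lam, Q = np.linalg.eigh(G)
    theta, U = np.linalg.eigh(S)
    X_t = Q.T @ X @ U
    Z_t = X_t / (1.0 - gamma * np.outer(lam, theta))
    return Q @ Z_t @ U.T

def solve_by_unrolling(G, S, X, gamma, iters=200):
    """Reference: iterate Z <- gamma * G @ Z @ S + X. The limit of this
    'infinitely deep' propagation is the same fixed point."""
    Z = np.zeros_like(X)
    for _ in range(iters):
        Z = gamma * (G @ Z @ S) + X
    return Z

# Tiny check: both routes agree on random symmetric inputs scaled so the
# iteration is a contraction (spectral norms <= 1, gamma = 0.8 < 1).
rng = np.random.default_rng(0)
m, n = 4, 6
A = rng.standard_normal((m, m))
G = A + A.T
G /= np.linalg.norm(G, 2) + 1e-6
B = rng.standard_normal((n, n))
S = B + B.T
S /= np.linalg.norm(S, 2) + 1e-6
X = rng.standard_normal((m, n))
assert np.allclose(solve_closed_form(G, S, X, 0.8),
                   solve_by_unrolling(G, S, X, 0.8), atol=1e-6)
```

In a setup like this sketch, the efficiency gain the abstract refers to comes from exactly this decoupling: the eigendecompositions can be computed once and reused across training iterations, instead of unrolling (or differentiating through) an arbitrarily deep propagation at every step.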
