Paper Title


A Practical, Progressively-Expressive GNN

Authors

Lingxiao Zhao, Louis Härtel, Neil Shah, Leman Akoglu

Abstract

Message passing neural networks (MPNNs) have become a dominant flavor of graph neural networks (GNNs) in recent years. Yet, MPNNs come with notable limitations; namely, they are at most as powerful as the 1-dimensional Weisfeiler-Leman (1-WL) test in distinguishing graphs in a graph isomorphism testing framework. To this end, researchers have drawn inspiration from the k-WL hierarchy to develop more expressive GNNs. However, current k-WL-equivalent GNNs are not practical for even small values of k, as k-WL becomes combinatorially more complex as k grows. At the same time, several works have found great empirical success in graph learning tasks without highly expressive models, implying that chasing expressiveness with a coarse-grained ruler of expressivity like k-WL is often unneeded in practical tasks. To truly understand the expressiveness-complexity tradeoff, one desires a more fine-grained ruler, which can more gradually increase expressiveness. Our work puts forth such a proposal: Namely, we first propose the (k, c)(<=)-SETWL hierarchy with greatly reduced complexity from k-WL, achieved by moving from k-tuples of nodes to sets with <=k nodes defined over <=c connected components in the induced original graph. We show favorable theoretical results for this model in relation to k-WL, and concretize it via (k, c)(<=)-SETGNN, which is as expressive as (k, c)(<=)-SETWL. Our model is practical and progressively-expressive, increasing in power with k and c. We demonstrate effectiveness on several benchmark datasets, achieving several state-of-the-art results with runtime and memory usage applicable to practical graphs. We open source our implementation at https://github.com/LingxiaoShawn/KCSetGNN.
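The abstract's key complexity-reduction step is to replace the k-tuples of k-WL with node sets of size at most k whose induced subgraph has at most c connected components. The following is a minimal, hypothetical sketch (not the authors' implementation, which lives in the linked repository) of enumerating such (k, c) sets on a toy adjacency-list graph:

```python
from itertools import combinations

def count_components(nodes, adj):
    # Count connected components of the subgraph induced by `nodes`
    # via depth-first search restricted to that node set.
    nodes = set(nodes)
    seen, comps = set(), 0
    for start in nodes:
        if start in seen:
            continue
        comps += 1
        stack = [start]
        while stack:
            u = stack.pop()
            if u in seen:
                continue
            seen.add(u)
            stack.extend(v for v in adj[u] if v in nodes)
    return comps

def kc_sets(adj, k, c):
    # Enumerate all node sets with <=k nodes whose induced subgraph
    # has <=c connected components (the objects (k,c)(<=)-SETWL colors).
    nodes = sorted(adj)
    for size in range(1, k + 1):
        for s in combinations(nodes, size):
            if count_components(s, adj) <= c:
                yield frozenset(s)

# Toy graph: path 0-1-2 plus isolated node 3.
adj = {0: [1], 1: [0, 2], 2: [1], 3: []}
connected_pairs = list(kc_sets(adj, k=2, c=1))  # singletons + edge-connected pairs
```

With c = 1 only connected induced subgraphs survive, so the set count grows far more slowly than the n^k tuples of k-WL; raising c (up to k) gradually admits more disconnected sets and hence more expressive power, matching the "progressively-expressive" knob described above.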
