Paper Title

Exponentially Improving the Complexity of Simulating the Weisfeiler-Lehman Test with Graph Neural Networks

Paper Authors

Anders Aamand, Justin Y. Chen, Piotr Indyk, Shyam Narayanan, Ronitt Rubinfeld, Nicholas Schiefer, Sandeep Silwal, Tal Wagner

Paper Abstract

Recent work shows that the expressive power of Graph Neural Networks (GNNs) in distinguishing non-isomorphic graphs is exactly the same as that of the Weisfeiler-Lehman (WL) graph test. In particular, they show that the WL test can be simulated by GNNs. However, those simulations involve neural networks for the 'combine' function of size polynomial or even exponential in the number of graph nodes $n$, as well as feature vectors of length linear in $n$. We present an improved simulation of the WL test on GNNs with \emph{exponentially} lower complexity. In particular, the neural network implementing the combine function in each node has only a polylogarithmic number of parameters in $n$, and the feature vectors exchanged by the nodes of the GNN consist of only $O(\log n)$ bits. We also give logarithmic lower bounds for the feature vector length and the size of the neural networks, showing the (near-)optimality of our construction.
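For readers unfamiliar with the 1-WL (color refinement) test that the paper's GNN construction simulates, the following is a minimal sketch, not the paper's construction: each node's color is iteratively replaced by a combination of its own color and the multiset of its neighbors' colors, and two graphs are declared distinguishable if their color multisets ever differ. All function names and graph encodings here are illustrative assumptions.

```python
def wl_colors(adj, rounds):
    """Run `rounds` iterations of 1-WL color refinement on a graph given as an
    adjacency list; returns the final node colors as nested tuples, which are
    comparable across graphs refined for the same number of rounds."""
    n = len(adj)
    colors = [0] * n  # all nodes start with the same color
    for _ in range(rounds):
        # New color = (own color, sorted multiset of neighbor colors).
        colors = [
            (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in range(n)
        ]
    return colors

def wl_distinguishes(adj1, adj2, rounds=None):
    """True if 1-WL certifies the two graphs as non-isomorphic."""
    if rounds is None:
        # n rounds of refinement always suffice for 1-WL to stabilize.
        rounds = max(len(adj1), len(adj2))
    return sorted(wl_colors(adj1, rounds)) != sorted(wl_colors(adj2, rounds))
```

Note that 1-WL is a one-sided test: it distinguishes a 4-node path from a 4-node star (their degree multisets already differ), but it cannot distinguish a 6-cycle from two disjoint triangles, since both are 2-regular and every node keeps the same color in every round.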
