Paper Title

Walk Message Passing Neural Networks and Second-Order Graph Neural Networks

Paper Authors

Geerts, Floris

Paper Abstract

The expressive power of message passing neural networks (MPNNs) is known to match the expressive power of the 1-dimensional Weisfeiler-Leman graph (1-WL) isomorphism test. To boost the expressive power of MPNNs, a number of graph neural network architectures have recently been proposed based on higher-dimensional Weisfeiler-Leman tests. In this paper we consider the two-dimensional (2-WL) test and introduce a new type of MPNNs, referred to as $\ell$-walk MPNNs, which aggregate features along walks of length $\ell$ between vertices. We show that $2$-walk MPNNs match 2-WL in expressive power. More generally, $\ell$-walk MPNNs, for any $\ell\geq 2$, are shown to match the expressive power of the recently introduced $\ell$-walk refinement procedure (W[$\ell$]). Based on a correspondence between 2-WL and W[$\ell$], we observe that $\ell$-walk MPNNs and $2$-walk MPNNs have the same expressive power, i.e., they can distinguish the same pairs of graphs, but $\ell$-walk MPNNs can possibly distinguish pairs of graphs faster than $2$-walk MPNNs. When it comes to concrete learnable graph neural network (GNN) formalisms that match 2-WL or W[$\ell$] in expressive power, we consider second-order graph neural networks that allow for non-linear layers. In particular, to match W[$\ell$] in expressive power, we allow $\ell-1$ matrix multiplications in each layer. We propose different versions of second-order GNNs depending on the type of features (i.e., coming from a countable set, or coming from an uncountable set) as this affects the number of dimensions needed to represent the features. Our results indicate that increasing non-linearity in layers by means of allowing multiple matrix multiplications does not increase expressive power. At the very best, it results in a faster distinction of input graphs.
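The abstract notes that matching W[$\ell$] in expressive power requires $\ell-1$ matrix multiplications per layer. The arithmetic behind this is the standard fact that the $(u,v)$ entry of the $\ell$-th power of a graph's adjacency matrix counts the walks of length $\ell$ from $u$ to $v$. The sketch below is a minimal illustration of that fact, not the paper's actual $\ell$-walk MPNN architecture; the function name `walk_counts` and the toy 4-cycle graph are illustrative choices, and numpy is assumed.

```python
import numpy as np

def walk_counts(adj: np.ndarray, ell: int) -> np.ndarray:
    """Count walks of length `ell` between every pair of vertices.

    Computes adj ** ell using ell - 1 matrix multiplications -- the
    same per-layer cost the abstract attributes to matching W[ell].
    """
    result = adj.copy()
    for _ in range(ell - 1):  # ell - 1 multiplications in total
        result = result @ adj
    return result

# Toy graph: the 4-cycle 0-1-2-3-0.
cycle4 = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
])

# Entry (0, 2) of A^2 is 2: the two length-2 walks 0-1-2 and 0-3-2.
print(walk_counts(cycle4, 2))
```

Aggregating features along these walk counts (rather than only along edges, as a standard MPNN does) is what lets $\ell$-walk MPNNs reach 2-WL/W[$\ell$] expressive power; the abstract's point is that taking $\ell > 2$ adds speed of distinction, not expressive power.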
