Title
Training Graph Neural Networks on Growing Stochastic Graphs
Authors
Abstract
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data. Because they are based on matrix multiplications, convolutions incur high computational costs, leading to scalability limitations in practice. To overcome these limitations, proposed methods rely on training GNNs on graphs with a smaller number of nodes and then transferring the GNN to larger graphs. Even though these methods can bound the difference between the outputs of the GNN on graphs with different numbers of nodes, they do not provide guarantees relative to the optimal GNN on the very large graph. In this paper, we propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs: the graphon. We propose to grow the size of the graph as we train, and we show that our proposed methodology -- learning by transference -- converges to a neighborhood of a first-order stationary point on the graphon data. A numerical experiment validates our proposed approach.
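The abstract's core idea -- sample graphs of growing size from a graphon and fit the same model at each size -- can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the graphon W(u, v) = exp(-|u - v|), the filter order, and the coefficients are assumptions chosen for the example, and a single polynomial graph filter (a convex surrogate) stands in for a full GNN.

```python
import numpy as np

def sample_graph(n, rng):
    """Sample an n-node graph from an assumed graphon W(u, v) = exp(-|u - v|):
    latent points u_i ~ U[0, 1], edges ~ Bernoulli(W(u_i, u_j))."""
    u = np.sort(rng.uniform(0.0, 1.0, n))
    w = np.exp(-np.abs(u[:, None] - u[None, :]))
    upper = rng.uniform(size=(n, n)) < w
    a = np.triu(upper, 1).astype(float)
    a = a + a.T                      # symmetric adjacency, no self-loops
    return a / n                     # normalize so the shift operator has a graphon limit

def filter_features(a, x, order):
    """Design matrix with columns [x, A x, A^2 x, ...]: the terms of a
    polynomial graph filter y = sum_k h_k A^k x."""
    cols = [x]
    for _ in range(order - 1):
        cols.append(a @ cols[-1])
    return np.stack([c.ravel() for c in cols], axis=1)

rng = np.random.default_rng(0)
h_true = np.array([1.0, -0.5, 0.25])   # generating filter coefficients (illustrative)

for n in [64, 128, 256, 512]:          # grow the sampled graph as training proceeds
    a = sample_graph(n, rng)
    x = rng.normal(size=(n, 4))        # a few random input signals per graph
    z = filter_features(a, x, len(h_true))
    y = z @ h_true                     # noise-free targets from the generating filter
    h, *_ = np.linalg.lstsq(z, y, rcond=None)  # fit the filter at this size
    print(n, np.round(h, 6))
```

In this noise-free convex setting, the coefficients fitted on the 64-node graph already coincide with those fitted on the 512-node graph, which is the kind of transferability across graph sizes that the proposed learning-by-transference scheme exploits; the paper's setting replaces the closed-form fit with gradient steps on a nonconvex GNN loss.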