Paper Title

DeeperGCN: All You Need to Train Deeper GCNs

Paper Authors

Guohao Li, Chenxin Xiong, Ali Thabet, Bernard Ghanem

Paper Abstract

Graph Convolutional Networks (GCNs) have been drawing significant attention for their power of representation learning on graphs. Unlike Convolutional Neural Networks (CNNs), which can take advantage of stacking very deep layers, GCNs suffer from vanishing gradients, over-smoothing, and over-fitting when going deeper. These challenges limit the representation power of GCNs on large-scale graphs. This paper proposes DeeperGCN, which is capable of successfully and reliably training very deep GCNs. We define differentiable generalized aggregation functions to unify different message aggregation operations (e.g., mean, max). We also propose a novel normalization layer, MsgNorm, and a pre-activation version of residual connections for GCNs. Extensive experiments on the Open Graph Benchmark (OGB) show that DeeperGCN significantly boosts performance over the state of the art on the large-scale graph learning tasks of node property prediction and graph property prediction. Please visit https://www.deepgcns.org for more information.
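The two components named in the abstract are concrete enough to sketch. Below is a minimal PyTorch sketch (not the authors' released code; the names `softmax_aggregate` and the simplified `MsgNorm` class are ours) of (1) a SoftMax-style generalized aggregation, whose temperature beta interpolates between mean aggregation (beta near 0) and max aggregation (beta large), and (2) the MsgNorm idea, which rescales the aggregated message to the L2 norm of the node feature before combining.

```python
import torch
import torch.nn.functional as F

def softmax_aggregate(messages: torch.Tensor, beta: float = 1.0) -> torch.Tensor:
    """Generalized SoftMax aggregation over a node's neighbor messages.

    messages: (num_neighbors, feature_dim). Softmax weights are computed
    per feature dimension; beta -> 0 recovers mean aggregation (uniform
    weights), beta -> +inf recovers max aggregation (all weight on the
    largest value in each dimension).
    """
    weights = F.softmax(beta * messages, dim=0)   # (num_neighbors, feature_dim)
    return (weights * messages).sum(dim=0)        # (feature_dim,)

class MsgNorm(torch.nn.Module):
    """Sketch of message normalization: scale the unit-normalized aggregated
    message by the node feature's L2 norm and a learnable scalar, then add
    it to the node feature. The paper applies an MLP after this sum."""

    def __init__(self, learn_scale: bool = True):
        super().__init__()
        self.scale = torch.nn.Parameter(torch.ones(1), requires_grad=learn_scale)

    def forward(self, x: torch.Tensor, msg: torch.Tensor) -> torch.Tensor:
        msg = F.normalize(msg, p=2, dim=-1)              # unit-norm aggregated message
        node_norm = x.norm(p=2, dim=-1, keepdim=True)    # per-node feature norm
        return x + self.scale * node_norm * msg
```

As a quick sanity check of the interpolation property, `softmax_aggregate(msgs, beta=1e4)` closely matches `msgs.max(dim=0).values`, while `softmax_aggregate(msgs, beta=1e-4)` closely matches `msgs.mean(dim=0)`.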
