Paper Title


Revisiting Graph Convolutional Network on Semi-Supervised Node Classification from an Optimization Perspective

Authors

Hongwei Zhang, Tijin Yan, Zenjun Xie, Yuanqing Xia, Yuan Zhang

Abstract


Graph convolutional networks (GCNs) have achieved promising performance on various graph-based tasks. However, they suffer from over-smoothing when stacking more layers. In this paper, we present a quantitative study on this observation and develop novel insights towards the deeper GCN. First, we interpret the current graph convolutional operations from an optimization perspective and argue that over-smoothing is mainly caused by the naive first-order approximation of the solution to the optimization problem. Subsequently, we introduce two metrics to measure the over-smoothing on node-level tasks. Specifically, we calculate the fraction of the pairwise distance between connected and disconnected nodes to the overall distance respectively. Based on our theoretical and empirical analysis, we establish a universal theoretical framework of GCN from an optimization perspective and derive a novel convolutional kernel named GCN+, which has a lower parameter amount while inherently relieving over-smoothing. Extensive experiments on real-world datasets demonstrate the superior performance of GCN+ over state-of-the-art baseline methods on node classification tasks.
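The abstract describes two over-smoothing metrics: the fraction of the total pairwise distance contributed by connected node pairs, and the fraction contributed by disconnected node pairs. The following is a minimal illustrative sketch of that idea, not the authors' implementation; the function name, the use of Euclidean distance, and the sum-based normalization are assumptions, and the exact definitions in the paper may differ (e.g., in how distances are normalized or averaged).

```python
import numpy as np

def over_smoothing_fractions(H, A):
    """Sketch of the two metrics suggested by the abstract.

    H: (n, d) array of node representations from some GCN layer.
    A: (n, n) binary adjacency matrix (self-loops ignored).
    Returns the fractions of the overall pairwise distance contributed by
    connected and by disconnected node pairs, respectively.
    """
    n = H.shape[0]
    # Pairwise Euclidean distances between all node representations.
    diff = H[:, None, :] - H[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)

    # Masks over node pairs, excluding the diagonal (i == j).
    off_diag = ~np.eye(n, dtype=bool)
    connected = (A > 0) & off_diag
    disconnected = (A == 0) & off_diag

    total = dist[off_diag].sum()
    frac_connected = dist[connected].sum() / total
    frac_disconnected = dist[disconnected].sum() / total
    return frac_connected, frac_disconnected
```

Intuitively, as over-smoothing sets in, representations of connected (and eventually all) nodes collapse together, so tracking these two fractions across layers gives a node-level view of how quickly distances between connected pairs shrink relative to disconnected pairs.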
