Paper Title

Universal Deep GNNs: Rethinking Residual Connection in GNNs from a Path Decomposition Perspective for Preventing the Over-smoothing

Authors

Jie Chen, Weiqi Liu, Zhizhong Huang, Junbin Gao, Junping Zhang, Jian Pu

Abstract

The performance of GNNs degrades as they become deeper due to over-smoothing. Among all the attempts to prevent over-smoothing, the residual connection is one of the most promising methods due to its simplicity. However, recent studies have shown that GNNs with residual connections only slightly slow down the degeneration, and the reason why residual connections fail in GNNs is still unknown. In this paper, we investigate the forward and backward behavior of GNNs with residual connections from a novel path decomposition perspective. We find that the recursive aggregation along the median-length paths of the binomial distribution of residual connection paths dominates the output representation, resulting in over-smoothing as GNNs go deeper. Entangled propagation and weight matrices cause gradient smoothing and prevent GNNs with residual connections from optimizing toward the identity mapping. Based on these findings, we present a Universal Deep GNNs (UDGNN) framework with cold-start adaptive residual connections (DRIVE) and feedforward modules. Extensive experiments demonstrate the effectiveness of our method, which achieves state-of-the-art results on non-smooth heterophily datasets by simply stacking standard GNNs.
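The path decomposition intuition above can be illustrated with a small numerical sketch (not the authors' code). Unrolling a residual GNN layer of the form H_{l+1} = H_l + f(H_l) over L layers yields 2^L paths, of which C(L, k) apply the aggregation operator exactly k times, so path counts follow a binomial distribution peaked at k = L/2. The snippet below, with an illustrative depth L = 16, shows that the probability mass concentrates on these median-length paths, which is why their recursive aggregation dominates the output as depth grows.

```python
from math import comb

def path_length_distribution(L: int) -> list[float]:
    """Fraction of residual-connection paths with exactly k aggregations,
    for k = 0..L. Each of the L layers either takes the skip branch or
    the GNN branch, giving C(L, k) paths of aggregation length k."""
    total = 2 ** L
    return [comb(L, k) / total for k in range(L + 1)]

L = 16  # illustrative depth, not a value fixed by the paper
dist = path_length_distribution(L)

# The distribution peaks at the median length k = L // 2.
peak = max(range(L + 1), key=lambda k: dist[k])
print(f"depth L={L}: most common aggregation length k={peak}")  # k = 8

# Most of the probability mass sits on mid-length paths (k in [L/4, 3L/4]).
mid_mass = sum(dist[L // 4 : 3 * L // 4 + 1])
print(f"mass on mid-length paths: {mid_mass:.3f}")  # ~0.979
```

As L increases this concentration sharpens, matching the paper's claim that median-length paths, not the short (identity-like) ones, dominate the representation in deep residual GNNs.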
