Paper Title


EEGNN: Edge Enhanced Graph Neural Network with a Bayesian Nonparametric Graph Model

Authors

Yirui Liu, Xinghao Qiao, Liying Wang, Jessica Lam

Abstract


Training deep graph neural networks (GNNs) is a challenging task, as the performance of a GNN may degrade as the number of hidden message-passing layers grows. The literature has focused on {over-smoothing} and {under-reaching} to explain the performance deterioration of deep GNNs. In this paper, we propose a new explanation for this phenomenon, {mis-simplification}: mistakenly simplifying graphs by preventing self-loops and forcing edges to be unweighted. We show that such simplification can reduce the potential of message-passing layers to capture the structural information of graphs. In view of this, we propose a new framework, the edge-enhanced graph neural network (EEGNN). EEGNN uses structural information extracted from the proposed Dirichlet mixture Poisson graph model (DMPGM), a Bayesian nonparametric model for graphs, to improve the performance of various deep message-passing GNNs. We propose a Markov chain Monte Carlo inference framework for DMPGM. Experiments on different datasets show that our method achieves a considerable performance increase over baselines.
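To make the {mis-simplification} claim concrete, the following is a minimal illustrative sketch (not the paper's implementation, and using plain NumPy rather than a GNN library): a single mean-aggregation message-passing step applied to the same small graph in two forms — once as a multigraph with a self-loop and an integer edge multiplicity, and once after the common simplification that removes self-loops and forces edges to be 0/1. The two propagation results differ, i.e. the simplified graph loses structural information that the message-passing step could otherwise use.

```python
import numpy as np

def message_pass(A, H):
    """One propagation step: row-normalised adjacency times node features."""
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    return (A / deg) @ H

# A toy multigraph on 3 nodes: node 0 carries a self-loop, and the
# edge (1, 2) has multiplicity 2 (stored as an integer edge count).
A_multi = np.array([[1., 1., 0.],
                    [1., 0., 2.],
                    [0., 2., 0.]])

# The "mis-simplified" version of the same graph: self-loops removed
# and all edges forced to be unweighted (0/1).
A_simple = (A_multi > 0).astype(float)
np.fill_diagonal(A_simple, 0.0)

H = np.eye(3)  # one-hot node features

print(message_pass(A_multi, H))   # aggregation sees loop + multiplicity
print(message_pass(A_simple, H))  # that structure has been discarded
```

The point of the sketch is only that the two aggregations are no longer equal: once self-loops and multiplicities are stripped, no choice of downstream weights can recover the discarded counts from the binary adjacency alone.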
