Paper Title

Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning via Gaussian Processes

Authors

Jilin Hu, Jianbing Shen, Bin Yang, Ling Shao

Abstract

Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification, but little work has been done to explore their theoretical properties. Recently, several deep neural networks with infinitely many hidden units, e.g., fully connected and convolutional networks, have been shown to be equivalent to Gaussian processes (GPs). To exploit both the powerful representational capacity of GCNs and the great expressive power of GPs, we investigate similar properties of infinitely wide GCNs. More specifically, we propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning. In the process, we formulate the kernel matrix computation of GPGC in an iterative analytical form. Finally, we derive the conditional distribution of the labels of unobserved nodes given the graph structure, the labels of the observed nodes, and the feature matrix of all nodes. We conduct extensive experiments to evaluate the semi-supervised classification performance of GPGC and demonstrate that it outperforms other state-of-the-art methods by a clear margin on all the datasets while remaining efficient.
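The abstract only sketches the GPGC pipeline, so the following is a minimal NumPy sketch of the final conditioning step it describes: standard GP regression applied to a graph-aware kernel. The function `normalized_propagation_kernel` is a hypothetical stand-in (a linear kernel on one step of normalized graph propagation), not the paper's iterative analytical kernel; the posterior formula in `gp_posterior`, however, is the standard GP conditional the abstract refers to.

```python
import numpy as np

def normalized_propagation_kernel(A, X):
    """Illustrative stand-in kernel: (A_hat X)(A_hat X)^T with
    A_hat = D^{-1/2} (A + I) D^{-1/2}. NOT the paper's GPGC kernel,
    which is computed by an iterative analytical recursion."""
    A_tilde = A + np.eye(A.shape[0])            # add self-loops
    d = A_tilde.sum(axis=1)
    A_hat = A_tilde / np.sqrt(np.outer(d, d))   # symmetric normalization
    H = A_hat @ X                               # one step of feature propagation
    return H @ H.T

def gp_posterior(K, Y_obs, obs_idx, unobs_idx, noise=1e-2):
    """Standard GP regression conditional for the unobserved node labels:
    p(y_u | y_o) = N(K_uo K_oo^{-1} y_o, K_uu - K_uo K_oo^{-1} K_ou)."""
    K_oo = K[np.ix_(obs_idx, obs_idx)] + noise * np.eye(len(obs_idx))
    K_uo = K[np.ix_(unobs_idx, obs_idx)]
    K_uu = K[np.ix_(unobs_idx, unobs_idx)]
    alpha = np.linalg.solve(K_oo, Y_obs)                 # K_oo^{-1} y_o
    mean = K_uo @ alpha                                  # posterior mean
    cov = K_uu - K_uo @ np.linalg.solve(K_oo, K_uo.T)    # posterior covariance
    return mean, cov

# Toy usage: a 4-node path graph, 2-d features, one-hot labels on two observed nodes.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 2)
Y_obs = np.array([[1, 0], [0, 1]], dtype=float)   # one-hot labels for nodes 0 and 3
K = normalized_propagation_kernel(A, X)
mean, cov = gp_posterior(K, Y_obs, obs_idx=[0, 3], unobs_idx=[1, 2])
pred = mean.argmax(axis=1)                        # predicted classes for nodes 1 and 2
```

In this toy setup, semi-supervised classification reduces to treating the one-hot label columns as independent regression targets and taking the argmax of the posterior mean for each unobserved node.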
