Paper Title
Locality Preserving Dense Graph Convolutional Networks with Graph Context-Aware Node Representations
Paper Authors
Paper Abstract
Graph convolutional networks (GCNs) have been widely used for representation learning on graph data; they capture structural patterns on a graph via specifically designed convolution and readout operations. In many graph classification applications, GCN-based approaches have outperformed traditional methods. However, most existing GCNs are inefficient at preserving the local information of graphs, a limitation that is especially problematic for graph classification. In this work, we propose a locality-preserving dense GCN with graph context-aware node representations. Specifically, the proposed model incorporates a local node feature reconstruction module, realized via a simple but effective encoder-decoder mechanism, to preserve the initial node features within the node representations. To capture local structural patterns in neighbourhoods representing different ranges of locality, dense connectivity is introduced to connect each convolutional layer and its corresponding readout with all previous convolutional layers. To enhance node representativeness, the output of each convolutional layer is concatenated with the output of the previous layer's readout to form a global context-aware node representation. In addition, a self-attention module is introduced to aggregate the layer-wise representations into the final graph representation. Experiments on benchmark datasets demonstrate the superiority of the proposed model over state-of-the-art methods in terms of classification accuracy.
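The abstract names four concrete mechanisms: an encoder-decoder module that reconstructs the initial node features, dense connectivity across convolutional layers, concatenation of each layer's input with the previous readout to inject graph context, and self-attention over layer-wise readouts. The following PyTorch sketch wires these pieces together as a rough illustration only; all module names, dimension choices, the dense-adjacency GCN layer, and the mean readout are assumptions made for this sketch, not details taken from the paper.

```python
# Minimal sketch of the abstract's architecture, under the assumptions
# stated above. Not the authors' implementation.
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph convolution on a dense normalized adjacency: ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):  # a_hat: (N, N), h: (N, in_dim)
        return torch.relu(a_hat @ self.lin(h))


class LocalityPreservingDenseGCN(nn.Module):
    def __init__(self, in_dim, hid, num_layers=3):
        super().__init__()
        # Encoder-decoder for local node feature reconstruction: the decoder
        # rebuilds the raw node features from the encoding, and the resulting
        # reconstruction loss keeps initial features preserved in the encoding.
        self.encoder = nn.Linear(in_dim, hid)
        self.decoder = nn.Linear(hid, in_dim)
        # Dense connectivity: layer l consumes the concatenation of all
        # previous layer outputs, plus the previous readout as graph context.
        self.convs = nn.ModuleList()
        for l in range(num_layers):
            in_l = hid * (l + 1) + (hid if l > 0 else 0)
            self.convs.append(GCNLayer(in_l, hid))
        # Self-attention scores over the layer-wise graph representations.
        self.att = nn.Linear(hid, 1)

    def forward(self, a_hat, x):
        h0 = torch.relu(self.encoder(x))
        recon_loss = ((self.decoder(h0) - x) ** 2).mean()

        feats, readouts = [h0], []
        for l, conv in enumerate(self.convs):
            h_in = torch.cat(feats, dim=1)  # dense skip connections
            if l > 0:
                # Broadcast the previous readout to every node so each node
                # representation becomes graph context-aware.
                ctx = readouts[-1].expand(h_in.size(0), -1)
                h_in = torch.cat([h_in, ctx], dim=1)
            h = conv(a_hat, h_in)
            feats.append(h)
            readouts.append(h.mean(dim=0, keepdim=True))  # mean readout, assumed

        # Self-attention aggregation of the layer-wise readouts.
        stack = torch.cat(readouts, dim=0)             # (num_layers, hid)
        alpha = torch.softmax(self.att(stack), dim=0)  # (num_layers, 1)
        graph_repr = (alpha * stack).sum(dim=0)        # (hid,)
        return graph_repr, recon_loss
```

In training, recon_loss would be added to the classification loss as a regularizer, so that node representations remain decodable back to the raw features; this is the sense in which the reconstruction module preserves locality.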