Paper Title


HopGAT: Hop-aware Supervision Graph Attention Networks for Sparsely Labeled Graphs

Paper Authors

Chaojie Ji, Ruxin Wang, Rongxiang Zhu, Yunpeng Cai, Hongyan Wu

Paper Abstract

Due to the cost of labeling nodes, classifying a node in a sparsely labeled graph while maintaining prediction accuracy deserves attention. The key point is how the algorithm learns sufficient information from neighbors at different hop distances. This study first proposes a hop-aware attention supervision mechanism for the node classification task. A simulated annealing learning strategy is then adopted to balance the two learning tasks, node classification and hop-aware attention coefficient supervision, along the training timeline. Compared with state-of-the-art models, the experimental results demonstrate the superior effectiveness of the proposed Hop-aware Supervision Graph Attention Networks (HopGAT) model. In particular, for the protein-protein interaction network, with only 40% of nodes labeled, the performance loss is just 3.9% (from 98.5% to 94.6%) compared to the fully labeled graph. Extensive experiments also demonstrate the effectiveness of the supervised attention coefficients and the learning strategy.
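The abstract describes balancing two objectives (node classification and attention supervision) with a simulated-annealing-style schedule over the training timeline. The paper's exact schedule is not given here, so the decay form, the time constant `tau`, and the function names below are illustrative assumptions, not HopGAT's actual formula:

```python
import math

def attention_loss_weight(epoch: int, tau: float = 50.0) -> float:
    """Hypothetical annealing schedule: the weight on the hop-aware
    attention-supervision loss decays exponentially with the epoch,
    so attention supervision dominates early training and node
    classification dominates later."""
    return math.exp(-epoch / tau)

def combined_loss(cls_loss: float, attn_loss: float, epoch: int) -> float:
    """Total loss: classification loss plus the annealed attention term."""
    return cls_loss + attention_loss_weight(epoch) * attn_loss
```

Under this sketch, at epoch 0 both losses contribute fully, and by late training the attention term is nearly switched off, which matches the abstract's idea of rebalancing the two tasks along the training timeline.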
