Title

TPGNN: Learning High-order Information in Dynamic Graphs via Temporal Propagation

Authors

Zehong Wang, Qi Li, Donghua Yu

Abstract

A temporal graph is an abstraction for modeling dynamic systems that consist of evolving, interacting elements. In this paper, we aim to solve an important yet neglected problem -- how to learn information from high-order neighbors in temporal graphs? -- to enhance the informativeness and discriminativeness of the learned node representations. We argue that when learning high-order information from temporal graphs, we encounter two challenges, i.e., computational inefficiency and over-smoothing, that cannot be solved by conventional techniques applied to static graphs. To remedy these deficiencies, we propose a temporal propagation-based graph neural network, namely TPGNN. To be specific, the model consists of two distinct components, i.e., a propagator and a node-wise encoder. The propagator is leveraged to propagate messages from the anchor node to its temporal neighbors within $k$ hops, and then simultaneously update the state of those neighborhoods, which enables efficient computation, especially for a deep model. In addition, to prevent over-smoothing, the model compels messages from $n$-hop neighbors to update only the $n$-hop memory vector preserved on the anchor. The node-wise encoder adopts a transformer architecture to learn node representations by explicitly learning the importance of the memory vectors preserved on the node itself, that is, implicitly modeling the importance of messages from neighbors at different hops, thus mitigating over-smoothing. Since the encoding process does not query temporal neighbors, we can dramatically reduce time consumption at inference. Extensive experiments on temporal link prediction and node classification demonstrate the superiority of TPGNN over state-of-the-art baselines in efficiency and robustness.
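The abstract's core mechanism can be illustrated with a minimal sketch: each node keeps one memory vector per hop distance, a message from an $n$-hop neighbor updates only the $n$-hop memory slot, and the encoder attends over the per-hop memories instead of querying neighbors. All names, dimensions, and update rules below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

K, D = 3, 8  # hops tracked and memory dimension (illustrative choices)
rng = np.random.default_rng(0)

class NodeState:
    """Per-node state: one memory vector for each hop distance 1..K."""
    def __init__(self):
        self.memory = np.zeros((K, D))

    def update(self, hop, message, alpha=0.5):
        # A message from an n-hop neighbor updates only the n-hop slot,
        # so high-order information never overwrites low-order memories.
        self.memory[hop - 1] = (1 - alpha) * self.memory[hop - 1] + alpha * message

def encode(state, query):
    # Toy attention over the per-hop memories: the encoder weighs how much
    # each hop contributes, rather than averaging all hops together.
    scores = state.memory @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ state.memory  # shape (D,)

node = NodeState()
node.update(1, rng.normal(size=D))  # message arriving from a 1-hop neighbor
node.update(3, rng.normal(size=D))  # message arriving from a 3-hop neighbor
rep = encode(node, rng.normal(size=D))
print(rep.shape)  # (8,)
```

Note that encoding reads only the node's own memories, which matches the abstract's claim that inference needs no neighbor queries; the hop-2 slot here stays zero because no 2-hop message arrived.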
