Paper Title
Metapath- and Entity-aware Graph Neural Network for Recommendation
Paper Authors
Paper Abstract
In graph neural networks (GNNs), message passing iteratively aggregates nodes' information from their direct neighbors while neglecting the sequential nature of multi-hop node connections. Such sequential node connections, e.g., metapaths, capture critical insights for downstream tasks. Concretely, in recommender systems (RSs), disregarding these insights leads to inadequate distillation of collaborative signals. In this paper, we employ collaborative subgraphs (CSGs) and metapaths to form metapath-aware subgraphs, which explicitly capture sequential semantics in graph structures. We propose the metaPath- and Entity-Aware Graph Neural Network (PEAGNN), which trains multilayer GNNs to perform metapath-aware information aggregation on such subgraphs. The aggregated information from different metapaths is then fused using an attention mechanism. Finally, PEAGNN yields node and subgraph representations, which are used to train an MLP that predicts scores for target user-item pairs. To leverage the local structure of CSGs, we present entity-awareness, which acts as a contrastive regularizer on node embeddings. Moreover, PEAGNN can be combined with prominent GNN layers such as GAT, GCN and GraphSage. Our empirical evaluation shows that the proposed technique outperforms competitive baselines on several datasets for recommendation tasks. Further analysis demonstrates that PEAGNN also learns meaningful metapath combinations from a given set of metapaths.
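To make the fusion and scoring steps summarized in the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation: per-metapath node representations (assumed here to be the outputs of the metapath-aware GNN layers, stubbed as random tensors) are fused with a learned attention mechanism, and a small MLP scores each user-item pair. The names MetapathFusion and ScorePredictor, the dimensions, and the hidden sizes are illustrative assumptions; the subgraph construction and the entity-aware contrastive regularizer are omitted.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MetapathFusion(nn.Module):
    """Fuse per-metapath embeddings of a node with learned attention weights."""

    def __init__(self, emb_dim: int):
        super().__init__()
        self.att = nn.Linear(emb_dim, 1, bias=False)  # scores each metapath view

    def forward(self, metapath_embs: torch.Tensor) -> torch.Tensor:
        # metapath_embs: (batch, n_metapaths, emb_dim)
        scores = self.att(metapath_embs).squeeze(-1)        # (batch, n_metapaths)
        weights = F.softmax(scores, dim=-1).unsqueeze(-1)   # (batch, n_metapaths, 1)
        return (weights * metapath_embs).sum(dim=1)         # (batch, emb_dim)


class ScorePredictor(nn.Module):
    """MLP that predicts a matching score for a fused user-item pair."""

    def __init__(self, emb_dim: int, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * emb_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, user_emb: torch.Tensor, item_emb: torch.Tensor) -> torch.Tensor:
        return self.mlp(torch.cat([user_emb, item_emb], dim=-1)).squeeze(-1)


if __name__ == "__main__":
    batch, n_metapaths, emb_dim = 8, 3, 32
    # Stand-ins for metapath-aware GNN outputs on user- and item-rooted subgraphs.
    user_views = torch.randn(batch, n_metapaths, emb_dim)
    item_views = torch.randn(batch, n_metapaths, emb_dim)

    fuse = MetapathFusion(emb_dim)
    score = ScorePredictor(emb_dim)
    print(score(fuse(user_views), fuse(item_views)).shape)  # torch.Size([8])

In this sketch, the softmax attention weights over metapath views play the role of the fusion step described above, so a trained model exposes which metapaths contribute most to a given prediction.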