Paper Title

Pre-Training Graph Neural Networks for Cold-Start Users and Items Representation

Paper Authors

Bowen Hao, Jing Zhang, Hongzhi Yin, Cuiping Li, Hong Chen

Paper Abstract

The cold-start problem is a fundamental challenge for recommendation tasks. Although recent advances in Graph Neural Networks (GNNs) incorporate the high-order collaborative signal to alleviate the problem, the embeddings of cold-start users and items are not explicitly optimized, and the cold-start neighbors are not dealt with during the graph convolution in GNNs. This paper proposes to pre-train a GNN model before applying it for recommendation. Unlike the goal of recommendation, the pre-training GNN simulates cold-start scenarios from the users/items with sufficient interactions and takes embedding reconstruction as the pretext task, such that it can directly improve the embedding quality and can be easily adapted to new cold-start users/items. To further reduce the impact of cold-start neighbors, we incorporate a self-attention-based meta aggregator to enhance the aggregation ability of each graph convolution step, and an adaptive neighbor sampler to select effective neighbors according to the feedback from the pre-training GNN model. Experiments on three public recommendation datasets show the superiority of our pre-training GNN model over the original GNN models on user/item embedding inference and the recommendation task.
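The embedding-reconstruction pretext task described in the abstract can be pictured roughly as follows. This is a minimal sketch, assuming a PyTorch setting in which `gnn` aggregates only a handful of sampled neighbors per node (the simulated cold-start view) and `target_emb` holds ground-truth embeddings learned beforehand from the full interaction data; all names and the exact interface are illustrative assumptions, not the authors' released code.

```python
import torch.nn.functional as F

def reconstruction_loss(gnn, target_emb, node_ids, sampled_neighbors):
    # Simulated cold-start: the GNN sees only a few sampled neighbors
    # for each target user/item and predicts its embedding.
    pred_emb = gnn(node_ids, sampled_neighbors)        # [batch, dim]
    # Ground-truth embeddings pre-learned from abundant interactions.
    true_emb = target_emb[node_ids]                    # [batch, dim]
    # Reconstruction objective: make the predicted embedding align
    # with the ground-truth one (cosine-similarity-based loss).
    return (1.0 - F.cosine_similarity(pred_emb, true_emb, dim=-1)).mean()
```

In this framing, the self-attention-based meta aggregator would live inside how `gnn` combines neighbor messages at each convolution step, and the adaptive neighbor sampler would decide which `sampled_neighbors` are fed in, guided by feedback from the pre-training loss.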
