Paper Title

Personalized Prompt for Sequential Recommendation

Authors

Yiqing Wu, Ruobing Xie, Yongchun Zhu, Fuzhen Zhuang, Xu Zhang, Leyu Lin, Qing He

Abstract

Pre-training models have shown their power in sequential recommendation. Recently, prompt tuning has been widely explored and verified in NLP pre-training: it helps extract useful knowledge from pre-trained models more effectively and efficiently for downstream tasks, especially in cold-start scenarios. However, it is challenging to bring prompt tuning from NLP to recommendation, since the tokens in recommendation (i.e., items) do not have explicit, explainable semantics, and the sequence modeling should be personalized. In this work, we first introduce prompts to recommendation and propose a novel Personalized Prompt-based Recommendation (PPR) framework for cold-start recommendation. Specifically, we build personalized soft prefix prompts via a prompt generator based on user profiles, and enable sufficient training of these prompts via prompt-oriented contrastive learning with both prompt- and behavior-based augmentations. We conduct extensive evaluations on various tasks. In both few-shot and zero-shot recommendation, PPR models achieve significant improvements over baselines on various metrics across three large-scale open datasets. We also conduct ablation tests and sparsity analysis for a better understanding of PPR. Moreover, we verify PPR's universality across different pre-training models and explore other promising downstream tasks for PPR, including cross-domain recommendation and user profile prediction.
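The abstract describes two components: a prompt generator that maps user profile features to personalized soft prefix prompts prepended to the behavior sequence, and a prompt-oriented contrastive objective. Below is a minimal PyTorch sketch of how such a generator and an InfoNCE-style contrastive loss might look. All names, shapes, and architectural choices here (an embedding-plus-MLP generator, in-batch negatives) are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PromptGenerator(nn.Module):
    """Hypothetical sketch: maps categorical user-profile attributes
    to a sequence of soft prefix prompt embeddings."""

    def __init__(self, num_profile_fields, profile_vocab_size,
                 hidden_dim, prompt_len):
        super().__init__()
        self.profile_emb = nn.Embedding(profile_vocab_size, hidden_dim)
        self.mlp = nn.Sequential(
            nn.Linear(num_profile_fields * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, prompt_len * hidden_dim),
        )
        self.prompt_len = prompt_len
        self.hidden_dim = hidden_dim

    def forward(self, profile_ids):
        # profile_ids: (batch, num_profile_fields) categorical attributes
        feats = self.profile_emb(profile_ids).flatten(start_dim=1)
        prompts = self.mlp(feats)
        # (batch, prompt_len, hidden_dim): per-user soft prefix tokens
        return prompts.view(-1, self.prompt_len, self.hidden_dim)


def prepend_prompt(prompts, item_emb_seq):
    # Prepend the personalized soft prompts to the item embedding
    # sequence before feeding it to a (frozen) pre-trained sequential
    # encoder such as SASRec.
    return torch.cat([prompts, item_emb_seq], dim=1)


def info_nce(z1, z2, temperature=0.1):
    # In-batch contrastive loss between two augmented views of the same
    # user representation (e.g., one from a prompt-based augmentation,
    # one from a behavior-based augmentation).
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)
```

In a few-shot setting along these lines, the pre-trained sequential encoder would stay frozen (or be lightly tuned), consuming the prompt-prefixed sequence, while the generator is trained with the recommendation loss plus the contrastive term over the two augmented views.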
