Paper Title

Contrastive Learning with Bidirectional Transformers for Sequential Recommendation

Paper Authors

Hanwen Du, Hui Shi, Pengpeng Zhao, Deqing Wang, Victor S. Sheng, Yanchi Liu, Guanfeng Liu, Lei Zhao

Abstract

Contrastive learning with Transformer-based sequence encoders has gained predominance for sequential recommendation. It maximizes the agreement between paired sequence augmentations that share similar semantics. However, existing contrastive learning approaches in sequential recommendation mainly center upon left-to-right unidirectional Transformers as base encoders, which is suboptimal because user behaviors may not follow a rigid left-to-right order. To tackle this, we propose a novel framework named \textbf{C}ontrastive learning with \textbf{Bi}directional \textbf{T}ransformers for sequential recommendation (\textbf{CBiT}). Specifically, we first apply a sliding window technique to long user sequences in bidirectional Transformers, which allows for a more fine-grained division of user sequences. Then we combine the cloze task mask and the dropout mask to generate high-quality positive samples and perform multi-pair contrastive learning, which demonstrates better performance and adaptability compared with the normal one-pair contrastive learning. Moreover, we introduce a novel dynamic loss reweighting strategy to balance the cloze task loss and the contrastive loss. Experimental results on three public benchmark datasets show that our model outperforms state-of-the-art models for sequential recommendation.
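As a rough illustration of the training objective sketched in the abstract, the PyTorch snippet below combines a cloze (masked item prediction) loss with a multi-pair InfoNCE-style contrastive loss computed over several representations of the same user sequence obtained under different cloze/dropout masks, and mixes the two losses with a weighting factor. This is only a minimal sketch under assumed interfaces: names such as `encoder`, `alpha`, `num_views`, and `temperature` are hypothetical placeholders, not the authors' released implementation (which additionally reweights the losses dynamically during training).

```python
# Hypothetical sketch of a CBiT-style objective: cloze loss + multi-pair
# contrastive loss over several masked "views" of the same sequence.
import torch
import torch.nn.functional as F

def multi_pair_contrastive_loss(views, temperature=0.1):
    """views: list of [batch, hidden] sequence representations, each produced
    by the same bidirectional encoder under a different cloze/dropout mask."""
    loss, num_pairs = 0.0, 0
    for i in range(len(views)):
        for j in range(len(views)):
            if i == j:
                continue
            z_i = F.normalize(views[i], dim=-1)
            z_j = F.normalize(views[j], dim=-1)
            logits = z_i @ z_j.t() / temperature            # [batch, batch]
            labels = torch.arange(z_i.size(0), device=z_i.device)
            loss = loss + F.cross_entropy(logits, labels)   # positives on the diagonal
            num_pairs += 1
    return loss / max(num_pairs, 1)

def training_step(encoder, batch, num_views=3, alpha=0.5, temperature=0.1):
    """One step: masked item prediction (cloze) loss plus the multi-pair
    contrastive loss, combined via a weighting factor alpha."""
    cloze_losses, views = [], []
    for _ in range(num_views):
        # Hypothetical encoder API: returns per-position item logits
        # [batch, seq_len, num_items] and a sequence representation [batch, hidden].
        logits, seq_repr = encoder(batch["item_ids"])
        masked = batch["masked_positions"]                  # boolean mask [batch, seq_len]
        cloze_losses.append(
            F.cross_entropy(logits[masked], batch["masked_targets"]))
        views.append(seq_repr)
    cloze_loss = torch.stack(cloze_losses).mean()
    cl_loss = multi_pair_contrastive_loss(views, temperature)
    # The paper rescales this weight dynamically; a fixed alpha is used here.
    return cloze_loss + alpha * cl_loss
```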
