Paper Title

Recency Dropout for Recurrent Recommender Systems

Paper Authors

Bo Chang, Can Xu, Matthieu Lê, Jingchen Feng, Ya Le, Sriraj Badam, Ed Chi, Minmin Chen

Paper Abstract

Recurrent recommender systems have been successful in capturing the temporal dynamics in users' activity trajectories. However, recurrent neural networks (RNNs) are known to have difficulty learning long-term dependencies. As a consequence, RNN-based recommender systems tend to overly focus on short-term user interests. This is referred to as the recency bias, which could negatively affect the long-term user experience as well as the health of the ecosystem. In this paper, we introduce the recency dropout technique, a simple yet effective data augmentation technique to alleviate the recency bias in recurrent recommender systems. We demonstrate the effectiveness of recency dropout in various experimental settings including a simulation study, offline experiments, as well as live experiments on a large-scale industrial recommendation platform.
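To make the idea concrete, here is a minimal sketch of recency dropout as a data augmentation step. The abstract does not spell out the exact mechanism, so this assumes it means randomly removing the most recent items from a user's activity sequence during training, forcing the RNN to rely on longer-term history; the function name and hyperparameters (`max_drop`, `drop_prob`) are illustrative, not from the paper.

```python
import random
from typing import List, Sequence


def recency_dropout(sequence: Sequence, max_drop: int = 5, drop_prob: float = 0.5) -> List:
    """Return a training copy of `sequence` with some of its most recent items dropped.

    With probability `drop_prob`, removes a random number (1..max_drop) of the
    newest items; otherwise the sequence is returned unchanged. The values of
    `max_drop` and `drop_prob` here are placeholders, not values from the paper.
    """
    items = list(sequence)
    if len(items) > 1 and random.random() < drop_prob:
        k = random.randint(1, min(max_drop, len(items) - 1))
        items = items[:-k]  # drop the k most recent events
    return items


# Example: augmenting one user's activity history before feeding it to the RNN.
history = ["item_12", "item_7", "item_93", "item_41", "item_5"]
augmented = recency_dropout(history, max_drop=3)
```

Because the augmentation only edits the input sequences, it leaves the recommender architecture itself unchanged, which is consistent with the abstract's framing of recency dropout as a simple data augmentation technique.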
