Paper Title

Dynamic collaborative filtering Thompson Sampling for cross-domain advertisements recommendation

Authors

Shion Ishikawa, Young-joo Chung, Yu Hirate

Abstract

Recently, online advertisers have been utilizing recommender systems (RSs) for display advertising to improve user engagement. The contextual bandit model is a widely used RS that balances exploiting and exploring user engagement to maximize long-term rewards such as clicks or conversions. However, current models aim to optimize a set of ads only within a specific domain and do not share information with models in other domains. In this paper, we propose dynamic collaborative filtering Thompson Sampling (DCTS), a novel yet simple model for transferring knowledge among multiple bandit models. DCTS exploits similarities between users and between ads to estimate the prior distribution of Thompson sampling. These similarities are computed from the contextual features of users and ads, and they enable a model in a domain with little data to converge more quickly through transferred knowledge. Moreover, DCTS incorporates temporal dynamics to track recent changes in user preferences. We first show that transferring knowledge and incorporating temporal dynamics improve the performance of baseline models on a synthetic dataset. We then conduct an empirical analysis on a real-world dataset, which shows that DCTS improves click-through rate (CTR) by 9.7% over state-of-the-art models. We also analyze the hyper-parameters that adjust temporal dynamics and similarity, and report the settings that maximize CTR.
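The two mechanisms the abstract describes can be sketched in code: a Beta-Bernoulli Thompson sampler whose prior for a new (target-domain) ad is a similarity-weighted average of source-domain click-through rates, plus an exponential decay of pseudo-counts to track recent preference shifts. This is a minimal illustrative sketch, not the paper's actual algorithm; the function names, the cosine similarity, the `strength` pseudo-count scale, and the decay factor `gamma` are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def similarity(x, y):
    # Cosine similarity between contextual feature vectors (illustrative choice).
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

def transferred_prior(target_feat, source_feats, source_stats, strength=5.0):
    """Estimate a Beta(alpha, beta) prior for a target-domain ad.

    source_stats: list of (clicks, impressions) for each source-domain ad.
    The prior mean is a similarity-weighted average of source CTRs, scaled
    into `strength` pseudo-observations (both hypothetical design choices).
    """
    sims = np.array([similarity(target_feat, f) for f in source_feats])
    ctrs = np.array([c / max(n, 1) for c, n in source_stats])
    weights = sims / sims.sum()
    mean = float(weights @ ctrs)
    return strength * mean + 1.0, strength * (1.0 - mean) + 1.0

def thompson_step(alphas, betas):
    # Standard Thompson sampling: draw one CTR sample per ad, show the argmax.
    samples = rng.beta(alphas, betas)
    return int(np.argmax(samples))

def decay(alphas, betas, gamma=0.99):
    # Exponentially decay pseudo-counts toward the uniform Beta(1, 1) prior
    # so that stale evidence fades and recent preference changes dominate.
    return 1.0 + gamma * (alphas - 1.0), 1.0 + gamma * (betas - 1.0)

# Source domain: two ads with observed click statistics (CTRs 0.30 and 0.05).
source_feats = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
source_stats = [(30, 100), (5, 100)]

# Target-domain ads inherit informative priors via feature similarity,
# so the bandit starts closer to the truth than with a flat Beta(1, 1).
target_feats = [np.array([0.9, 0.1]), np.array([0.1, 0.9])]
priors = [transferred_prior(f, source_feats, source_stats) for f in target_feats]
alphas = np.array([a for a, _ in priors])
betas = np.array([b for _, b in priors])
```

Under this sketch, the target ad whose features resemble the high-CTR source ad starts with a higher prior mean, which is the sense in which transferred knowledge lets a data-scarce domain converge faster.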
