Paper Title

Predictive Edge Caching through Deep Mining of Sequential Patterns in User Content Retrievals

Paper Authors

Chen Li, Xiaoyu Wang, Tongyu Zong, Houwei Cao, Yong Liu

Paper Abstract

Edge caching plays an increasingly important role in boosting user content retrieval performance while reducing redundant network traffic. The effectiveness of caching ultimately hinges on the accuracy of predicting content popularity in the near future. However, at the network edge, content popularity can be extremely dynamic due to diverse user content retrieval behaviors and the low degree of user multiplexing. It is challenging for traditional reactive caching systems to keep up with dynamic content popularity patterns. In this paper, we propose a novel Predictive Edge Caching (PEC) system that predicts future content popularity using fine-grained learning models that mine sequential patterns in user content retrieval behaviors, and opportunistically prefetches contents predicted to be popular in the near future using idle network bandwidth. Through extensive experiments driven by real content retrieval traces, we demonstrate that PEC can adapt to highly dynamic content popularity, significantly improve the cache hit ratio, and reduce user content retrieval latency compared with state-of-the-art caching policies. More broadly, our study demonstrates that edge caching performance can be boosted by deep mining of user content retrieval behaviors.
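
The abstract describes two mechanisms: mining sequential patterns in per-user retrieval histories to predict which contents will soon become popular, and prefetching those contents into the edge cache during idle network periods. The following is a minimal Python sketch of that general idea only, not the paper's actual fine-grained learning models or prefetch scheduler; it assumes a toy first-order Markov predictor and an LRU cache with a fixed prefetch budget, and all names (SequentialPopularityPredictor, PrefetchingEdgeCache, etc.) are illustrative.

```python
# Toy illustration of sequence-based popularity prediction + opportunistic
# prefetching at an edge cache. This is NOT the PEC implementation from the
# paper; the per-user Markov model below stands in for its learning models.
from collections import defaultdict, OrderedDict


class SequentialPopularityPredictor:
    """First-order Markov model over each user's content retrieval sequence."""

    def __init__(self):
        # transitions[user][prev_content][next_content] -> observed count
        self.transitions = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
        self.last_content = {}  # user -> most recently retrieved content

    def observe(self, user, content):
        prev = self.last_content.get(user)
        if prev is not None:
            self.transitions[user][prev][content] += 1
        self.last_content[user] = content

    def predicted_popularity(self):
        """Aggregate per-user next-content probabilities into edge-level scores."""
        scores = defaultdict(float)
        for user, prev in self.last_content.items():
            nexts = self.transitions[user].get(prev, {})
            total = sum(nexts.values())
            for content, count in nexts.items():
                scores[content] += count / total  # probability mass from this user
        return scores


class PrefetchingEdgeCache:
    """LRU cache that uses idle-time bandwidth to prefetch predicted contents."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # cached contents, in LRU order

    def request(self, content):
        hit = content in self.store
        self._admit(content)  # on a miss, fetch and cache the content
        return hit

    def prefetch(self, scores, budget):
        # Emulates opportunistic prefetching: during idle periods, fetch up to
        # `budget` of the highest-scored contents that are not yet cached.
        for content in sorted(scores, key=scores.get, reverse=True)[:budget]:
            if content not in self.store:
                self._admit(content)

    def _admit(self, content):
        self.store[content] = None
        self.store.move_to_end(content)
        while len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used


if __name__ == "__main__":
    predictor = SequentialPopularityPredictor()
    cache = PrefetchingEdgeCache(capacity=3)
    # Hypothetical (user, content) retrieval trace.
    trace = [("u1", "A"), ("u1", "B"), ("u2", "A"), ("u1", "A"),
             ("u2", "B"), ("u1", "B"), ("u2", "A"), ("u1", "A")]
    hits = 0
    for user, content in trace:
        hits += cache.request(content)
        predictor.observe(user, content)
        cache.prefetch(predictor.predicted_popularity(), budget=1)  # idle-time step
    print(f"hit ratio: {hits / len(trace):.2f}")
```

In this sketch the prefetch budget plays the role of the idle bandwidth available between requests; the paper's system instead schedules prefetching based on measured spare capacity and much richer popularity predictions.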
