Paper Title

Learning Forward Reuse Distance

Paper Authors

Pengcheng Li and Yongbin Gu

Paper Abstract

Caching techniques are widely used in the era of cloud computing, from applications such as Web caches, to infrastructures such as Memcached, to memory caches in computer architectures. Prediction of future accesses to cached data can greatly help improve cache management and performance. The recent advancement of deep learning techniques enables the design of novel intelligent cache replacement policies. In this work, we propose a learning-aided approach to predict future data accesses. We find that a powerful LSTM-based recurrent neural network model can provide high prediction accuracy using only a cache trace as input. The high accuracy results from a carefully crafted locality-driven feature design. Inspired by the high prediction accuracy, we propose a pseudo-OPT policy and evaluate it on 13 real-world storage workloads from Microsoft Research. Results demonstrate that the new cache policy improves on state-of-the-art practical policies by up to 19.2% and incurs only a 2.3% higher miss ratio than OPT on average.
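The abstract only outlines the approach, so below is a minimal sketch of the two ideas it names, under stated assumptions rather than the authors' actual design: it assumes recent backward reuse distances serve as the locality-driven features, trains a single-layer LSTM to regress the (log-scaled) forward reuse distance of each access, and applies a pseudo-OPT rule that evicts the resident block with the largest predicted forward reuse distance. All names here (ReusePredictor, pseudo_opt_victim, the synthetic trace) are hypothetical.

```python
# Minimal illustrative sketch, not the authors' implementation.
import random
import torch
import torch.nn as nn


def forward_reuse_distances(trace):
    """Label each access with the number of accesses until the same block
    is touched again (trace length if it never recurs)."""
    next_pos, labels = {}, [len(trace)] * len(trace)
    for i in range(len(trace) - 1, -1, -1):
        if trace[i] in next_pos:
            labels[i] = next_pos[trace[i]] - i
        next_pos[trace[i]] = i
    return labels


def backward_reuse_features(trace, window=8):
    """Assumed locality features: the last `window` backward reuse distances
    observed in the trace, zero-padded at the start."""
    last_seen, history, feats = {}, [], []
    for i, block in enumerate(trace):
        history.append(float(i - last_seen.get(block, i)))
        last_seen[block] = i
        feats.append([0.0] * max(0, window - len(history)) + history[-window:])
    return feats


class ReusePredictor(nn.Module):
    """LSTM over the feature window, then a linear head regressing the
    log-scaled forward reuse distance."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)


def pseudo_opt_victim(resident_blocks, predicted_distance):
    """Pseudo-OPT replacement: evict the resident block predicted to be
    reused furthest in the future, mirroring Belady's OPT."""
    return max(resident_blocks, key=lambda b: predicted_distance[b])


if __name__ == "__main__":
    # Tiny synthetic trace; the paper evaluates on real MSR storage workloads.
    random.seed(0)
    trace = [random.randrange(32) for _ in range(2000)]
    x = torch.tensor(backward_reuse_features(trace)).unsqueeze(-1)
    y = torch.log1p(torch.tensor(forward_reuse_distances(trace), dtype=torch.float))

    model = ReusePredictor()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(30):                        # short training loop for the sketch
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()

    # Use the prediction at each block's latest access to pick an eviction victim.
    with torch.no_grad():
        pred = torch.expm1(model(x))
    resident = list(dict.fromkeys(reversed(trace)))[:4]   # a few recently used blocks
    last_idx = {b: max(i for i, t in enumerate(trace) if t == b) for b in resident}
    dist = {b: float(pred[last_idx[b]]) for b in resident}
    print("evict block:", pseudo_opt_victim(resident, dist))
```

In a full cache simulator, the predictions would be refreshed as the trace streams in and the resulting eviction decisions compared against practical policies and Belady's OPT, as the paper's evaluation does.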
