Paper Title

Robust Data-Driven Accelerated Mirror Descent

Paper Authors

Hong Ye Tan, Subhadip Mukherjee, Junqi Tang, Andreas Hauptmann, Carola-Bibiane Schönlieb

Paper Abstract

Learning-to-optimize is an emerging framework that leverages training data to speed up the solution of certain optimization problems. One such approach is based on the classical mirror descent algorithm, where the mirror map is modelled using input-convex neural networks. In this work, we extend this functional parameterization approach by introducing momentum into the iterations, based on the classical accelerated mirror descent. Our approach combines short-time accelerated convergence with stable long-time behavior. We empirically demonstrate additional robustness with respect to multiple parameters on denoising and deconvolution experiments.
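The classical accelerated mirror descent iteration that the paper builds on can be sketched as follows. This is a minimal illustration using the quadratic mirror map ψ(x) = ½‖x‖², for which the mirror (dual) step reduces to plain gradient subtraction; the paper's approach instead models the mirror map with an input-convex neural network. The step size, coupling weights, and the test objective below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def accelerated_mirror_descent(grad_f, x0, step=0.1, iters=500):
    """Classical accelerated mirror descent with the identity mirror map.

    With psi(x) = 0.5 * ||x||^2, grad(psi) and its inverse are both the
    identity, so the dual update is an ordinary subtraction. A learned
    input-convex network would replace these maps in the paper's setting.
    """
    x = z = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        lam = 2.0 / (k + 1)              # coupling weight (momentum)
        y = lam * z + (1 - lam) * x      # interpolate primal and dual iterates
        g = grad_f(y)
        z = z - 0.5 * step * k * g       # mirror/dual step (identity map)
        x = y - step * g                 # primal gradient step
    return x

# Illustrative example: minimize f(x) = 0.5 * ||x - 1||^2 in R^2
grad_f = lambda x: x - 1.0
sol = accelerated_mirror_descent(grad_f, np.zeros(2))
```

With step size below 1/L (here L = 1), the interpolated sequence enjoys the familiar O(1/k²) accelerated rate on smooth convex objectives, compared with O(1/k) for plain mirror descent.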
