Paper Title

Relative Lipschitzness in Extragradient Methods and a Direct Recipe for Acceleration

Authors

Cohen, Michael B., Sidford, Aaron, Tian, Kevin

Abstract

We show that standard extragradient methods (i.e. mirror prox and dual extrapolation) recover optimal accelerated rates for first-order minimization of smooth convex functions. To obtain this result we provide a fine-grained characterization of the convergence rates of extragradient methods for solving monotone variational inequalities in terms of a natural condition we call relative Lipschitzness. We further generalize this framework to handle local and randomized notions of relative Lipschitzness and thereby recover rates for box-constrained $\ell_\infty$ regression based on area convexity and complexity bounds achieved by accelerated (randomized) coordinate descent for smooth convex function minimization.
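The abstract centers on extragradient methods for monotone variational inequalities. As a minimal illustration (not the paper's algorithm), the following sketch applies the classical extragradient step to the bilinear saddle-point problem $\min_x \max_y xy$, whose associated monotone operator is $g(x, y) = (y, -x)$; the step size and iteration count are illustrative choices:

```python
import numpy as np

# Hedged sketch: the classical (Euclidean) extragradient step applied to the
# monotone operator g(x, y) = (y, -x) from the bilinear saddle point
# min_x max_y x*y. Mirror prox reduces to this update in the Euclidean setup.
# The step size eta and iteration count are illustrative, not from the paper.

def operator(z):
    x, y = z
    return np.array([y, -x])  # gradient field of the saddle-point problem

def extragradient(z0, eta=0.1, iters=2000):
    z = np.array(z0, dtype=float)
    for _ in range(iters):
        z_half = z - eta * operator(z)   # extrapolation (look-ahead) step
        z = z - eta * operator(z_half)   # update using the look-ahead gradient
    return z

z_star = extragradient([1.0, 1.0])  # converges toward the saddle point (0, 0)
```

Note that plain gradient descent-ascent diverges on this bilinear problem; the look-ahead gradient evaluation is what yields convergence, which is the structural property the paper's relative Lipschitzness analysis refines.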
