Paper Title

Adaptive Gradient Methods for Constrained Convex Optimization and Variational Inequalities

Paper Authors

Alina Ene, Huy L. Nguyen, Adrian Vladu

Paper Abstract

We provide new adaptive first-order methods for constrained convex optimization. Our main algorithms AdaACSA and AdaAGD+ are accelerated methods, which are universal in the sense that they achieve nearly-optimal convergence rates for both smooth and non-smooth functions, even when they only have access to stochastic gradients. In addition, they do not require any prior knowledge on how the objective function is parametrized, since they automatically adjust their per-coordinate learning rate. These can be seen as truly accelerated Adagrad methods for constrained optimization. We complement them with a simpler algorithm AdaGrad+ which enjoys the same features, and achieves the standard non-accelerated convergence rate. We also present a set of new results involving adaptive methods for unconstrained optimization and monotone operators.
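The key mechanism the abstract describes, per-coordinate learning rates that adjust automatically from observed gradients, combined with projection to handle constraints, is the classic projected diagonal AdaGrad update that the paper's AdaGrad+ builds on. Below is a minimal illustrative sketch of that baseline idea, not the paper's exact algorithm; the function names (projected_adagrad, grad_fn, project_fn) and the box-constrained quadratic example are assumptions chosen for illustration.

```python
import numpy as np

def projected_adagrad(grad_fn, project_fn, x0, eta=1.0, eps=1e-8, num_steps=500):
    """Diagonal (per-coordinate) AdaGrad with Euclidean projection.

    grad_fn:    returns a (possibly stochastic) gradient at x
    project_fn: Euclidean projection onto the constraint set
    """
    x = x0.astype(float).copy()
    g_sq = np.zeros_like(x)  # running sum of squared gradients, per coordinate
    for _ in range(num_steps):
        g = grad_fn(x)
        g_sq += g ** 2
        # Step size shrinks independently in each coordinate where past
        # gradients have been large, so no tuning to the parametrization
        # of the objective is needed.
        x = project_fn(x - eta * g / (np.sqrt(g_sq) + eps))
    return x

# Example: minimize ||x - c||^2 over the box [0, 1]^d.
c = np.array([0.3, 1.7, -0.5])
x_star = projected_adagrad(
    grad_fn=lambda x: 2 * (x - c),          # gradient of the quadratic
    project_fn=lambda x: np.clip(x, 0.0, 1.0),  # projection onto the box
    x0=np.zeros(3),
)
print(x_star)  # approaches [0.3, 1.0, 0.0], the projection of c onto the box
```

This baseline attains the standard non-accelerated rate; the paper's contribution is that AdaACSA and AdaAGD+ achieve nearly-optimal accelerated rates while keeping the same adaptivity.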
