Paper Title

Accelerated Convex Optimization with Stochastic Gradients: Generalizing the Strong-Growth Condition

Paper Authors

Víctor Valls, Shiqiang Wang, Yuang Jiang, Leandros Tassiulas

Abstract

This paper presents a sufficient condition for stochastic gradients not to slow down the convergence of Nesterov's accelerated gradient method. The new condition has the strong-growth condition by Schmidt & Roux as a special case, and it also allows us to (i) model problems with constraints and (ii) design new types of oracles (e.g., oracles for finite-sum problems such as SAGA). Our results are obtained by revisiting Nesterov's accelerated algorithm and are useful for designing stochastic oracles without changing the underlying first-order method.
