Paper Title

Quadratic minimization: from conjugate gradient to an adaptive Heavy-ball method with Polyak step-sizes

Authors

Goujaud, Baptiste, Taylor, Adrien, Dieuleveut, Aymeric

Abstract

In this work, we propose an adaptive variation on the classical Heavy-ball method for convex quadratic minimization. The adaptivity crucially relies on so-called "Polyak step-sizes", which consists in using the knowledge of the optimal value of the optimization problem at hand instead of problem parameters such as a few eigenvalues of the Hessian of the problem. This method happens to also be equivalent to a variation of the classical conjugate gradient method, and thereby inherits many of its attractive features, including its finite-time convergence, instance optimality, and its worst-case convergence rates. The classical gradient method with Polyak step-sizes is known to behave very well in situations in which it can be used, and the question of whether incorporating momentum in this method is possible and can improve the method itself appeared to be open. We provide a definitive answer to this question for minimizing convex quadratic functions, an arguably necessary first step for developing such methods in more general setups.
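The abstract contrasts the proposed adaptive Heavy-ball method with the classical gradient method with Polyak step-sizes. As a point of reference, here is a minimal sketch of that classical baseline on a convex quadratic, using the standard Polyak rule gamma_t = (f(x_t) - f*) / ||grad f(x_t)||^2. The function name, the test matrix `H`, and the starting point are illustrative choices, not from the paper; the paper's momentum (Heavy-ball) variant is not reproduced here, since the abstract does not state its update rule.

```python
import numpy as np

def polyak_gradient_descent(H, x0, f_star=0.0, iters=100):
    """Gradient descent with Polyak step-sizes on f(x) = 0.5 * x^T H x.

    The step-size gamma_t = (f(x_t) - f_star) / ||grad f(x_t)||^2 uses the
    optimal value f_star instead of eigenvalue bounds on the Hessian H.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        grad = H @ x                 # gradient of the quadratic
        gnorm2 = grad @ grad
        if gnorm2 == 0.0:            # exactly at the optimum: stop
            break
        f_val = 0.5 * x @ H @ x
        x = x - (f_val - f_star) / gnorm2 * grad
    return x

# Illustrative run on a hypothetical ill-conditioned 2-D quadratic
# (eigenvalues 1 and 10), whose optimal value is f* = 0.
H = np.diag([1.0, 10.0])
x = polyak_gradient_descent(H, [5.0, 5.0])
print(0.5 * x @ H @ x)  # close to the optimal value 0
```

Note that, unlike a fixed step-size of 1/L, this rule adapts automatically at each iteration; the paper's contribution is to add momentum to this scheme while keeping that adaptivity.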
