Title


Conditions for linear convergence of the gradient method for non-convex optimization

Authors

Abbaszadehpeivasti, Hadi, de Klerk, Etienne, Zamani, Moslem

Abstract


In this paper, we derive a new linear convergence rate for the gradient method with fixed step lengths for non-convex smooth optimization problems satisfying the Polyak-Łojasiewicz (PL) inequality. We establish that the PL inequality is a necessary and sufficient condition for linear convergence to the optimal value for this class of problems. We list some related classes of functions for which the gradient method may enjoy a linear convergence rate. Moreover, we investigate their relationship with the PL inequality.
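The abstract's central mechanism can be sketched concretely. For an L-smooth function, the PL inequality ½‖∇f(x)‖² ≥ μ(f(x) − f*) combined with the fixed step length 1/L yields f(x_{k+1}) − f* ≤ (1 − μ/L)(f(x_k) − f*), i.e. linear convergence of function values without convexity. A minimal sketch, using the non-convex PL function f(x) = x² + 3 sin²(x); this example and its constants (L = 8, μ = 1/32) are taken from the PL literature (Karimi, Nutini and Schmidt), not from this paper:

```python
import math

def f(x):
    # Non-convex but PL: f(x) = x^2 + 3 sin^2(x), minimizer x* = 0, f* = 0
    return x * x + 3 * math.sin(x) ** 2

def grad(x):
    return 2 * x + 3 * math.sin(2 * x)

L = 8.0      # smoothness constant: |f''(x)| = |2 + 6 cos(2x)| <= 8
mu = 1 / 32  # PL constant reported for this example in the literature

x = 3.0
vals = [f(x)]
for _ in range(200):
    x -= grad(x) / L          # gradient step with fixed step length 1/L
    vals.append(f(x))

# Linear convergence: each step contracts the optimality gap f(x_k) - f*.
# (Ratios near machine precision are filtered out to avoid 0/0 noise.)
ratios = [b / a for a, b in zip(vals, vals[1:]) if a > 1e-12]
print(vals[-1], max(ratios))
```

In practice the observed contraction factor is far better than the worst-case bound 1 − μ/L ≈ 0.996, since that guarantee is global while the local curvature near the minimizer is much more favorable.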
