Paper Title

An Algebraically Converging Stochastic Gradient Descent Algorithm for Global Optimization

Authors

Björn Engquist, Kui Ren, Yunan Yang

Abstract

We propose a new gradient descent algorithm with added stochastic terms for finding the global optimizers of nonconvex optimization problems. A key component of the algorithm is the adaptive tuning of the randomness based on the value of the objective function. In the language of simulated annealing, the temperature is state-dependent. With this, we prove global convergence of the algorithm with an algebraic rate, both in probability and in the parameter space. This is a significant improvement over the classical rate obtained with a more straightforward control of the noise term. The convergence proof is based on the actual discrete setup of the algorithm, not just its continuous limit as is often done in the literature. We also present several numerical examples to demonstrate the efficiency and robustness of the algorithm for reasonably complex objective functions.
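
The abstract's central mechanism is a gradient descent iteration with injected noise whose amplitude is adapted to the current objective value (a state-dependent temperature, in simulated-annealing terms). Below is a minimal sketch of that idea in Python; the function name `adaptive_sgd`, the coupling of the noise level to `sqrt(f(x) - f_min)`, and all parameter choices are illustrative assumptions, not the authors' exact scheme from the paper.

```python
import numpy as np

def adaptive_sgd(grad, f, x0, lr=0.01, n_iter=20000, noise_scale=1.0,
                 f_min=0.0, rng=None):
    """Gradient descent with state-dependent noise (illustrative sketch).

    The injected randomness shrinks as f(x) approaches f_min, the
    (assumed known) global minimum value, so the iterate explores while
    the objective is high and settles once it is low.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        # State-dependent "temperature": a larger gap f(x) - f_min gives
        # stronger noise (more exploration), a smaller gap gives less.
        sigma = noise_scale * np.sqrt(max(f(x) - f_min, 0.0))
        xi = rng.standard_normal(x.shape)
        # Euler-Maruyama-style update: gradient drift plus scaled noise.
        x = x - lr * grad(x) + np.sqrt(lr) * sigma * xi
    return x

# Usage on a simple nonconvex test function with global minimum 0 at x = 0
# and many local minima (an illustrative choice, not from the paper).
f = lambda x: 0.1 * x[0] ** 2 + 1.0 - np.cos(3.0 * x[0])
grad = lambda x: np.array([0.2 * x[0] + 3.0 * np.sin(3.0 * x[0])])
x_final = adaptive_sgd(grad, f, x0=[3.0], f_min=0.0)
print(x_final)  # typically lands near the global minimizer x = 0
```

Note that this sketch assumes the global minimum value `f_min` is known (taken as 0 above); tying the temperature to the gap `f(x) - f_min` is what lets the noise vanish exactly at the global optimizer rather than on a fixed schedule.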
