Paper Title

Anderson acceleration of gradient methods with energy for optimization problems

Paper Authors

Liu, Hailiang, He, Jia-Hao, Tian, Xuping

Paper Abstract

Anderson acceleration (AA), an efficient technique for speeding up the convergence of fixed-point iterations, can be adapted to accelerate optimization methods. We propose a novel optimization algorithm by applying Anderson acceleration to the energy adaptive gradient method (AEGD) [arXiv:2010.05109]. The feasibility of our algorithm is examined in light of the convergence results for AEGD, even though AEGD is not a fixed-point iteration. We also quantify the accelerated convergence rate of AA applied to gradient descent by the factor of the gain at each Anderson mixing step. Our experimental results show that the proposed algorithm requires little tuning of hyperparameters and exhibits superior convergence speed.
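
For context, the Anderson mixing step that the abstract refers to can be written in its textbook form (this is the standard formulation of AA, not necessarily the exact variant used in the paper). Given a fixed-point map $g$ with residuals $f_i = g(x_i) - x_i$ and memory size $m$, AA replaces the plain update $x_{k+1} = g(x_k)$ by a least-squares combination of recent iterates:

$$
\alpha^k = \arg\min_{\alpha \in \mathbb{R}^{m_k+1}} \Big\| \sum_{j=0}^{m_k} \alpha_j f_{k-m_k+j} \Big\|_2 \ \ \text{s.t.}\ \sum_{j=0}^{m_k} \alpha_j = 1, \qquad
x_{k+1} = \sum_{j=0}^{m_k} \alpha_j\, g(x_{k-m_k+j}),
$$

where $m_k = \min(m, k)$. Below is a minimal Python sketch of this idea applied to the gradient-descent map $g(x) = x - \eta\,\nabla f(x)$. It illustrates plain AA on gradient descent only; the paper's algorithm (AA adapted to AEGD) additionally carries AEGD's energy variable, which is not reproduced here. The function name `anderson_gd` and all parameter defaults are illustrative choices, not taken from the paper.

```python
import numpy as np

def anderson_gd(grad, x0, lr=0.01, m=5, tol=1e-10, max_iter=1000):
    """Anderson acceleration of the gradient-descent map g(x) = x - lr*grad(x).

    Uses the unconstrained difference form of the Anderson mixing
    least-squares problem (equivalent to the constrained form above).
    """
    xs = [np.asarray(x0, dtype=float)]
    gs = [xs[0] - lr * grad(xs[0])]          # g(x_0)
    fs = [gs[0] - xs[0]]                     # residual f_0 = g(x_0) - x_0
    xs.append(gs[0])                         # x_1 = g(x_0): one plain GD step
    for k in range(1, max_iter):
        gs.append(xs[k] - lr * grad(xs[k]))  # g(x_k)
        fs.append(gs[k] - xs[k])             # f_k
        if np.linalg.norm(fs[k]) < tol:      # near a fixed point: stop
            break
        mk = min(m, k)                       # current memory window
        # Columns are consecutive differences of residuals / map values.
        dF = np.column_stack([fs[i + 1] - fs[i] for i in range(k - mk, k)])
        dG = np.column_stack([gs[i + 1] - gs[i] for i in range(k - mk, k)])
        gamma, *_ = np.linalg.lstsq(dF, fs[k], rcond=None)
        xs.append(gs[k] - dG @ gamma)        # Anderson-mixed iterate x_{k+1}
    return xs[-1]

# Example: an ill-conditioned quadratic, where plain GD is slow.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x = anderson_gd(lambda z: A @ z - b, np.zeros(3), lr=0.01, m=3)
print(np.allclose(A @ x, b, atol=1e-6))      # True
```

On problems like this, the mixing step typically reduces the iteration count substantially relative to plain gradient descent with the same step size; the per-step improvement is the "gain" that the abstract quantifies.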
