Paper Title

Convergence properties of an Objective-Function-Free Optimization regularization algorithm, including an $\mathcal{O}(ε^{-3/2})$ complexity bound

Authors

Gratton, S., Jerad, S., Toint, Ph. L.

Abstract

An adaptive regularization algorithm for unconstrained nonconvex optimization is presented in which the objective function is never evaluated; only derivatives are used. This algorithm belongs to the class of adaptive regularization methods, for which optimal worst-case complexity results are known in the standard framework where the objective function is evaluated. It is shown in this paper that these excellent complexity bounds remain valid for the new algorithm, despite the fact that significantly less information is used. In particular, it is shown that, if derivatives of degree one to $p$ are used, the algorithm will find an $ε_1$-approximate first-order minimizer in at most $O(ε_1^{-(p+1)/p})$ iterations, and an $(ε_1,ε_2)$-approximate second-order minimizer in at most $O(\max[ε_1^{-(p+1)/p},ε_2^{-(p+1)/(p-1)}])$ iterations. As a special case, the new algorithm using first and second derivatives, when applied to functions with Lipschitz continuous Hessian, will find an iterate $x_k$ at which the gradient's norm is less than $ε_1$ in at most $O(ε_1^{-3/2})$ iterations.
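As a concrete (and deliberately simplified) illustration of the objective-function-free idea for $p=2$, the sketch below runs an AR2-type iteration on a one-dimensional nonconvex function: the objective value $f$ is never computed, every step is accepted, and the regularization weight $σ_k$ is driven by accumulated gradient norms. The $σ$-update rule and the names `offo_ar2` and `cubic_step` are illustrative assumptions for this sketch, not the exact algorithm or update analysed in the paper.

```python
import numpy as np

# Nonconvex test problem: f(x) = x^4/4 - x^2/2, minimizers at x = ±1.
# Note f itself is never evaluated anywhere below.
def grad(x):
    return x**3 - x

def hess(x):
    return 3.0 * x**2 - 1.0

def cubic_step(g, h, sigma):
    """Closed-form stationary point, opposing the gradient, of the 1-D
    cubic-regularized model m(s) = g*s + 0.5*h*s^2 + (sigma/3)*|s|^3."""
    if g == 0.0:
        return 0.0
    disc = np.sqrt(h * h + 4.0 * sigma * abs(g))  # disc >= |h|, so step != 0
    return -np.sign(g) * (disc - h) / (2.0 * sigma)

def offo_ar2(x0, sigma0=1.0, tol=1e-8, max_iter=200):
    """Objective-function-free AR2-style loop: every step is accepted,
    sigma_k grows from past gradient norms (illustrative rule only)."""
    x, sum_g2 = x0, 0.0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) <= tol:  # epsilon_1-approximate first-order point reached
            break
        sum_g2 += g * g
        # Assumed AdaGrad-flavoured regularization update, not the paper's rule.
        sigma = sigma0 * np.sqrt(1.0 + sum_g2)
        x += cubic_step(g, hess(x), sigma)
    return x

x_star = offo_ar2(2.0)  # converges to the nearby minimizer x = 1
```

Because no function values are available, the usual accept/reject test of adaptive regularization is replaced here by unconditional acceptance, with $σ_k$ kept large enough through the gradient-history rule; this is the structural feature that the paper's complexity analysis addresses.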
