Paper Title

Explicit Second-Order Min-Max Optimization Methods with Optimal Convergence Guarantee

Authors

Tianyi Lin, Panayotis Mertikopoulos, Michael I. Jordan

Abstract

We propose and analyze several inexact regularized Newton-type methods for finding a global saddle point of \emph{convex-concave} unconstrained min-max optimization problems. Compared to first-order methods, our understanding of second-order methods for min-max optimization is relatively limited, as obtaining global rates of convergence with second-order information is much more involved. In this paper, we examine how second-order information can be used to speed up extra-gradient methods, even under inexactness. Specifically, we show that the proposed methods generate iterates that remain within a bounded set and that the averaged iterates converge to an $ε$-saddle point within $O(ε^{-2/3})$ iterations in terms of a restricted gap function. This matches the theoretically established lower bound in this context. We also provide a simple routine for solving the subproblem at each iteration, requiring a single Schur decomposition and $O(\log\log(1/ε))$ calls to a linear system solver in a quasi-upper-triangular system. Thus, our method improves on existing line-search-based second-order min-max optimization methods by shaving off an $O(\log\log(1/ε))$ factor in the required number of Schur decompositions. Finally, we present numerical experiments on synthetic and real data that demonstrate the efficiency of the proposed methods.
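For context, the base scheme the paper accelerates with second-order information is the classical extra-gradient method. Below is a minimal first-order sketch of that scheme on a toy bilinear saddle problem; it is a generic illustration, not the authors' inexact Newton-type method, and the step size, iteration count, and test problem are arbitrary choices:

```python
import numpy as np

def extragradient(F, z0, eta=0.5, iters=200):
    """Plain first-order extra-gradient for a monotone operator F.

    The paper's methods replace the extrapolation step with an (inexact)
    second-order model; this sketch shows only the underlying base scheme.
    """
    z = z0.astype(float).copy()
    avg = np.zeros_like(z)
    for _ in range(iters):
        z_half = z - eta * F(z)      # extrapolation (look-ahead) step
        z = z - eta * F(z_half)      # update using the operator at the midpoint
        avg += z
    return z, avg / iters            # last iterate and averaged iterate

# Toy bilinear saddle problem min_x max_y x*y, whose monotone operator
# is F(x, y) = (∂f/∂x, -∂f/∂y) = (y, -x); the unique saddle point is (0, 0).
F = lambda z: np.array([z[1], -z[0]])
z_last, z_avg = extragradient(F, np.array([1.0, 1.0]))
```

The averaged iterate is the one for which the paper's $O(ε^{-2/3})$ gap-function guarantee is stated; on this toy problem both the last and the averaged iterate approach the saddle point $(0, 0)$.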
