Paper Title
An efficient nonconvex reformulation of stagewise convex optimization problems
Paper Authors
Paper Abstract
Convex optimization problems with staged structure appear in several contexts, including optimal control, verification of deep neural networks, and isotonic regression. Off-the-shelf solvers can solve these problems but may scale poorly. We develop a nonconvex reformulation designed to exploit this staged structure. Our reformulation has only simple bound constraints, enabling solution via projected gradient descent (PGD) and its accelerated variants. The method automatically generates a sequence of primal and dual feasible solutions to the original convex problem, making optimality certification easy. We establish theoretical properties of the nonconvex formulation, showing that it is (almost) free of spurious local minima and has the same global optimum as the convex problem. We modify PGD to avoid spurious local minimizers so that it always converges to the global minimizer. For neural network verification, our approach obtains small duality gaps in only a few gradient steps. Consequently, it solves large-scale verification problems faster than both off-the-shelf and specialized solvers.
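The abstract's key computational point is that a problem with only simple bound (box) constraints can be optimized by projected gradient descent, where the projection step is just elementwise clipping. The sketch below illustrates this generic PGD pattern on a toy box-constrained quadratic; it is not the paper's reformulation, and the objective, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def projected_gradient_descent(grad, x0, lo, hi, lr=0.1, steps=200):
    """Minimize a smooth function over the box [lo, hi] via PGD.

    Each iteration takes a gradient step, then projects back onto the
    feasible set; for a box, the projection is elementwise clipping.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = np.clip(x - lr * grad(x), lo, hi)
    return x

# Toy stand-in objective: f(x) = ||x - c||^2 with unconstrained minimizer c
# outside the box, so the constrained optimum lies on the boundary.
c = np.array([2.0, -3.0])
x_star = projected_gradient_descent(lambda x: 2.0 * (x - c),
                                    x0=np.zeros(2), lo=-1.0, hi=1.0)
```

For this toy problem the iterates converge to the boundary point `[1, -1]`. Accelerated variants (as mentioned in the abstract) would add a momentum term between the gradient step and the projection.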