Paper Title

Uncertainty quantification for subgradient descent, with applications to relaxations of discrete problems

Paper Authors

Conor McMeel, Panos Parpas

Paper Abstract

We consider the problem of minimizing a convex function that depends on an uncertain parameter $\theta$. The uncertainty in the objective function means that the optimum, $x^*(\theta)$, is also a function of $\theta$. We propose an efficient method to compute $x^*(\theta)$ and its statistics. We use a chaos expansion of $x^*(\theta)$ along a truncated basis and study a restarted subgradient method that computes the optimal coefficients. We establish the convergence rate of the method as the number of basis functions, and hence the dimensionality of the optimization problem, increases. We give a non-asymptotic convergence rate for subgradient descent, building on earlier work that looked at gradient and accelerated gradient descent. Additionally, this work explicitly deals with the issue of projections and suggests a method to handle non-trivial projections. We show how the algorithm can be used to quantify uncertainty in discrete problems by utilising the (convex) Lovász extension of the min s,t-cut graph problem.
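The abstract describes approximating $x^*(\theta)$ by a chaos expansion $x^*(\theta) \approx \sum_k c_k \phi_k(\theta)$ along a truncated basis and fitting the coefficients with a restarted subgradient method. The paper's own algorithm and restart schedule are not reproduced here; the sketch below is a minimal Python illustration under assumed choices: a toy nonsmooth objective $f(x, \theta) = |x - \theta|$ with $\theta \sim \mathcal{N}(0,1)$, a Hermite polynomial basis, a sample-average approximation of the expectation, and a generic halve-the-step restart scheme.

```python
import numpy as np

# Illustrative (assumed) setup: f(x, theta) = |x - theta| with theta ~ N(0, 1),
# so the true optimum is x*(theta) = theta. Hermite basis: 1, theta, theta^2 - 1.
def hermite_basis(theta):
    return np.stack([np.ones_like(theta), theta, theta**2 - 1])  # shape (K, N)

rng = np.random.default_rng(0)
thetas = rng.standard_normal(2000)        # Monte Carlo samples of theta
Phi = hermite_basis(thetas)

def subgrad(c):
    # Subgradient of the sample average of |c . phi(theta) - theta| w.r.t. c.
    x = c @ Phi
    return Phi @ np.sign(x - thetas) / len(thetas)

c = np.zeros(Phi.shape[0])
best_c, best_val, step = c.copy(), np.inf, 1.0
for restart in range(5):                  # generic restart scheme: resume from
    for t in range(200):                  # the incumbent with a halved step
        c -= step / np.sqrt(t + 1) * subgrad(c)
        val = np.mean(np.abs(c @ Phi - thetas))
        if val < best_val:
            best_val, best_c = val, c.copy()
    c, step = best_c.copy(), step / 2

print(best_c)  # should approach (0, 1, 0), i.e. x*(theta) = theta
```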
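For the discrete application the abstract mentions, the cut function of an undirected graph is submodular, and its Lovász extension has the known closed form $\sum_{(i,j)} w_{ij}\,|x_i - x_j|$; relaxing min s,t-cut then means minimizing this convex function over $[0,1]^V$ with $x_s = 1$ and $x_t = 0$, a non-trivial projection of the kind the abstract alludes to. The following sketch is illustrative only: the graph, weights, step sizes, and the pin-then-clip projection are assumptions, not the paper's construction.

```python
import numpy as np

# Assumed toy graph: undirected edges (i, j, weight); the min s,t-cut here is
# S = {s} with value 1.5. The Lovász extension of the cut function is the
# total-variation form sum_ij w_ij * |x_i - x_j|.
edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 0.5)]
n, s, t = 3, 0, 2

def lovasz_cut(x):
    return sum(w * abs(x[i] - x[j]) for i, j, w in edges)

def subgrad(x):
    g = np.zeros(n)
    for i, j, w in edges:
        sgn = np.sign(x[i] - x[j])
        g[i] += w * sgn
        g[j] -= w * sgn
    return g

x = np.full(n, 0.5)
for k in range(500):
    x -= 0.05 / np.sqrt(k + 1) * subgrad(x)
    x = np.clip(x, 0.0, 1.0)   # project onto [0, 1]^V ...
    x[s], x[t] = 1.0, 0.0      # ... keeping s and t pinned

rounded = (x >= 0.5).astype(float)   # threshold to recover an integral cut
print(rounded, lovasz_cut(rounded))  # expect [1, 0, 0] and cut value 1.5
```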
