Paper Title

Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

Authors

Peng, Liangzu, Kümmerle, Christian, Vidal, René

Abstract

We advance both the theory and practice of robust $\ell_p$-quasinorm regression for $p \in (0,1]$ by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying non-smooth problem. In the convex case, $p=1$, we prove that this IRLS variant converges globally at a linear rate under a mild, deterministic condition on the feature matrix called the \textit{stable range space property}. In the non-convex case, $p\in(0,1)$, we prove that under a similar condition, IRLS converges locally to the global minimizer at a superlinear rate of order $2-p$; the rate becomes quadratic as $p\to 0$. We showcase the proposed methods in three applications: real phase retrieval, regression without correspondences, and robust face restoration. The results show that (1) IRLS can handle a larger number of outliers than other methods, (2) it is faster than competing methods at the same level of accuracy, and (3) it restores a sparsely corrupted face image with satisfactory visual quality.

Code: https://github.com/liangzu/IRLS-NeurIPS2022
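To make the abstract's setting concrete, here is a minimal textbook-style IRLS sketch for $\ell_p$-quasinorm regression, $\min_b \|y - Xb\|_p^p$: each iteration solves a weighted least-squares problem with weights $w_i = (r_i^2 + \epsilon^2)^{p/2-1}$ and shrinks the smoothing parameter $\epsilon$. This is a generic illustration, not the paper's specific variant; the smoothing-update rule and the stopping criterion below are assumptions for the sketch.

```python
import numpy as np

def irls_lp(X, y, p=1.0, n_iter=50, eps=1.0):
    """Minimize ||y - X b||_p^p by iteratively reweighted least squares.

    Generic epsilon-smoothed IRLS sketch; the paper's variants differ
    in how the smoothing parameter is updated and in their guarantees.
    """
    n, d = X.shape
    b = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary LS initialization
    for _ in range(n_iter):
        r = y - X @ b
        # smoothed lp weights: w_i = (r_i^2 + eps^2)^(p/2 - 1);
        # small residuals get large weight, outliers get down-weighted
        w = (r**2 + eps**2) ** (p / 2 - 1)
        # weighted least-squares step: solve X^T W X b = X^T W y
        Xw = X * w[:, None]
        b = np.linalg.solve(X.T @ Xw + 1e-12 * np.eye(d), Xw.T @ y)
        eps = max(eps * 0.5, 1e-10)  # shrink smoothing toward the non-smooth loss
    return b

# toy example: recover a regression vector under sparse gross outliers
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
b_true = rng.standard_normal(5)
y = X @ b_true
y[:10] += 10 * rng.standard_normal(10)  # corrupt 10% of the responses
b_hat = irls_lp(X, y, p=1.0)
print(np.max(np.abs(b_hat - b_true)))  # small: outliers are rejected
```

With $p=1$ the weighted subproblems are well-conditioned and the outliers' influence decays as $\epsilon \to 0$, which is the regime where the paper proves global linear convergence under the stable range space property.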
