Paper Title

Convergence bounds for empirical nonlinear least-squares

Authors

Martin Eigel, Reinhold Schneider, Philipp Trunschke

Abstract

We consider best approximation problems in a nonlinear subset $\mathcal{M}$ of a Banach space of functions $(\mathcal{V},\|\bullet\|)$. The norm is assumed to be a generalization of the $L^2$-norm for which only a weighted Monte Carlo estimate $\|\bullet\|_n$ can be computed. The objective is to obtain an approximation $v\in\mathcal{M}$ of an unknown function $u \in \mathcal{V}$ by minimizing the empirical norm $\|u-v\|_n$. We consider this problem for general nonlinear subsets and establish error bounds for the empirical best approximation error. Our results are based on a restricted isometry property (RIP) which holds in probability and is independent of the nonlinear least squares setting. Several model classes are examined where analytical statements can be made about the RIP and the results are compared to existing sample complexity bounds from the literature. We find that for well-studied model classes our general bound is weaker but exhibits many of the same properties as these specialized bounds. Notably, we demonstrate the advantage of an optimal sampling density (as known for linear spaces) for sets of functions with sparse representations.
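The core setup of the abstract can be illustrated with a small numerical sketch: the exact norm $\|u-v\|$ is replaced by a weighted Monte Carlo estimate $\|u-v\|_n$ over $n$ sample points, and this empirical norm is minimized over a nonlinear model class $\mathcal{M}$. The example below is not from the paper; the target function, the model class $\{v(x) = a\sin(bx)\}$ (nonlinear in the parameter $b$), the uniform weights, and the grid-search minimizer are all illustrative assumptions chosen to keep the code self-contained.

```python
import math
import random

# Unknown target function u (assumed for illustration).
def u(x):
    return math.sin(2.0 * x)

# Nonlinear model class M = { v(x; a, b) = a * sin(b * x) }:
# linear in a, but nonlinear in the frequency parameter b.
def v(x, a, b):
    return a * math.sin(b * x)

# Draw n Monte Carlo sample points; with the uniform sampling
# density used here, all weights are 1 (illustrative choice).
random.seed(0)
n = 200
xs = [random.uniform(0.0, math.pi) for _ in range(n)]
weights = [1.0] * n

def emp_norm_sq(f):
    """Weighted empirical norm ||f||_n^2 = (1/n) * sum_i w_i f(x_i)^2."""
    return sum(w * f(x) ** 2 for x, w in zip(xs, weights)) / n

# Minimize the empirical error ||u - v||_n^2 over M by a coarse
# grid search in (a, b) -- a stand-in for a real NLLS solver.
best_err, best_a, best_b = math.inf, None, None
for i in range(11):
    for j in range(11):
        a = 0.5 + 0.1 * i
        b = 1.5 + 0.1 * j
        err = emp_norm_sq(lambda x: u(x) - v(x, a, b))
        if err < best_err:
            best_err, best_a, best_b = err, a, b

print(best_a, best_b, best_err)  # minimizer of the empirical norm
```

Since the grid contains the true parameters $(a, b) = (1, 2)$, the empirical best approximation error vanishes here; the paper's bounds quantify how close such an empirical minimizer stays to the true best approximation when only the sampled norm $\|\bullet\|_n$ is available.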
