Paper Title

Local convergence rates of the nonparametric least squares estimator with applications to transfer learning

Paper Authors

Johannes Schmidt-Hieber, Petr Zamolodtchikov

Paper Abstract

Convergence properties of empirical risk minimizers can be conveniently expressed in terms of the associated population risk. To derive bounds for the performance of the estimator under covariate shift, however, pointwise convergence rates are required. Under weak assumptions on the design distribution, it is shown that least squares estimators (LSE) over 1-Lipschitz functions are also minimax rate optimal with respect to a weighted uniform norm, where the weighting accounts in a natural way for the non-uniformity of the design distribution. This implies that although least squares is a global criterion, the LSE adapts locally to the size of the design density. We develop a new indirect proof technique that establishes the local convergence behavior based on a carefully chosen local perturbation of the LSE. The obtained local rates are then applied to analyze the LSE for transfer learning under covariate shift.
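As a concrete illustration (not taken from the paper), the least squares estimator over 1-Lipschitz functions can be computed on a finite design by minimizing the sum of squared residuals subject to linear slope constraints between neighboring design points. The sketch below uses a hand-picked toy design and SciPy's SLSQP solver; the function names, the noise level, and the design density are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch of the LSE over 1-Lipschitz functions on a finite design:
# fit values f_i at sorted design points x_i by minimizing sum (f_i - y_i)^2
# subject to |f_{i+1} - f_i| <= x_{i+1} - x_i (the 1-Lipschitz constraint).
rng = np.random.default_rng(0)
n = 40
x = np.sort(rng.uniform(0.0, 1.0, n))   # non-uniform design (illustrative)
f0 = np.abs(x - 0.5)                    # true 1-Lipschitz regression function
y = f0 + 0.1 * rng.standard_normal(n)   # noisy observations

dx = np.diff(x)

def objective(f):
    # Empirical least squares criterion.
    return np.sum((f - y) ** 2)

# Each Lipschitz constraint |f_{i+1} - f_i| <= dx_i splits into two
# linear inequalities of the form g(f) >= 0, as SLSQP expects.
cons = (
    [{"type": "ineq", "fun": lambda f, i=i: dx[i] - (f[i + 1] - f[i])}
     for i in range(n - 1)]
    + [{"type": "ineq", "fun": lambda f, i=i: dx[i] + (f[i + 1] - f[i])}
       for i in range(n - 1)]
)

res = minimize(objective, y.copy(), method="SLSQP", constraints=cons,
               options={"maxiter": 500})
fhat = res.x

# Largest slope magnitude of the fit relative to the Lipschitz bound.
print(np.max(np.abs(np.diff(fhat)) / dx))
```

Since the 1-Lipschitz class is convex and contains the vector `(f0(x_i))`, the fitted values are a Euclidean projection of `y` onto that set, so their squared distance to the truth can never exceed that of the raw observations; this denoising-by-projection effect is the finite-sample face of the global criterion the abstract refers to.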
