Paper Title

Sharp Asymptotics of Kernel Ridge Regression Beyond the Linear Regime

Paper Authors

Hong Hu, Yue M. Lu

Paper Abstract

The generalization performance of kernel ridge regression (KRR) exhibits a multi-phased pattern that crucially depends on the scaling relationship between the sample size $n$ and the underlying dimension $d$. This phenomenon is due to the fact that KRR sequentially learns functions of increasing complexity as the sample size increases; when $d^{k-1}\ll n\ll d^{k}$, only polynomials with degree less than $k$ are learned. In this paper, we present a sharp asymptotic characterization of the performance of KRR at the critical transition regions with $n \asymp d^k$, for $k\in\mathbb{Z}^{+}$. Our asymptotic characterization provides a precise picture of the whole learning process and clarifies the impact of various parameters (including the choice of the kernel function) on the generalization performance. In particular, we show that the learning curves of KRR can have a delicate "double descent" behavior due to specific bias-variance trade-offs at different polynomial scaling regimes.
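As a rough, self-contained illustration of the setting described in the abstract (not the authors' experiments or their asymptotic theory), the Python sketch below fits kernel ridge regression with an inner-product kernel on data drawn from a sphere in $\mathbb{R}^d$ and reports the test error as $n$ grows from order $d$ toward order $d^2$. The kernel function, target function, ridge parameter, and dimensions are illustrative assumptions.

```python
# A minimal illustrative sketch (not the authors' code or experiments): kernel ridge
# regression with an inner-product kernel on d-dimensional spherical data, sweeping
# the sample size n across the polynomial scalings n ~ d^k discussed in the abstract.
# The kernel, target function, ridge parameter, and dimensions are all assumptions
# chosen only to make the example self-contained.
import numpy as np


def sample_sphere(n, d, rng):
    """Draw n points uniformly from the sphere of radius sqrt(d) in R^d."""
    x = rng.standard_normal((n, d))
    return np.sqrt(d) * x / np.linalg.norm(x, axis=1, keepdims=True)


def kernel(X, Z):
    """Inner-product kernel K(x, z) = exp(<x, z> / d) (an arbitrary smooth choice)."""
    return np.exp(X @ Z.T / X.shape[1])


def target(X):
    """Hypothetical target with degree-1, degree-2, and degree-3 components."""
    return X[:, 0] + 0.5 * X[:, 0] * X[:, 1] + 0.1 * X[:, 0] * X[:, 1] * X[:, 2]


def krr_test_error(n, d, lam=1e-3, n_test=2000, noise=0.1, seed=0):
    """Fit KRR on n noisy samples in dimension d and return the test mean-squared error."""
    rng = np.random.default_rng(seed)
    X, X_test = sample_sphere(n, d, rng), sample_sphere(n_test, d, rng)
    y = target(X) + noise * rng.standard_normal(n)
    K = kernel(X, X)
    alpha = np.linalg.solve(K + lam * np.eye(n), y)  # (K + lam*I)^{-1} y
    y_pred = kernel(X_test, X) @ alpha
    return float(np.mean((y_pred - target(X_test)) ** 2))


if __name__ == "__main__":
    d = 50
    # Sweep n from order d toward order d^2; per the abstract, when d << n << d^2
    # only polynomials of degree less than 2 (the linear part) are learned.
    for n in [d, 2 * d, 5 * d, d * d // 2, d * d]:
        print(f"n = {n:5d}   test MSE = {krr_test_error(n, d):.4f}")
```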
