Paper Title

Neural Tangent Kernel: A Survey

Paper Authors

Eugene Golikov, Eduard Pokonechnyy, Vladimir Korviakov

Paper Abstract

A seminal work [Jacot et al., 2018] demonstrated that training a neural network under specific parameterization is equivalent to performing a particular kernel method as width goes to infinity. This equivalence opened a promising direction for applying the results of the rich literature on kernel methods to neural nets which were much harder to tackle. The present survey covers key results on kernel convergence as width goes to infinity, finite-width corrections, applications, and a discussion of the limitations of the corresponding method.
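The kernel the abstract refers to is the neural tangent kernel, $\Theta(x, x') = \langle \nabla_\theta f(x;\theta),\, \nabla_\theta f(x';\theta) \rangle$. As a rough illustration of the object the survey studies, below is a minimal sketch (not taken from the paper) that computes the empirical NTK of a toy MLP in JAX; the architecture, widths, and function names are illustrative assumptions, and the $1/\sqrt{\text{width}}$ scaling stands in for the "specific parameterization" mentioned above.

```python
# Minimal empirical-NTK sketch (illustrative; not the paper's code).
import jax
import jax.numpy as jnp

def init_mlp(key, widths):
    """Initialize an MLP with standard-normal weights (NTK parameterization)."""
    params = []
    for d_in, d_out in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        params.append(jax.random.normal(sub, (d_in, d_out)))
    return params

def mlp(params, x):
    """Forward pass; the 1/sqrt(width) factor is the NTK parameterization."""
    h = x
    for W in params[:-1]:
        h = jnp.tanh(h @ W / jnp.sqrt(W.shape[0]))
    return (h @ params[-1] / jnp.sqrt(params[-1].shape[0])).squeeze(-1)

def empirical_ntk(params, x1, x2):
    """Theta(x1, x2) = sum over parameters of <df/dtheta(x1), df/dtheta(x2)>."""
    j1 = jax.jacobian(mlp)(params, x1)  # list of per-layer Jacobians, shape (batch, d_in, d_out)
    j2 = jax.jacobian(mlp)(params, x2)
    return sum(jnp.tensordot(a, b, axes=[[1, 2], [1, 2]]) for a, b in zip(j1, j2))

key = jax.random.PRNGKey(0)
params = init_mlp(key, [3, 512, 512, 1])
x = jax.random.normal(jax.random.PRNGKey(1), (5, 3))
print(empirical_ntk(params, x, x).shape)  # (5, 5) kernel matrix
```

The Jacot et al. result says that, as the hidden widths grow, this empirical kernel converges to a deterministic limit and stays (approximately) constant during gradient-descent training, which is what makes the kernel-method correspondence possible.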
