Title


On the infinite-depth limit of finite-width neural networks

Authors

Hayou, Soufiane

Abstract


In this paper, we study the infinite-depth limit of finite-width residual neural networks with random Gaussian weights. With proper scaling, we show that by fixing the width and taking the depth to infinity, the pre-activations converge in distribution to a zero-drift diffusion process. Unlike the infinite-width limit where the pre-activation converge weakly to a Gaussian random variable, we show that the infinite-depth limit yields different distributions depending on the choice of the activation function. We document two cases where these distributions have closed-form (different) expressions. We further show an intriguing change of regime phenomenon of the post-activation norms when the width increases from 3 to 4. Lastly, we study the sequential limit infinite-depth-then-infinite-width and compare it with the more commonly studied infinite-width-then-infinite-depth limit.
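The setup described above can be illustrated with a small simulation. This is a minimal sketch, not the paper's construction: the `1/sqrt(depth)` residual-branch factor is my assumption for the "proper scaling" mentioned in the abstract, and `tanh` is an arbitrary example activation; the paper's results depend on the specific activation chosen.

```python
import numpy as np

def resnet_preactivations(width, depth, x0, rng):
    """Pre-activations of a finite-width residual network with random
    Gaussian weights: x_{l+1} = x_l + sqrt(1/depth) * W_l @ phi(x_l).

    The 1/sqrt(depth) branch scaling (an assumption here) is what makes
    the fixed-width, depth -> infinity limit behave like a zero-drift
    diffusion rather than blowing up or freezing.
    """
    x = np.array(x0, dtype=float)
    for _ in range(depth):
        # i.i.d. Gaussian weights, variance 1/width per entry.
        W = rng.normal(0.0, np.sqrt(1.0 / width), size=(width, width))
        x = x + np.sqrt(1.0 / depth) * W @ np.tanh(x)  # example activation
    return x

rng = np.random.default_rng(0)
width = 4  # fixed finite width; only depth is taken large
x0 = np.ones(width)

# As depth grows at fixed width, the distribution of the final
# pre-activation stabilizes, consistent with a diffusion limit.
for depth in (10, 100, 1000):
    samples = np.array([resnet_preactivations(width, depth, x0, rng)
                        for _ in range(200)])
    print(depth, float(np.mean(np.linalg.norm(samples, axis=1))))
```

Varying `width` between 3 and 4 in such a simulation is one way to probe the change-of-regime phenomenon in the post-activation norms that the abstract reports.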
