Paper Title

Benefits of Overparameterized Convolutional Residual Networks: Function Approximation under Smoothness Constraint

Paper Authors

Hao Liu, Minshuo Chen, Siawpeng Er, Wenjing Liao, Tong Zhang, Tuo Zhao

Paper Abstract

Overparameterized neural networks enjoy great representation power on complex data, and more importantly yield sufficiently smooth output, which is crucial to their generalization and robustness. Most existing function approximation theories suggest that with sufficiently many parameters, neural networks can well approximate certain classes of functions in terms of the function value. The neural networks themselves, however, can be highly nonsmooth. To bridge this gap, we take convolutional residual networks (ConvResNets) as an example, and prove that large ConvResNets can not only approximate a target function in terms of function value, but also exhibit sufficient first-order smoothness. Moreover, we extend our theory to approximating functions supported on a low-dimensional manifold. Our theory partially justifies the benefits of using deep and wide networks in practice. Numerical experiments on adversarial robust image classification are provided to support our theory.
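For readers unfamiliar with the architecture class the abstract refers to, the sketch below shows a minimal convolutional residual network in PyTorch. It is an illustrative assumption for orientation only (names such as ConvResBlock and the chosen widths/depths are hypothetical), not the exact construction analyzed in the paper's approximation theory.

```python
# Minimal sketch of a convolutional residual network (ConvResNet).
# Illustrative assumption only; not the architecture used in the paper's proofs.
import torch
import torch.nn as nn


class ConvResBlock(nn.Module):
    """One residual block: output = x + conv2(relu(conv1(x)))."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The skip connection carries the input through unchanged, so
        # stacking many blocks (overparameterization) need not make the
        # network's output nonsmooth.
        return x + self.conv2(self.relu(self.conv1(x)))


class ConvResNet(nn.Module):
    """A stem convolution, a stack of residual blocks, and a linear head."""

    def __init__(self, channels: int = 16, depth: int = 8, num_classes: int = 10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.blocks = nn.Sequential(*[ConvResBlock(channels) for _ in range(depth)])
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.blocks(self.stem(x))
        h = h.mean(dim=(2, 3))  # global average pooling over spatial dims
        return self.head(h)


if __name__ == "__main__":
    model = ConvResNet()
    x = torch.randn(2, 3, 32, 32)  # a batch of two 32x32 RGB images
    print(model(x).shape)  # torch.Size([2, 10])
```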
