Paper Title
Approximating smooth functions by deep neural networks with sigmoid activation function
Paper Authors
Paper Abstract
We study the approximation power of deep neural networks (DNNs) with sigmoid activation function. Recently, it was shown that DNNs approximate any $d$-dimensional, smooth function on a compact set with a rate of order $W^{-p/d}$, where $W$ is the number of nonzero weights in the network and $p$ is the smoothness of the function. Unfortunately, these rates only hold for a special class of sparsely connected DNNs. We ask whether the same approximation rate can be shown for a simpler and more general class, namely DNNs that are defined only by their width and depth. In this article we show that DNNs with fixed depth and a width of order $M^d$ achieve an approximation rate of $M^{-2p}$. As a consequence, we quantitatively characterize the approximation power of DNNs in terms of the overall number of weights $W_0$ in the network and show an approximation rate of $W_0^{-p/d}$. This more general result finally helps us to understand which network topology guarantees a given target accuracy.
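The rate $W_0^{-p/d}$ stated in the abstract can be seen to follow from the width bound by a short parameter count. This is a sketch under the natural assumption (not spelled out in the abstract itself) that the network is fully connected with a constant depth $L$, so that the weight count is dominated by the layer-to-layer matrices:

```latex
% Assumption: fixed depth L and fully connected layers of width O(M^d),
% so the total number of weights satisfies
%   W_0 = O(L \cdot (M^d)^2) = O(M^{2d}),
% i.e. M = \Omega(W_0^{1/(2d)}).
% Substituting into the approximation rate M^{-2p} gives
%   M^{-2p} = O\big((W_0^{1/(2d)})^{-2p}\big) = O\big(W_0^{-p/d}\big),
% which is the rate in terms of the overall number of weights W_0.
```

This matches the sparse-network rate $W^{-p/d}$ from the earlier work cited in the abstract, now achieved by the simpler class of networks parameterized only by width and depth.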