Paper Title

A function space analysis of finite neural networks with insights from sampling theory

Paper Author

Giryes, Raja

Abstract

This work suggests using sampling theory to analyze the function space represented by neural networks. First, it shows, under the assumption of a finite input domain, which is the common case in training neural networks, that the function space generated by multi-layer networks with non-expansive activation functions is smooth. This extends previous works, which showed results for the case of infinite-width ReLU networks. Then, under the assumption that the input is band-limited, we provide novel error bounds for univariate neural networks. We analyze both deterministic uniform and random sampling, showing the advantage of the former.
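One way to build intuition for the advantage of deterministic uniform sampling over random sampling of band-limited signals is through the conditioning of the recovery problem. The sketch below is an illustration under assumed parameters (a periodic band-limited model with bandwidth `K`, sample budget `n`), not the paper's actual construction: with uniform samples the Fourier design matrix has exactly orthogonal columns (condition number 1), while i.i.d. random samples degrade the conditioning, which loosens any least-squares recovery error bound.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 64, 8  # sample budget and bandwidth (frequencies -K..K); illustrative choices

def fourier_design(t, K):
    """Design matrix whose columns are the complex exponentials exp(2*pi*i*k*t)."""
    k = np.arange(-K, K + 1)
    return np.exp(2j * np.pi * np.outer(t, k))

# Deterministic uniform sampling: since 2K < n, the columns are exactly
# orthogonal, so the condition number is 1 (up to floating-point error).
t_uniform = np.arange(n) / n
cond_uniform = np.linalg.cond(fourier_design(t_uniform, K))

# Random i.i.d. sampling with the same budget: gaps and clusters in the
# sample locations make the columns non-orthogonal and the conditioning worse.
t_random = rng.uniform(0.0, 1.0, n)
cond_random = np.linalg.cond(fourier_design(t_random, K))

print(f"uniform cond: {cond_uniform:.3f}, random cond: {cond_random:.3f}")
```

A worse-conditioned sampling operator directly inflates the error of least-squares reconstruction, which matches the paper's conclusion that deterministic uniform sampling is preferable to random sampling.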
