Paper Title
Sparse Super-Regular Networks
Paper Authors
Paper Abstract
It has been argued by Thom and Palm that sparsely-connected neural networks (SCNs) show improved performance over fully-connected networks (FCNs). Super-regular networks (SRNs) are neural networks composed of a set of stacked sparse layers of (epsilon, delta)-super-regular pairs, and randomly permuted node order. Using the Blow-up Lemma, we prove that as a result of the individual super-regularity of each pair of layers, SRNs guarantee a number of properties that make them suitable replacements for FCNs for many tasks. These guarantees include edge uniformity across all large-enough subsets, minimum node in- and out-degree, input-output sensitivity, and the ability to embed pre-trained constructs. Indeed, SRNs have the capacity to act like FCNs, and eliminate the need for costly regularization schemes like Dropout. We show that SRNs perform similarly to X-Nets via readily reproducible experiments, and offer far greater guarantees and control over network structure.
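The layer construction described in the abstract — stacked sparse layers with guaranteed minimum in- and out-degree and randomly permuted node order — can be sketched as a connectivity mask for a masked dense layer. This is a toy illustration only, not the paper's construction: the function name and the `density` and `min_degree` parameters are assumptions, and a true (epsilon, delta)-super-regular pair also requires the edge-uniformity property across large subsets that the paper proves, which this random sketch only approximates.

```python
import numpy as np

def sparse_super_regular_mask(n_in, n_out, density=0.1, min_degree=2, seed=0):
    """Illustrative sketch (assumed helper, not from the paper): build a
    sparse bipartite connectivity mask with a guaranteed minimum in- and
    out-degree, then randomly permute the node order between layers."""
    rng = np.random.default_rng(seed)
    # Start from a random sparse bipartite graph at the target density.
    mask = (rng.random((n_in, n_out)) < density).astype(np.float32)
    # Enforce a minimum out-degree for every input node ...
    for i in range(n_in):
        while mask[i].sum() < min_degree:
            mask[i, rng.integers(n_out)] = 1.0
    # ... and a minimum in-degree for every output node.
    for j in range(n_out):
        while mask[:, j].sum() < min_degree:
            mask[rng.integers(n_in), j] = 1.0
    # Randomly permute the output node order, as SRNs do between layers.
    perm = rng.permutation(n_out)
    return mask[:, perm]

mask = sparse_super_regular_mask(64, 64)
# In a forward pass, the layer would use (weights * mask) so that only
# the sparse edges carry signal, in place of a fully-connected weight matrix.
```

Because the mask is fixed and sparse, the layer trains far fewer effective parameters than an FCN of the same width, which is the mechanism behind the claimed removal of regularizers such as Dropout.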