Paper Title


Compact Neural Networks via Stacking Designed Basic Units

Paper Authors

Weichao Lan, Yiu-ming Cheung, Juyong Jiang

Abstract


Unstructured pruning has the limitation of dealing with sparse and irregular weights. By contrast, structured pruning can eliminate this drawback, but it requires complex criteria to determine which components should be pruned. To this end, this paper presents a new method, termed TissueNet, which directly constructs compact neural networks with fewer weight parameters by independently stacking designed basic units, without requiring any additional judgement criteria. Given basic units of various architectures, they are combined and stacked in a certain form to build up compact neural networks. We formulate TissueNet in diverse popular backbones for comparison with state-of-the-art pruning methods on different benchmark datasets. Moreover, two new metrics are proposed to evaluate compression performance. Experimental results show that TissueNet can achieve comparable classification accuracy while saving up to around 80% of FLOPs and 89.7% of parameters. That is, stacking basic units provides a promising new way for network compression.
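To give a feel for why stacking compact basic units can save parameters on this scale, here is a minimal sketch. The abstract does not specify TissueNet's actual unit designs, so a depthwise-separable convolution stands in as one plausible compact building block, and the channel schedule below is an assumed toy example rather than anything from the paper:

```python
# Hypothetical illustration only: TissueNet's real basic units are not
# described in the abstract; a depthwise-separable conv is used as a
# stand-in for a "designed basic unit".

def conv_params(c_in, c_out, k):
    """Weight count of a standard k x k convolution (bias omitted)."""
    return k * k * c_in * c_out

def separable_unit_params(c_in, c_out, k):
    """Compact unit: depthwise k x k conv + 1 x 1 pointwise conv."""
    return k * k * c_in + c_in * c_out

def stacked_params(unit_fn, channels, k=3):
    """Total weights when units are stacked along a channel schedule."""
    return sum(unit_fn(ci, co, k) for ci, co in zip(channels, channels[1:]))

channels = [64, 128, 256, 256]  # assumed toy backbone schedule
dense = stacked_params(conv_params, channels)
compact = stacked_params(separable_unit_params, channels)
saving = 1 - compact / dense
print(f"dense: {dense}  compact: {compact}  saving: {saving:.1%}")
```

With this toy schedule, the stacked compact units need roughly 88.5% fewer weights than a plain convolutional stack, which is in the same ballpark as the 89.7% parameter saving the abstract reports.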
