Paper Title

Pruning Filter in Filter

Paper Authors

Fanxu Meng, Hao Cheng, Ke Li, Huixiang Luo, Xiaowei Guo, Guangming Lu, Xing Sun

Paper Abstract

Pruning has become a very powerful and effective technique to compress and accelerate modern neural networks. Existing pruning methods can be grouped into two categories: filter pruning (FP) and weight pruning (WP). FP wins at hardware compatibility but loses to WP in compression ratio. To combine the strengths of both methods, we propose to prune the filter in the filter. Specifically, we treat a filter $F \in \mathbb{R}^{C\times K\times K}$ as $K \times K$ stripes, i.e., $1\times 1$ filters $\in \mathbb{R}^{C}$; by pruning the stripes instead of the whole filter, we achieve finer granularity than traditional FP while remaining hardware friendly. We term our method SWP (\emph{Stripe-Wise Pruning}). SWP is implemented by introducing a novel learnable matrix called the Filter Skeleton, whose values reflect the shape of each filter. As some recent work has shown that the pruned architecture is more crucial than the inherited important weights, we argue that the architecture of a single filter, i.e., its shape, also matters. Through extensive experiments, we demonstrate that SWP is more effective than previous FP-based methods and achieves state-of-the-art pruning ratios on the CIFAR-10 and ImageNet datasets without an obvious accuracy drop. Code is available at https://github.com/fxmeng/Pruning-Filter-in-Filter
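To make the stripe-wise idea concrete, below is a minimal PyTorch sketch (not the authors' code, which lives in the linked repository) of how a learnable Filter Skeleton could gate the $K \times K$ stripes of a convolutional weight. The module name `StripeWiseConv`, the `threshold` parameter, and the L1 penalty weighting are illustrative assumptions for this sketch only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StripeWiseConv(nn.Module):
    """Hypothetical sketch of stripe-wise pruning with a Filter Skeleton.

    The conv weight has shape (N, C, K, K); the skeleton has shape (N, K, K),
    one learnable entry per 1x1xC stripe. Entries driven toward zero (e.g. by
    an L1 penalty) mask out their stripe, pruning "a filter in the filter".
    """

    def __init__(self, in_channels, out_channels, kernel_size, threshold=0.01):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size) * 0.01
        )
        # Filter Skeleton: one value per stripe, initialized to 1 (all stripes kept).
        self.skeleton = nn.Parameter(
            torch.ones(out_channels, kernel_size, kernel_size)
        )
        self.threshold = threshold  # assumed cutoff below which a stripe is dropped

    def forward(self, x):
        # Broadcast the skeleton over the channel dimension so each 1x1 stripe
        # of a filter is scaled by its own skeleton entry; masked stripes vanish.
        mask = (self.skeleton.abs() > self.threshold).float()
        scaled_weight = self.weight * (self.skeleton * mask).unsqueeze(1)
        return F.conv2d(x, scaled_weight, padding=self.weight.shape[-1] // 2)

    def l1_penalty(self):
        # Sparsity regularizer on the skeleton pushes unimportant stripes to zero.
        return self.skeleton.abs().sum()


# Usage sketch: add the skeleton penalty to the task loss during training.
layer = StripeWiseConv(3, 16, 3)
x = torch.randn(2, 3, 32, 32)
out = layer(x)
loss = out.mean() + 1e-4 * layer.l1_penalty()
loss.backward()
```

After training, stripes whose skeleton entries stay below the threshold can be removed entirely, which keeps the remaining computation regular (a set of $1\times 1$ convolutions per kept stripe) and therefore hardware friendly, unlike unstructured weight pruning.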
