Paper Title
Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters
Paper Authors
Paper Abstract
This paper focuses on filter-level network pruning. A novel pruning method, termed CLR-RNF, is proposed. We first reveal a "long-tail" pruning problem in magnitude-based weight pruning methods, and then propose a computation-aware measurement of individual weight importance, followed by a Cross-Layer Ranking (CLR) of weights to identify and remove the bottom-ranked weights. Consequently, the resulting per-layer sparsity constitutes the pruned network structure for our filter pruning. Then, we introduce a recommendation-based filter selection scheme where each filter recommends a group of its closest filters. To pick the preserved filters from these recommended groups, we further devise a k-Reciprocal Nearest Filter (RNF) selection scheme where the selected filters fall into the intersection of these recommended groups. Both our pruned network structure and the filter selection are non-learning processes, which significantly reduces the pruning complexity and differentiates our method from existing works. We conduct image classification on CIFAR-10 and ImageNet to demonstrate the superiority of our CLR-RNF over state-of-the-art methods. For example, on CIFAR-10, CLR-RNF removes 74.1% of FLOPs and 95.0% of parameters from VGGNet-16 with even a 0.3% accuracy improvement. On ImageNet, it removes 70.2% of FLOPs and 64.8% of parameters from ResNet-50 with only a 1.7% top-5 accuracy drop. Our project is at https://github.com/lmbxmu/CLR-RNF.
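The sketch below is a minimal NumPy illustration of the two components named in the abstract, not the authors' implementation (their code is at the project URL above). The function names, the concrete importance score (|w| scaled by a per-layer FLOP cost), the Euclidean filter distance, and the "most reciprocal recommendations" selection rule are all assumptions made for illustration.

```python
# Minimal sketch of the CLR and k-RNF ideas from the abstract; all concrete
# choices below (importance score, distance metric, selection rule) are assumed.
import numpy as np

def cross_layer_sparsity(layer_weights, layer_flop_costs, global_prune_ratio):
    """Cross-Layer Ranking (CLR) idea: rank all weights of all layers together
    by a computation-aware score; the share of a layer's weights that falls
    into the global bottom set gives that layer's sparsity."""
    scores, owners = [], []
    for idx, (w, cost) in enumerate(zip(layer_weights, layer_flop_costs)):
        s = np.abs(w).ravel() * cost              # assumed computation-aware score
        scores.append(s)
        owners.append(np.full(s.size, idx))
    scores, owners = np.concatenate(scores), np.concatenate(owners)
    n_prune = int(global_prune_ratio * scores.size)
    bottom_owners = owners[np.argsort(scores)[:n_prune]]
    return [np.sum(bottom_owners == i) / w.size
            for i, w in enumerate(layer_weights)]

def k_reciprocal_nearest_filters(filters, k, n_keep):
    """k-Reciprocal Nearest Filter (RNF) idea: each filter recommends its k
    closest filters; a filter is favoured when the filters it recommends also
    recommend it back.  Assumed selection rule: keep the n_keep filters with
    the most reciprocal recommendations."""
    flat = filters.reshape(len(filters), -1)
    dist = np.linalg.norm(flat[:, None] - flat[None, :], axis=-1)
    knn = np.argsort(dist, axis=1)[:, 1:k + 1]    # column 0 is the filter itself
    reciprocal_votes = np.zeros(len(filters), dtype=int)
    for i in range(len(filters)):
        for j in knn[i]:
            if i in knn[j]:                       # mutual (reciprocal) recommendation
                reciprocal_votes[i] += 1
    return np.argsort(-reciprocal_votes)[:n_keep]

# Toy usage: two conv layers; derive per-layer sparsity, then keep 8 of the
# 16 filters in the first layer.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((16, 3, 3, 3)), rng.standard_normal((32, 16, 3, 3))]
print(cross_layer_sparsity(layers, layer_flop_costs=[2.0, 1.0], global_prune_ratio=0.5))
print(k_reciprocal_nearest_filters(layers[0], k=4, n_keep=8))
```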