Paper Title

Pruning-aware Sparse Regularization for Network Pruning

Authors

Nanfei Jiang, Xu Zhao, Chaoyang Zhao, Yongqi An, Ming Tang, Jinqiao Wang

Abstract

Structural neural network pruning aims to remove the redundant channels in deep convolutional neural networks (CNNs) by pruning the filters of less importance to the final output accuracy. To reduce the degradation of performance after pruning, many methods utilize a loss with sparse regularization to produce structured sparsity. In this paper, we analyze these sparsity-training-based methods and find that the regularization of unpruned channels is unnecessary. Moreover, it restricts the network's capacity, which leads to under-fitting. To solve this problem, we propose a novel pruning method, named MaskSparsity, with pruning-aware sparse regularization. MaskSparsity imposes fine-grained sparse regularization on the specific filters selected by a pruning mask, rather than on all the filters of the model. Before the fine-grained sparse regularization of MaskSparsity, we can use many methods to obtain the pruning mask, such as running global sparse regularization. MaskSparsity achieves a 63.03% FLOPs reduction on ResNet-110 by removing 60.34% of the parameters, with no top-1 accuracy loss on CIFAR-10. On ILSVRC-2012, MaskSparsity reduces the FLOPs of ResNet-50 by more than 51.07%, with only a 0.76% loss in top-1 accuracy. The code is released at https://github.com/CASIA-IVA-Lab/MaskSparsity. Moreover, we have integrated the code of MaskSparsity into a PyTorch pruning toolkit, EasyPruner, at https://gitee.com/casia_iva_engineer/easypruner.
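To make the pruning-aware regularization concrete, below is a minimal PyTorch sketch of the idea, assuming network-slimming-style channel importance read from BatchNorm scale factors (gamma). The helper names `build_pruning_masks` and `masked_sparsity_loss` are hypothetical, and this is not the authors' exact implementation; see the repositories linked above for that.

```python
# A minimal sketch of the idea in the abstract, NOT the authors' exact
# implementation. The helpers `build_pruning_masks` and `masked_sparsity_loss`
# are hypothetical names. Following the common network-slimming convention,
# channel importance is read from the BatchNorm scale factors (gamma).

import torch
import torch.nn as nn


def build_pruning_masks(model: nn.Module, prune_ratio: float = 0.5) -> dict:
    """One simple way to obtain a pruning mask: globally rank |gamma| across
    all BN layers and mark the lowest `prune_ratio` fraction for pruning
    (mask value 1 = channel selected for pruning)."""
    gammas = torch.cat([m.weight.detach().abs().cpu()
                        for m in model.modules()
                        if isinstance(m, nn.BatchNorm2d)])
    threshold = torch.quantile(gammas, prune_ratio)
    return {name: (m.weight.detach().abs().cpu() <= threshold).float()
            for name, m in model.named_modules()
            if isinstance(m, nn.BatchNorm2d)}


def masked_sparsity_loss(model: nn.Module, masks: dict,
                         weight: float = 1e-4) -> torch.Tensor:
    """L1 penalty on BN scale factors, restricted by the pruning mask:
    only the to-be-pruned channels are regularized, so the kept channels
    remain unconstrained and the network's capacity is not restricted."""
    terms = []
    for name, module in model.named_modules():
        if isinstance(module, nn.BatchNorm2d) and name in masks:
            mask = masks[name].to(module.weight.device)
            terms.append((module.weight.abs() * mask).sum())
    if not terms:
        return torch.tensor(0.0)
    return weight * torch.stack(terms).sum()
```

During the sparsity-training stage, this penalty would simply be added to the task loss, e.g. `loss = criterion(outputs, targets) + masked_sparsity_loss(model, masks)`, after which the masked channels can be pruned with little accuracy loss.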
