Paper Title
Slimming Neural Networks using Adaptive Connectivity Scores
Paper Authors
Paper Abstract
In general, deep neural network (DNN) pruning methods fall into two categories: 1) weight-based deterministic constraints, and 2) probabilistic frameworks. While each approach has its merits and limitations, both are plagued by a common set of practical issues, such as trial-and-error analysis of sensitivity and hyper-parameters when pruning DNNs. In this work, we propose a new single-shot, fully automated pruning algorithm called Slimming Neural Networks using Adaptive Connectivity Scores (SNACS). Our proposed approach combines a probabilistic pruning framework with constraints on the underlying weight matrices, via a novel connectivity measure, at multiple levels to capitalize on the strengths of both approaches while addressing their deficiencies. In SNACS, we propose a fast hash-based estimator of Adaptive Conditional Mutual Information (ACMI) that uses a weight-based scaling criterion to evaluate the connectivity between filters and prune unimportant ones. To automatically determine the limit up to which a layer can be pruned, we propose a set of operating constraints that jointly define the upper pruning percentage limits across all the layers in a deep network. Finally, we define a novel sensitivity criterion for filters that measures the strength of their contributions to the succeeding layer and highlights critical filters that need to be completely protected from pruning. Through our experimental validation, we show that SNACS is over 17x faster than the nearest comparable method and is the state-of-the-art single-shot pruning method across three standard Dataset-DNN pruning benchmarks: CIFAR10-VGG16, CIFAR10-ResNet56, and ILSVRC2012-ResNet50.
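To make the connectivity-scoring idea in the abstract concrete, the sketch below scores each filter of a layer by an information-style dependence with the succeeding layer's activations, scales it by the magnitude of its outgoing weights (mirroring the "weight-based scaling criterion"), and prunes the lowest-scoring filters up to an upper pruning limit. This is only a minimal illustration under simplifying assumptions: it uses a plain histogram mutual-information estimate instead of the paper's hash-based ACMI estimator, and the function names (`mutual_information`, `connectivity_scores`, `filters_to_prune`) are hypothetical, not the authors' API.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based MI estimate between two 1-D signals.
    (A simple stand-in for the paper's hash-based ACMI estimator.)"""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def connectivity_scores(acts_l, acts_next, w_next):
    """Score each filter of layer l by its weight-scaled connectivity to layer l+1.

    acts_l:    (N, C_l)    flattened activations of layer l over N samples
    acts_next: (N, C_l+1)  flattened activations of layer l+1
    w_next:    (C_l+1, C_l) weights connecting layer l to layer l+1
    """
    scores = np.zeros(acts_l.shape[1])
    for i in range(acts_l.shape[1]):
        # average dependence of filter i with every unit in the next layer
        mi = np.mean([mutual_information(acts_l[:, i], acts_next[:, j])
                      for j in range(acts_next.shape[1])])
        # scale by the magnitude of outgoing weights (weight-based scaling)
        scores[i] = mi * np.abs(w_next[:, i]).sum()
    return scores

def filters_to_prune(scores, max_prune_fraction=0.5):
    """Select the lowest-scoring filters, capped by an upper pruning limit."""
    k = int(max_prune_fraction * len(scores))
    return np.argsort(scores)[:k]

# Toy usage on random data
rng = np.random.default_rng(0)
acts_l = rng.standard_normal((256, 8))
w_next = rng.standard_normal((4, 8))
acts_next = np.maximum(acts_l @ w_next.T, 0.0)
scores = connectivity_scores(acts_l, acts_next, w_next)
print("prune filters:", filters_to_prune(scores, max_prune_fraction=0.25))
```

In the paper's setting the per-layer cap would come from the proposed operating constraints and critical filters flagged by the sensitivity criterion would be exempt from pruning; here the cap is just a fixed fraction for illustration.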