Paper Title

Weight Pruning via Adaptive Sparsity Loss

Paper Authors

George Retsinas, Athena Elafrou, Georgios Goumas, Petros Maragos

Paper Abstract

Pruning neural networks has regained interest in recent years as a means to compress state-of-the-art deep neural networks and enable their deployment on resource-constrained devices. In this paper, we propose a robust compressive learning framework that efficiently prunes network parameters during training with minimal computational overhead. We incorporate fast mechanisms to prune individual layers and build upon these to automatically prune the entire network under a user-defined budget constraint. Key to our end-to-end network pruning approach is the formulation of an intuitive and easy-to-implement adaptive sparsity loss that is used to explicitly control sparsity during training, enabling efficient budget-aware optimization. Extensive experiments demonstrate the effectiveness of the proposed framework for image classification on the CIFAR and ImageNet datasets using different architectures, including AlexNet, ResNets and Wide ResNets.
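
The abstract describes the method only at a high level, so the following is a minimal, hypothetical PyTorch sketch of what an adaptive sparsity loss steering network sparsity toward a user-defined budget could look like. It is not the authors' exact formulation: the helper names (`layer_sparsity`, `adaptive_sparsity_loss`), the sigmoid surrogate for the zero-weight count, the fixed `threshold`, and the `lambda_sparsity` weighting are all illustrative assumptions.

```python
import torch
import torch.nn as nn


def layer_sparsity(weight: torch.Tensor, threshold: float) -> torch.Tensor:
    """Differentiable proxy for the fraction of weights whose magnitude
    falls below a pruning threshold (soft step via a sharp sigmoid)."""
    temperature = 100.0  # sharpness of the soft step; purely illustrative
    return torch.sigmoid(temperature * (threshold - weight.abs())).mean()


def adaptive_sparsity_loss(model: nn.Module,
                           target_sparsity: float,
                           threshold: float = 1e-2) -> torch.Tensor:
    """Penalize the gap between the current (soft) network sparsity and a
    user-defined budget, so sparsity is explicitly controlled during training."""
    sparsities = [layer_sparsity(m.weight, threshold)
                  for m in model.modules()
                  if isinstance(m, (nn.Conv2d, nn.Linear))]
    current = torch.stack(sparsities).mean()
    return (current - target_sparsity) ** 2


# Hypothetical use inside a standard training step:
# loss = criterion(model(x), y) + lambda_sparsity * adaptive_sparsity_loss(model, 0.9)
```

In this sketch the sparsity term is simply added to the task loss each step, so the optimizer trades classification accuracy against the budget gap; after training, weights below the threshold would be pruned outright.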
