Paper Title

Lookahead: A Far-Sighted Alternative of Magnitude-based Pruning

Authors

Sejun Park, Jaeho Lee, Sangwoo Mo, Jinwoo Shin

Abstract

Magnitude-based pruning is one of the simplest methods for pruning neural networks. Despite its simplicity, magnitude-based pruning and its variants have demonstrated remarkable performance in pruning modern architectures. Based on the observation that magnitude-based pruning in fact minimizes the Frobenius distortion of the linear operator corresponding to a single layer, we develop a simple pruning method, coined lookahead pruning, by extending the single-layer optimization to a multi-layer optimization. Our experimental results demonstrate that the proposed method consistently outperforms magnitude-based pruning on various networks, including VGG and ResNet, particularly in the high-sparsity regime. Code is available at https://github.com/alinlab/lookahead_pruning.
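To make the contrast concrete, below is a minimal NumPy sketch of the lookahead-scoring idea for fully connected layers. Magnitude-based pruning scores an entry by |W[i, j]| alone; lookahead pruning additionally weights it by the norms of the connected row of the previous layer and column of the next layer, approximating the Frobenius distortion of the composed three-layer operator. The function names (lookahead_scores, prune_lowest) and the exact tie-breaking are illustrative assumptions; the authors' actual PyTorch implementation in the repository above also handles convolutions and batch normalization, which this sketch omits.

```python
import numpy as np

def lookahead_scores(W_prev, W, W_next):
    """Lookahead score for each entry of the middle weight matrix W.

    Layers act as x -> W @ x, so W_prev maps n0 -> n1, W maps n1 -> n2,
    and W_next maps n2 -> n3. Zeroing W[i, j] perturbs the composed
    operator W_next @ W @ W_prev, and the Frobenius norm of that
    perturbation factors into the three terms below.
    """
    in_norms = np.linalg.norm(W_prev, axis=1)   # (n1,) norm flowing into input unit j
    out_norms = np.linalg.norm(W_next, axis=0)  # (n2,) norm flowing out of output unit i
    return np.abs(W) * np.outer(out_norms, in_norms)

def prune_lowest(W, scores, sparsity):
    """Zero out the `sparsity` fraction of W's entries with the smallest scores."""
    k = int(sparsity * W.size)
    if k == 0:
        return W.copy()
    idx = np.argpartition(scores.ravel(), k - 1)[:k]  # indices of the k smallest scores
    keep = np.ones(W.size, dtype=bool)
    keep[idx] = False
    return (W.ravel() * keep).reshape(W.shape)

# Toy usage: three random dense layers; prune the middle one to 50% sparsity.
rng = np.random.default_rng(0)
W0, W1, W2 = rng.normal(size=(8, 4)), rng.normal(size=(8, 8)), rng.normal(size=(4, 8))
W1_pruned = prune_lowest(W1, lookahead_scores(W0, W1, W2), sparsity=0.5)
print(np.count_nonzero(W1_pruned), "of", W1.size, "weights remain")
```

For the first and last layers, which lack one neighbor, a natural convention (and, as far as we can tell, the one the paper adopts) is to treat the missing neighbor as the identity, in which case the score reduces toward the plain magnitude criterion.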
