Paper Title


Exploring the Effects of Channel Sparsity on Neural Network Pruning for Acoustic Scene Classification

Authors

Yiqiang Cai, Shengchen Li

Abstract


Acoustic Scene Classification (ASC) algorithms are usually expected to be deployed in resource-constrained systems. Existing works reduce the complexity of ASC algorithms by pruning some components, e.g. pruning channels in neural networks. In practice, neural networks are often trained with sparsification so that unimportant channels can be identified and then pruned. However, little effort has been made to explore the impact of channel sparsity on neural network pruning. To fully utilize the benefits of pruning for ASC, and to ensure that the model performs consistently, we need a deeper understanding of channel sparsification and its effects. This paper examines the internal weights learned by convolutional neural networks that will undergo pruning. The study discusses how these weights can be used to construct a novel metric, Weight Skewness (WS), for quantifying the sparsity of channels. We also provide a new approach to comparing the performance of different pruning methods, one that balances the trade-off between accuracy and complexity. The experimental results demonstrate that 1) applying higher channel sparsity to models can achieve greater compression rates while maintaining acceptable levels of accuracy; 2) the choice of pruning method has little influence on result 1); 3) MobileNets benefit more significantly from channel sparsification than VGGNets and ResNets.
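The abstract does not give the formula for Weight Skewness. As a rough, hedged illustration of the idea — a skewness-based score over per-channel weight magnitudes — the sketch below computes the Fisher-Pearson skewness of the L1 norms of a convolutional layer's output channels. The function name `weight_skewness` and this exact formulation are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def weight_skewness(conv_weights: np.ndarray) -> float:
    """Illustrative (assumed) channel-sparsity score, NOT the paper's exact WS.

    Assumes `conv_weights` has shape (out_channels, in_channels, kh, kw),
    as in a standard 2-D convolution layer. Per-channel L1 norms are
    summarized by their Fisher-Pearson skewness: a long right tail (a few
    strong channels among many near-zero ones) yields a large positive
    value, read here as high channel sparsity.
    """
    # L1 norm of each output channel's filter.
    norms = np.abs(conv_weights).reshape(conv_weights.shape[0], -1).sum(axis=1)
    centered = norms - norms.mean()
    std = centered.std()
    if std == 0:
        return 0.0  # all channels identical in magnitude: no skew
    return float((centered ** 3).mean() / std ** 3)

# Synthetic example: a "sparsified" layer (most channels near zero, a few
# strong) should score higher than a dense layer of comparable size.
rng = np.random.default_rng(0)
sparse = np.concatenate([rng.normal(0, 0.01, (28, 3, 3, 3)),
                         rng.normal(0, 1.0, (4, 3, 3, 3))])
dense = rng.normal(0, 1.0, (32, 3, 3, 3))
print(weight_skewness(sparse), weight_skewness(dense))
```

Under such a score, channels in the heavy right tail would be kept while the many near-zero channels become candidates for pruning, which matches the sparsify-then-prune workflow the abstract describes.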
