Paper Title

Towards Accurate Quantization and Pruning via Data-free Knowledge Transfer

Paper Authors

Chen Zhu, Zheng Xu, Ali Shafahi, Manli Shu, Amin Ghiasi, Tom Goldstein

Paper Abstract

When large-scale training data is available, compact and accurate networks for deployment in resource-constrained environments can be obtained effectively through quantization and pruning. However, training data are often protected due to privacy concerns, and it is challenging to obtain compact networks without data. We study data-free quantization and pruning by transferring knowledge from trained large networks to compact networks. Auxiliary generators are trained simultaneously and adversarially with the targeted compact networks to generate synthetic inputs that maximize the discrepancy between the given large network and its quantized or pruned version. We show theoretically that the alternating optimization for the underlying minimax problem converges under mild conditions for pruning and quantization. Our data-free compact networks achieve accuracy competitive with networks trained and fine-tuned on training data. Our quantized and pruned networks achieve good performance while being more compact and lightweight. Further, we demonstrate that the compact structure and corresponding initialization from the Lottery Ticket Hypothesis can also help in data-free training.
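
The adversarial training described in the abstract can be viewed as a minimax game alternating between a generator step and a compact-network step. Below is a minimal sketch in PyTorch of such a loop, assuming a KL-divergence discrepancy measure and generic `teacher`, `student`, and `generator` modules; the function name, loss choice, optimizers, and hyperparameters are illustrative assumptions, not the authors' exact setup.

```python
import torch
import torch.nn.functional as F

def train_data_free(teacher, student, generator, steps=1000,
                    batch_size=64, latent_dim=128, device="cpu"):
    """Alternating minimax loop (a sketch): the generator is trained to
    synthesize inputs that maximize the teacher/student discrepancy,
    while the student (the quantized or pruned compact network) is
    trained to minimize it. The KL-divergence discrepancy is an assumed
    choice for illustration."""
    opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
    opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
    teacher.eval()  # the large pretrained network stays fixed
    for _ in range(steps):
        z = torch.randn(batch_size, latent_dim, device=device)

        # Generator step: ascend the discrepancy between teacher and student.
        x = generator(z)
        with torch.no_grad():
            t_logits = teacher(x)
        s_logits = student(x)
        disc = F.kl_div(F.log_softmax(s_logits, dim=1),
                        F.softmax(t_logits, dim=1), reduction="batchmean")
        opt_g.zero_grad()
        (-disc).backward()  # maximize by descending the negated loss
        opt_g.step()

        # Student step: descend the discrepancy on fresh synthetic inputs.
        x = generator(z).detach()  # no gradient back into the generator
        with torch.no_grad():
            t_logits = teacher(x)
        s_logits = student(x)
        disc = F.kl_div(F.log_softmax(s_logits, dim=1),
                        F.softmax(t_logits, dim=1), reduction="batchmean")
        opt_s.zero_grad()
        disc.backward()
        opt_s.step()
```

The alternating generator/student updates correspond to the alternating optimization of the underlying minimax problem whose convergence the paper analyzes.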
