Paper Title

A Progressive Sub-Network Searching Framework for Dynamic Inference

Paper Authors

Li Yang, Zhezhi He, Yu Cao, Deliang Fan

Paper Abstract

Many techniques, such as model compression, have been developed to make Deep Neural Network (DNN) inference more efficient. Nevertheless, DNNs still lack excellent run-time dynamic inference capability that would enable users to trade off accuracy and computation complexity (i.e., latency on the target hardware) after model deployment, based on dynamic requirements and environments. Such a research direction has recently drawn great attention, where one realization is to train the target DNN through a multiple-term objective function consisting of cross-entropy terms from multiple sub-nets. Our investigation in this work shows that the performance of dynamic inference relies heavily on the quality of sub-net sampling. With the objective of constructing a dynamic DNN and searching multiple high-quality sub-nets at minimal searching cost, we propose a progressive sub-net searching framework that embeds several effective techniques, including trainable noise ranking, channel group and fine-tuning threshold setting, and sub-net re-selection. The proposed framework empowers the target DNN with better dynamic inference capability and outperforms prior works on both the CIFAR-10 and ImageNet datasets, as shown by comprehensive experiments on different network structures. Taking ResNet18 as an example, our proposed method achieves much better dynamic inference accuracy than the popular Universally Slimmable Network, by 4.4% maximally and 2.3% on average, on the ImageNet dataset with the same model size.
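To illustrate the "multiple-term objective function" mentioned in the abstract, below is a minimal sketch (not the authors' released code): the training loss sums cross-entropy terms from several sub-nets sampled at different channel widths. The toy slimmable layer, the width list, and all names are illustrative assumptions.

```python
# A minimal sketch of the multiple-term objective from the abstract: the loss
# sums cross-entropy terms over several sub-net widths. The slimmable layer,
# the width list, and the toy network below are illustrative assumptions,
# not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SlimmableLinear(nn.Linear):
    """Linear layer whose number of active output channels can shrink at run time."""
    def __init__(self, in_features, out_features):
        super().__init__(in_features, out_features)
        self.width = 1.0  # fraction of output channels currently active

    def forward(self, x):
        out_ch = max(1, int(self.out_features * self.width))
        return F.linear(x, self.weight[:out_ch], self.bias[:out_ch])

class TinySlimmableNet(nn.Module):
    def __init__(self, in_dim=32, hidden=64, num_classes=10):
        super().__init__()
        self.fc1 = SlimmableLinear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, num_classes)

    def set_width(self, width):
        self.fc1.width = width

    def forward(self, x):
        h = F.relu(self.fc1(x))
        # Zero-pad the inactive channels so fc2 always receives `hidden` features.
        h = F.pad(h, (0, self.fc2.in_features - h.shape[1]))
        return self.fc2(h)

def multi_subnet_loss(model, x, y, widths=(0.25, 0.5, 0.75, 1.0)):
    """Multiple-term objective: one cross-entropy term per sampled sub-net width."""
    loss = 0.0
    for w in widths:
        model.set_width(w)
        loss = loss + F.cross_entropy(model(x), y)
    return loss

# Toy usage: a single optimization step on random data.
model = TinySlimmableNet()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
optimizer.zero_grad()
multi_subnet_loss(model, x, y).backward()
optimizer.step()
```

In the paper's framing, the quality of the sampled sub-nets (the fixed `widths` here) is precisely what the proposed progressive searching framework is meant to improve; this sketch only uses evenly spaced widths for illustration.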
