Paper Title

BS-NAS: Broadening-and-Shrinking One-Shot NAS with Searchable Numbers of Channels

Paper Authors

Zan Shen, Jiang Qian, Bojin Zhuang, Shaojun Wang, Jing Xiao

Paper Abstract

One-Shot methods have evolved into one of the most popular approaches in Neural Architecture Search (NAS) thanks to weight sharing and a single training of the supernet. However, existing methods generally suffer from two issues: a predetermined number of channels in each layer, which is suboptimal; and model averaging effects and poor ranking correlation caused by weight coupling and a continuously expanding search space. To explicitly address these issues, this paper proposes a Broadening-and-Shrinking One-Shot NAS (BS-NAS) framework, in which 'broadening' refers to broadening the search space with a spring block that enables searching over the number of channels during supernet training, while 'shrinking' refers to a novel shrinking strategy that gradually turns off underperforming operations. These innovations broaden the search space for wider representations and then shrink it by gradually removing underperforming operations, followed by an evolutionary algorithm that efficiently searches for the optimal architecture. Extensive experiments on ImageNet illustrate the effectiveness of the proposed BS-NAS and its state-of-the-art performance.
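
To make the two ideas in the abstract concrete, below is a minimal Python/PyTorch sketch of (1) a convolution whose output width is sampled by slicing a shared weight tensor, in the spirit of the "spring block" for searchable channel numbers, and (2) a helper that gradually turns off the worst-scoring candidate operations. All names, parameters, and details here are assumptions for illustration only, not the paper's actual implementation.

```python
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelSearchableConv2d(nn.Module):
    """Hypothetical stand-in for a 'spring block': the output channel count is
    sampled per forward pass by slicing one shared weight tensor, so all
    candidate widths reuse the same supernet weights."""

    def __init__(self, in_ch, candidate_out_chs, kernel_size=3):
        super().__init__()
        self.candidate_out_chs = sorted(candidate_out_chs)  # e.g. [32, 48, 64]
        max_out = self.candidate_out_chs[-1]
        self.weight = nn.Parameter(
            torch.randn(max_out, in_ch, kernel_size, kernel_size) * 0.01)
        self.bias = nn.Parameter(torch.zeros(max_out))

    def forward(self, x, out_ch=None):
        # Uniformly sample a channel count during supernet training if none is given.
        if out_ch is None:
            out_ch = random.choice(self.candidate_out_chs)
        w, b = self.weight[:out_ch], self.bias[:out_ch]  # share the leading channels
        return F.conv2d(x, w, b, padding=w.shape[-1] // 2)


def shrink_search_space(op_scores, active_ops, drop_fraction=0.25):
    """Gradually turn off the lowest-scoring candidate operations of a layer.
    `op_scores` maps an op name to a validation score estimated with supernet weights."""
    ranked = sorted(active_ops, key=lambda op: op_scores[op])
    n_drop = max(1, int(len(ranked) * drop_fraction))
    return ranked[n_drop:]  # the survivors remain searchable in later epochs
```

In this reading, the supernet is trained with randomly sampled widths and operations, the shrinking step is applied periodically to prune the op set per layer, and an evolutionary search then picks the final architecture from the shrunken space; the exact schedules and scoring rules are those of the paper, not of this sketch.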
