Paper Title
Proxyless Neural Architecture Adaptation for Supervised Learning and Self-Supervised Learning
Paper Authors
Paper Abstract
Recently, Neural Architecture Search (NAS) methods have been introduced and show impressive performance on many benchmarks. Among these NAS studies, the Neural Architecture Transformer (NAT) aims to adapt a given neural architecture to improve performance while maintaining computational cost. However, NAT lacks reproducibility and requires an additional architecture adaptation process before network weight training. In this paper, we propose a proxyless neural architecture adaptation method that is reproducible and efficient. Our method can be applied to both supervised learning and self-supervised learning, and it shows stable performance across various architectures. Extensive reproducibility experiments on two datasets, i.e., CIFAR-10 and Tiny ImageNet, demonstrate that the proposed method clearly outperforms NAT and is applicable to other models and datasets.