Title
A Survey of Supernet Optimization and its Applications: Spatial and Temporal Optimization for Neural Architecture Search
Authors
Abstract
This survey focuses on categorizing and evaluating the methods of supernet optimization in the field of Neural Architecture Search (NAS). Supernet optimization involves training a single, over-parameterized network that encompasses the search space of all possible network architectures. The survey analyses supernet optimization methods based on their approaches to spatial and temporal optimization. Spatial optimization relates to optimizing the architecture and parameters of the supernet and its subnets, while temporal optimization deals with improving the efficiency of selecting architectures from the supernet. The benefits, limitations, and potential applications of these methods in various tasks and settings, including transferability, domain generalization, and Transformer models, are also discussed.
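The core idea the abstract describes can be made concrete with a toy sketch: a supernet whose layers each hold several candidate operations sharing one parameter pool, from which subnets are sampled by picking one operation per layer. This is a hypothetical, minimal illustration (all names and operations are invented for exposition, not taken from the paper or any specific NAS method):

```python
import random

# Candidate operations per layer; each op reads its own slot in the shared
# parameter pool. Real supernets use full weight tensors per op instead.
CANDIDATE_OPS = {
    "identity": lambda x, w: x,
    "scale":    lambda x, w: w * x,
    "shift":    lambda x, w: x + w,
}

class SupernetLayer:
    def __init__(self):
        # One shared scalar per candidate op (stands in for full weights).
        self.weights = {name: 1.0 for name in CANDIDATE_OPS}

    def forward(self, x, op_name):
        return CANDIDATE_OPS[op_name](x, self.weights[op_name])

class Supernet:
    def __init__(self, depth=3):
        self.layers = [SupernetLayer() for _ in range(depth)]

    def sample_subnet(self, rng):
        # Architecture selection ("temporal" side): one op choice per layer.
        return [rng.choice(list(CANDIDATE_OPS)) for _ in self.layers]

    def forward(self, x, subnet):
        # The sampled subnet reuses the supernet's shared weights
        # ("spatial" side: one over-parameterized network covers all subnets).
        for layer, op_name in zip(self.layers, subnet):
            x = layer.forward(x, op_name)
        return x

rng = random.Random(0)
net = Supernet(depth=3)
arch = net.sample_subnet(rng)   # e.g. ['scale', 'identity', 'shift']
out = net.forward(2.0, arch)
```

In this picture, spatial optimization corresponds to training the shared `weights` so that every sampled subnet performs well, while temporal optimization corresponds to making `sample_subnet` (or a learned replacement for it) find good architectures efficiently.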