Paper Title

Architecture Augmentation for Performance Predictor Based on Graph Isomorphism

Paper Authors

Xiangning Xie, Yuqiao Liu, Yanan Sun, Mengjie Zhang, Kay Chen Tan

Paper Abstract

Neural Architecture Search (NAS) can automatically design architectures for deep neural networks (DNNs) and has become one of the hottest research topics in the current machine learning community. However, NAS is often computationally expensive because a large number of DNNs need to be trained to obtain their performance during the search process. Performance predictors can greatly alleviate the prohibitive cost of NAS by directly predicting the performance of DNNs. However, building a satisfactory performance predictor depends heavily on having enough trained DNN architectures, which are difficult to obtain in most scenarios. To solve this critical issue, we propose an effective DNN architecture augmentation method named GIAug in this paper. Specifically, we first propose a mechanism based on graph isomorphism, which has the merit of efficiently generating a factorial of $\boldsymbol n$ (i.e., $\boldsymbol n!$) diverse annotated architectures from a single architecture having $\boldsymbol n$ nodes. In addition, we design a generic method to encode the architectures into a form suitable for most prediction models, so that GIAug can be flexibly utilized by various existing performance predictor-based NAS algorithms. We perform extensive experiments on the CIFAR-10 and ImageNet benchmark datasets across small-, medium-, and large-scale search spaces. The experiments show that GIAug can significantly enhance the performance of most state-of-the-art peer predictors. In addition, GIAug can save up to three orders of magnitude in computation cost on ImageNet while achieving performance similar to state-of-the-art NAS algorithms.
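For intuition, the sketch below illustrates the graph-isomorphism idea behind GIAug; it is not the authors' implementation. A cell architecture encoded as a DAG adjacency matrix plus a per-node operation list can be re-indexed under any permutation of its $n$ nodes, yielding up to $n!$ isomorphic encodings that all inherit the original architecture's measured performance as their label. The function names (`permute_architecture`, `augment`) and the toy encoding are hypothetical.

```python
import itertools
import numpy as np

def permute_architecture(adjacency, ops, perm):
    """Re-index a DAG-encoded architecture under a node permutation.

    adjacency: (n, n) 0/1 matrix; adjacency[i][j] = 1 if node i feeds node j.
    ops:       length-n list of operation names, one per node.
    perm:      a permutation of range(n), mapping old index i -> new index perm[i].
    The permuted encoding describes the same computation graph (an isomorphic
    DAG), so it can reuse the original architecture's performance annotation.
    """
    n = len(ops)
    new_adj = np.zeros_like(adjacency)
    for i in range(n):
        for j in range(n):
            new_adj[perm[i]][perm[j]] = adjacency[i][j]
    new_ops = [None] * n
    for i in range(n):
        new_ops[perm[i]] = ops[i]
    return new_adj, new_ops

def augment(adjacency, ops, accuracy, max_copies=None):
    """Generate isomorphic (architecture, label) pairs from one annotated cell."""
    n = len(ops)
    samples = []
    for k, perm in enumerate(itertools.permutations(range(n))):
        if max_copies is not None and k >= max_copies:
            break
        adj_p, ops_p = permute_architecture(adjacency, ops, perm)
        samples.append((adj_p, ops_p, accuracy))  # label is shared across copies
    return samples

# Toy 3-node cell: input -> conv3x3 -> output (hypothetical encoding).
adj = np.array([[0, 1, 0],
                [0, 0, 1],
                [0, 0, 0]])
ops = ["input", "conv3x3", "output"]
data = augment(adj, ops, accuracy=0.942)
print(len(data))  # 3! = 6 annotated encodings from one trained architecture
```

Note that this only enlarges the predictor's training set when the architecture encoding distinguishes node orderings; under a fully permutation-invariant encoding, the permuted copies would collapse into a single sample.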
