Paper Title

Conditional Neural Architecture Search

Authors

Sheng-Chun Kao, Arun Ramamurthy, Reed Williams, Tushar Krishna

Abstract

Designing resource-efficient Deep Neural Networks (DNNs) is critical to deploying deep learning solutions on edge platforms, due to their diverse performance, power, and memory budgets. Unfortunately, a well-trained ML model often does not fit the constraints of the target edge platform, leading to a long iteration of model reduction and retraining. Moreover, an ML model optimized for platform A is often not suitable when deployed on another platform B, triggering yet another round of retraining. We propose a conditional neural architecture search method using a GAN, which produces feasible ML models for different platforms. We present a new workflow to generate constraint-optimized DNN models. This is the first work to bring conditional and adversarial techniques into the Neural Architecture Search domain. We verify the method on regression problems and on CIFAR-10 classification. The proposed workflow successfully generates resource-optimized MLP- or CNN-based networks.
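
To make the core idea concrete, below is a minimal sketch of what a conditional GAN for architecture search could look like: a generator conditioned on a platform-constraint vector emits an architecture encoding, and a discriminator judges encodings jointly with the constraint. The class names, dimensions, and the encoding scheme (per-layer width fractions of an MLP) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of conditional NAS with a GAN (not the paper's code).
# The generator maps (noise, constraint) -> an architecture encoding;
# here the encoding is assumed to be per-layer width fractions of an MLP.
import torch
import torch.nn as nn

class ConditionalArchGenerator(nn.Module):
    """G(z, c): noise z plus a constraint vector c (e.g., a normalized
    parameter/memory budget) -> architecture encoding."""
    def __init__(self, noise_dim=16, cond_dim=1, num_layers=4, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + cond_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_layers),
            nn.Sigmoid(),  # each output in (0, 1): fraction of a max layer width
        )

    def forward(self, z, c):
        return self.net(torch.cat([z, c], dim=-1))

class ArchDiscriminator(nn.Module):
    """D(a, c): scores whether architecture encoding a looks like a
    feasible, well-performing design for constraint c."""
    def __init__(self, num_layers=4, cond_dim=1, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_layers + cond_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, a, c):
        return self.net(torch.cat([a, c], dim=-1))

# Sampling architectures for two different platform budgets:
G, D = ConditionalArchGenerator(), ArchDiscriminator()
z = torch.randn(2, 16)
budgets = torch.tensor([[0.2], [0.9]])  # tight vs. loose resource budget
widths = G(z, budgets)                  # per-layer width fractions in (0, 1)
scores = D(widths, budgets)             # adversarial feasibility scores
print(widths, scores, sep="\n")
```

Under this reading, conditioning lets a single trained generator be queried with a new platform's budget to sample candidate networks directly, instead of rerunning model reduction and retraining for each target platform.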
