Paper Title

HyperSTAR: Task-Aware Hyperparameters for Deep Networks

Paper Authors

Gaurav Mittal, Chang Liu, Nikolaos Karianakis, Victor Fragoso, Mei Chen, Yun Fu

Paper Abstract

While deep neural networks excel in solving visual recognition tasks, they require significant effort to find hyperparameters that make them work optimally. Hyperparameter Optimization (HPO) approaches have automated the process of finding good hyperparameters, but they do not adapt to a given task (they are task-agnostic), making them computationally inefficient. To reduce HPO time, we present HyperSTAR (System for Task Aware Hyperparameter Recommendation), a task-aware method to warm-start HPO for deep neural networks. HyperSTAR ranks and recommends hyperparameters by predicting their performance conditioned on a joint dataset-hyperparameter space. It learns a dataset (task) representation along with the performance predictor directly from raw images in an end-to-end fashion. The recommendations, when integrated with an existing HPO method, make it task-aware and significantly reduce the time to achieve optimal performance. We conduct extensive experiments on 10 publicly available large-scale image classification datasets over two different network architectures, validating that HyperSTAR evaluates 50% fewer configurations than existing methods to achieve the best performance. We further demonstrate that HyperSTAR makes Hyperband (HB) task-aware, achieving the optimal accuracy in just 25% of the budget required by both vanilla HB and Bayesian Optimized HB (BOHB).
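
To make the recommendation step concrete, the sketch below shows one way a task-aware warm-start can work: a learned predictor scores candidate hyperparameter configurations conditioned on a task (dataset) embedding, and the top-ranked configurations are used to seed an HPO method such as Hyperband instead of sampling uniformly. This is a minimal PyTorch illustration under stated assumptions, not the paper's implementation; the `PerformancePredictor` and `recommend` names, the layer sizes, and the random inputs are all hypothetical.

```python
import torch
import torch.nn as nn


class PerformancePredictor(nn.Module):
    """Scores a (task, hyperparameter) pair with a small MLP.

    In HyperSTAR the task representation is learned end-to-end from raw
    images; here a fixed-size task embedding stands in for it, and all
    dimensions are illustrative assumptions.
    """

    def __init__(self, task_dim: int = 64, hp_dim: int = 16):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(task_dim + hp_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 1),  # predicted performance (e.g., val accuracy)
        )

    def forward(self, task_emb: torch.Tensor, hp_enc: torch.Tensor) -> torch.Tensor:
        # Condition the prediction on the joint dataset-hyperparameter space
        # by concatenating the two representations.
        return self.head(torch.cat([task_emb, hp_enc], dim=-1)).squeeze(-1)


def recommend(predictor: PerformancePredictor, task_emb: torch.Tensor,
              hp_pool: torch.Tensor, k: int = 8) -> torch.Tensor:
    """Rank candidate configurations for a new task; the top-k indices can
    warm-start an HPO method such as Hyperband."""
    with torch.no_grad():
        scores = predictor(task_emb.expand(len(hp_pool), -1), hp_pool)
    return scores.argsort(descending=True)[:k]


# Usage sketch: 100 candidate configs, each encoded as a 16-d vector.
predictor = PerformancePredictor()
task_emb = torch.randn(1, 64)    # embedding of the new dataset (assumed given)
hp_pool = torch.randn(100, 16)   # encoded hyperparameter candidates
top_k = recommend(predictor, task_emb, hp_pool, k=8)
print(top_k)  # indices of configurations to evaluate first
```

In practice the predictor would be trained on observed (dataset, configuration, performance) triples from previous tasks, which is what lets the ranking transfer to an unseen dataset.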
