Title

Amazon SageMaker Automatic Model Tuning: Scalable Gradient-Free Optimization

Authors

Valerio Perrone, Huibin Shen, Aida Zolic, Iaroslav Shcherbatyi, Amr Ahmed, Tanya Bansal, Michele Donini, Fela Winkelmolen, Rodolphe Jenatton, Jean Baptiste Faddoul, Barbara Pogorzelska, Miroslav Miladinovic, Krishnaram Kenthapadi, Matthias Seeger, Cédric Archambeau

Abstract

Tuning complex machine learning systems is challenging. Machine learning typically requires to set hyperparameters, be it regularization, architecture, or optimization parameters, whose tuning is critical to achieve good predictive performance. To democratize access to machine learning systems, it is essential to automate the tuning. This paper presents Amazon SageMaker Automatic Model Tuning (AMT), a fully managed system for gradient-free optimization at scale. AMT finds the best version of a trained machine learning model by repeatedly evaluating it with different hyperparameter configurations. It leverages either random search or Bayesian optimization to choose the hyperparameter values resulting in the best model, as measured by the metric chosen by the user. AMT can be used with built-in algorithms, custom algorithms, and Amazon SageMaker pre-built containers for machine learning frameworks. We discuss the core functionality, system architecture, our design principles, and lessons learned. We also describe more advanced features of AMT, such as automated early stopping and warm-starting, showing in experiments their benefits to users.
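The abstract describes launching tuning jobs that search hyperparameter ranges with random search or Bayesian optimization, guided by a user-chosen metric and optional automated early stopping. Below is a minimal sketch of what such a job looks like with the SageMaker Python SDK; the container image, IAM role, and S3 paths are hypothetical placeholders, and the ranges and metric are illustrative only.

```python
# Sketch of an AMT tuning job using the SageMaker Python SDK (v2).
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.tuner import HyperparameterTuner, ContinuousParameter, IntegerParameter

session = sagemaker.Session()

# Any built-in algorithm, custom container, or pre-built framework container can be used.
estimator = Estimator(
    image_uri="<training-image-uri>",      # placeholder
    role="<execution-role-arn>",           # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://<bucket>/output",    # placeholder
    sagemaker_session=session,
)

# Hyperparameter ranges AMT will search over (illustrative).
hyperparameter_ranges = {
    "eta": ContinuousParameter(0.01, 0.5),
    "max_depth": IntegerParameter(3, 10),
}

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation:auc",  # metric chosen by the user
    objective_type="Maximize",
    hyperparameter_ranges=hyperparameter_ranges,
    strategy="Bayesian",                     # or "Random" for random search
    max_jobs=20,
    max_parallel_jobs=4,
    early_stopping_type="Auto",              # automated early stopping
)

# Channel names and S3 inputs are placeholders; custom containers may also
# need metric_definitions so AMT can parse the objective metric from logs.
tuner.fit({"train": "s3://<bucket>/train", "validation": "s3://<bucket>/validation"})
```

For warm-starting, the SDK also accepts a `warm_start_config` (e.g. `WarmStartConfig` with `WarmStartTypes.IDENTICAL_DATA_AND_ALGORITHM`) so a new tuning job can reuse results from earlier jobs, as discussed in the paper.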
