Paper Title

Zero-Order Optimization for Gaussian Process-based Model Predictive Control

Paper Authors

Amon Lahr, Andrea Zanelli, Andrea Carron, Melanie N. Zeilinger

Paper Abstract

By enabling constraint-aware online model adaptation, model predictive control using Gaussian process (GP) regression has exhibited impressive performance in real-world applications and received considerable attention in the learning-based control community. Yet, solving the resulting optimal control problem in real-time generally remains a major challenge, due to i) the increased number of augmented states in the optimization problem, as well as ii) computationally expensive evaluations of the posterior mean and covariance and their respective derivatives. To tackle these challenges, we employ i) a tailored Jacobian approximation in a sequential quadratic programming (SQP) approach, and combine it with ii) a parallelizable GP inference and automatic differentiation framework. Reducing the numerical complexity with respect to the state dimension $n_x$ for each SQP iteration from $\mathcal{O}(n_x^6)$ to $\mathcal{O}(n_x^3)$, and accelerating GP evaluations on a graphical processing unit, the proposed algorithm computes suboptimal, yet feasible solutions at drastically reduced computation times and exhibits favorable local convergence properties. Numerical experiments verify the scaling properties and investigate the runtime distribution across different parts of the algorithm.
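
To make the quoted complexity reduction concrete, the following is a minimal sketch of the zero-order idea for a generic GP-MPC formulation with linearization-based covariance propagation; the symbols ($f$, $B^d$, $\mu^d$, $\Sigma^d$, $\Sigma^x_k$, $A_k$) are assumed for illustration and need not match the paper's exact notation.

\begin{align*}
  x_{k+1}        &= f(x_k, u_k) + B^d \mu^d(x_k, u_k), \\
  \Sigma^x_{k+1} &= A_k \Sigma^x_k A_k^\top + B^d \Sigma^d(x_k, u_k) (B^d)^\top,
  \qquad A_k = \frac{\partial}{\partial x_k}\bigl( f(x_k, u_k) + B^d \mu^d(x_k, u_k) \bigr).
\end{align*}

Treating the covariances $\Sigma^x_k$ as decision variables augments the state to dimension $\mathcal{O}(n_x^2)$, and a QP solve that is cubic in the state dimension then costs $\mathcal{O}(n_x^6)$ per SQP iteration. In the zero-order variant sketched here, $\Sigma^x_k$ is instead propagated outside the QP at the current iterate and its sensitivities are dropped from the Jacobian, so each QP involves only the $n_x$-dimensional nominal state, recovering the $\mathcal{O}(n_x^3)$ scaling stated in the abstract.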
