Paper Title

Computing Lyapunov functions using deep neural networks

Paper Author

Grüne, Lars

Paper Abstract

We propose a deep neural network architecture and a training algorithm for computing approximate Lyapunov functions of systems of nonlinear ordinary differential equations. Under the assumption that the system admits a compositional Lyapunov function, we prove that the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality. We show that nonlinear systems satisfying a small-gain condition admit compositional Lyapunov functions. Numerical examples in up to ten space dimensions illustrate the performance of the training scheme.
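
The abstract describes training a neural network V(x) to approximate a Lyapunov function, i.e. a function satisfying V(0) = 0, V(x) > 0 for x ≠ 0, and a negative orbital derivative ∇V(x)·f(x) < 0 along the dynamics x' = f(x). The following is a minimal, hypothetical sketch of such a training loop in PyTorch; the example dynamics f, the plain fully connected network, the sampling domain, and the hinge-type penalty weights are illustrative assumptions and do not reproduce the paper's compositional architecture or training algorithm.

import torch
import torch.nn as nn

def f(x):
    # Illustrative 2D nonlinear dynamics x' = f(x) (an assumption, not from the paper).
    x1, x2 = x[:, 0:1], x[:, 1:2]
    return torch.cat([-x1 + x2, -x2 - x1 ** 3], dim=1)

class LyapunovNet(nn.Module):
    # Plain fully connected candidate V(x); the paper's compositional
    # (small-gain based) architecture is not reproduced here.
    def __init__(self, dim, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x):
        # Shift the output so the candidate vanishes at the equilibrium x = 0.
        zero = torch.zeros(1, x.shape[1])
        return self.net(x) - self.net(zero)

dim = 2
V = LyapunovNet(dim)
opt = torch.optim.Adam(V.parameters(), lr=1e-3)

for step in range(2000):
    x = 2.0 * torch.rand(256, dim) - 1.0      # training points sampled in [-1, 1]^2
    x.requires_grad_(True)
    v = V(x)
    grad_v = torch.autograd.grad(v.sum(), x, create_graph=True)[0]
    orbital = (grad_v * f(x)).sum(dim=1, keepdim=True)   # orbital derivative ∇V(x)·f(x)
    r2 = (x ** 2).sum(dim=1, keepdim=True)
    # Hinge penalties enforcing V(x) >= 0.1*|x|^2 and ∇V(x)·f(x) <= -0.1*|x|^2 on the samples.
    loss = torch.relu(0.1 * r2 - v).mean() + torch.relu(orbital + 0.1 * r2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

Note that a network trained this way only satisfies the Lyapunov conditions approximately and only on the sampled region; assessing the remaining violation (e.g. the maximum of the penalty terms on a fine grid or fresh samples) is a separate verification step.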
