Paper Title

Learning Differential Equations that are Easy to Solve

Paper Authors

Jacob Kelly, Jesse Bettencourt, Matthew James Johnson, David Duvenaud

Paper Abstract

Differential equations parameterized by neural networks become expensive to solve numerically as training progresses. We propose a remedy that encourages learned dynamics to be easier to solve. Specifically, we introduce a differentiable surrogate for the time cost of standard numerical solvers, using higher-order derivatives of solution trajectories. These derivatives are efficient to compute with Taylor-mode automatic differentiation. Optimizing this additional objective trades model performance against the time cost of solving the learned dynamics. We demonstrate our approach by training substantially faster, while nearly as accurate, models in supervised classification, density estimation, and time-series modelling tasks.
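The following is a minimal sketch of the idea described in the abstract, not the authors' released code: use Taylor-mode automatic differentiation (available in JAX as jax.experimental.jet) to compute higher-order time derivatives of the ODE solution trajectory, and penalize their magnitude as a differentiable surrogate for solver cost. The dynamics function f, the truncation order K, and the regularization weight lam are illustrative placeholders.

```python
import jax
import jax.numpy as jnp
from jax.experimental.jet import jet


def f(z, t):
    # Toy dynamics dz/dt = f(z, t); a neural network would go here.
    return jnp.tanh(z) * jnp.cos(t)


def taylor_coefficients(f, z0, t0, K=3):
    """Taylor coefficients z_1, ..., z_K of the trajectory z(t) at (z0, t0).

    Uses the recursion: the k-th Taylor coefficient of f(z(t), t) is the
    (k+1)-th coefficient of z(t), computed with jet in augmented
    (state, time) coordinates.
    """
    def g(zt):
        z, t = zt[:-1], zt[-1]
        return jnp.concatenate([f(z, t), jnp.ones(1)])  # append dt/dt = 1

    zt0 = jnp.concatenate([z0, jnp.array([t0])])
    coeffs = [g(zt0)]                       # z_1 = dz/dt at t0
    for _ in range(K - 1):
        _, series = jet(g, (zt0,), (coeffs,))
        coeffs = coeffs + [series[-1]]      # append the next-order coefficient
    return [c[:-1] for c in coeffs]         # drop the time component


def solver_cost_surrogate(f, z0, t0, K=3):
    # Mean squared K-th Taylor coefficient: large values indicate dynamics
    # that force adaptive solvers to take small steps.
    zK = taylor_coefficients(f, z0, t0, K)[-1]
    return jnp.mean(zK ** 2)


def loss(z0, lam=0.01):
    # Regularized objective: task loss plus the surrogate, weighted by lam.
    task_loss = jnp.sum(z0 ** 2)            # placeholder for the real task loss
    return task_loss + lam * solver_cost_surrogate(f, z0, t0=0.0)


print(jax.grad(loss)(jnp.ones(4)))
```

In training, the surrogate term is simply added to the task objective, so gradient descent trades prediction accuracy against how expensive the learned dynamics are to integrate.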
