Paper Title


Flexible Differentiable Optimization via Model Transformations

Authors

Mathieu Besançon, Joaquim Dias Garcia, Benoît Legat, Akshay Sharma

Abstract


We introduce DiffOpt.jl, a Julia library to differentiate through the solution of optimization problems with respect to arbitrary parameters present in the objective and/or constraints. The library builds upon MathOptInterface, thus leveraging the rich ecosystem of solvers and composing well with modeling languages like JuMP. DiffOpt offers both forward and reverse differentiation modes, enabling multiple use cases from hyperparameter optimization to backpropagation and sensitivity analysis, bridging constrained optimization with end-to-end differentiable programming. DiffOpt is built on two known rules for differentiating quadratic programming and conic programming standard forms. However, thanks to the ability to differentiate through model transformations, the user is not limited to these forms and can differentiate with respect to the parameters of any model that can be reformulated into these standard forms. This notably includes programs mixing affine conic constraints with convex quadratic constraints or objective functions.
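As a rough illustration of the workflow the abstract describes, the sketch below builds a JuMP model with a DiffOpt-wrapped solver, solves it, and runs a forward-mode sensitivity query. This is a hedged sketch based on the public DiffOpt.jl documentation, not code from the paper itself; the solver choice (HiGHS) and the exact attribute names (`DiffOpt.ForwardConstraintFunction`, `DiffOpt.ForwardVariablePrimal`) are assumptions that may differ across DiffOpt versions.

```julia
# Sketch: differentiating an LP solution w.r.t. a constraint parameter.
# Assumes DiffOpt.jl, JuMP, and HiGHS are installed; API names follow the
# DiffOpt documentation and may vary between releases.
using JuMP, DiffOpt, HiGHS

# Wrap any MathOptInterface-compatible solver in DiffOpt's diff optimizer.
model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))

@variable(model, x >= 0)
@constraint(model, con, x >= 3)   # perturb the right-hand side below
@objective(model, Min, 2x)
optimize!(model)

# Forward mode: seed a perturbation of the constraint, then propagate it
# to obtain the directional derivative of the optimal primal solution.
MOI.set(model, DiffOpt.ForwardConstraintFunction(), con, 0.0 * index(x) - 1.0)
DiffOpt.forward_differentiate!(model)
dx = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)
```

Reverse mode follows the same pattern with `DiffOpt.ReverseVariablePrimal` as the seed and `DiffOpt.reverse_differentiate!`, which is the direction used for backpropagation through the solver.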
