Paper Title

A comparison study of deep Galerkin method and deep Ritz method for elliptic problems with different boundary conditions

Paper Authors

Jingrun Chen, Rui Du, Keke Wu

Paper Abstract

Recent years have witnessed growing interest in solving partial differential equations by deep neural networks, especially in the high-dimensional case. Unlike classical numerical methods, such as the finite difference method and the finite element method, the enforcement of boundary conditions in deep neural networks is highly nontrivial. One general strategy is to use the penalty method. In this work, we conduct a comparison study for elliptic problems with four different boundary conditions, i.e., Dirichlet, Neumann, Robin, and periodic boundary conditions, using two representative methods: the deep Galerkin method and the deep Ritz method. In the former, the PDE residual is minimized in the least-squares sense, while in the latter the corresponding variational problem is minimized. It is therefore reasonable to expect the deep Galerkin method to work better for smooth solutions and the deep Ritz method to work better for low-regularity solutions. However, through a number of examples, we observe that the deep Ritz method can outperform the deep Galerkin method, with a clear dependence on dimensionality, even for smooth solutions, and that the deep Galerkin method can also outperform the deep Ritz method for low-regularity solutions. Moreover, in some cases, when the boundary condition can be implemented in an exact manner, we find that such a strategy not only provides a better approximate solution but also facilitates the training process.
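
To make the difference between the two losses concrete, the following is a minimal PyTorch sketch for the Poisson model problem -Δu = f on the unit hypercube with homogeneous Dirichlet data, with the boundary condition enforced by a penalty term as described in the abstract. The network architecture, the penalty weight beta, the right-hand side f, and the names Net, laplacian, dgm_loss, drm_loss, and exact_bc_net are all illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    # Small fully connected network u_theta(x); depth and width are illustrative.
    def __init__(self, dim=2, width=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, 1),
        )

    def forward(self, x):
        return self.body(x)

def laplacian(u, x):
    # Sum of second derivatives of u with respect to each coordinate of x.
    grad = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    lap = torch.zeros_like(u)
    for i in range(x.shape[1]):
        lap = lap + torch.autograd.grad(
            grad[:, i], x, torch.ones_like(grad[:, i]), create_graph=True
        )[0][:, i:i + 1]
    return lap

# Hypothetical right-hand side of -Laplace(u) = f on (0,1)^d with u = 0 on the boundary.
f = lambda x: torch.ones(x.shape[0], 1)

def dgm_loss(net, x_in, x_bd, beta=500.0):
    # Deep Galerkin method: least-squares PDE residual plus boundary penalty.
    x_in.requires_grad_(True)
    u = net(x_in)
    residual = -laplacian(u, x_in) - f(x_in)
    return (residual ** 2).mean() + beta * (net(x_bd) ** 2).mean()

def drm_loss(net, x_in, x_bd, beta=500.0):
    # Deep Ritz method: Monte Carlo estimate of the variational energy,
    # the integral of (|grad u|^2 / 2 - f u), plus the same boundary penalty.
    x_in.requires_grad_(True)
    u = net(x_in)
    grad = torch.autograd.grad(u, x_in, torch.ones_like(u), create_graph=True)[0]
    energy = 0.5 * (grad ** 2).sum(dim=1, keepdim=True) - f(x_in) * u
    return energy.mean() + beta * (net(x_bd) ** 2).mean()

def exact_bc_net(net, x):
    # Exact Dirichlet enforcement on the unit hypercube: multiply the network
    # output by a function that vanishes on the boundary (illustrative choice).
    dist = (x * (1.0 - x)).prod(dim=1, keepdim=True)
    return dist * net(x)
```

Training would then minimize either loss over mini-batches of interior points and boundary points sampled from the domain. Wrapping the network in exact_bc_net illustrates the exact boundary enforcement mentioned in the abstract, in which case the penalty term can be dropped.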
