Paper Title
Multiscale Neural Operator: Learning Fast and Grid-independent PDE Solvers
Paper Authors
Paper Abstract
Numerical simulations in climate, chemistry, or astrophysics are computationally too expensive for uncertainty quantification or parameter exploration at high resolution. Reduced-order or surrogate models are multiple orders of magnitude faster, but traditional surrogates are inflexible or inaccurate, and purely machine learning (ML)-based surrogates are too data-hungry. We propose a hybrid, flexible surrogate model that exploits known physics for simulating the large-scale dynamics and limits learning to the hard-to-model term, called the parametrization or closure, which captures the effect of fine-scale onto large-scale dynamics. Leveraging neural operators, we are the first to learn grid-independent, non-local, and flexible parametrizations. Our \textit{multiscale neural operator} is motivated by a rich literature in multiscale modeling, has quasilinear runtime complexity, is more accurate or flexible than state-of-the-art parametrizations, and is demonstrated on the chaotic multiscale Lorenz96 equation.
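To make the hybrid idea concrete, the following is a minimal sketch (not the paper's code) of one time step on the large-scale Lorenz96 variables: the known physics supplies the resolved tendency, and a learned closure term stands in for the unresolved fine-scale coupling. The function names (`resolved_tendency`, `learned_closure`, `hybrid_step`), the forward-Euler integrator, and the zero-closure placeholder are assumptions for illustration; in the paper's setting the placeholder would be a trained neural operator.

```python
import numpy as np

def resolved_tendency(X, F=10.0):
    """Known large-scale Lorenz96 physics:
    dX_k/dt = -X_{k-1} (X_{k-2} - X_{k+1}) - X_k + F."""
    return -np.roll(X, 1) * (np.roll(X, 2) - np.roll(X, -1)) - X + F

def learned_closure(X):
    """Placeholder for the learned parametrization of fine-scale effects
    (e.g. a neural operator); a zero closure keeps the sketch self-contained."""
    return np.zeros_like(X)

def hybrid_step(X, dt=0.005):
    """One forward-Euler step of the hybrid surrogate:
    known large-scale physics plus the learned subgrid term."""
    return X + dt * (resolved_tendency(X) + learned_closure(X))

X = np.random.randn(36)   # arbitrary initial state of the large-scale variables
for _ in range(100):
    X = hybrid_step(X)
```

The design point of the sketch is only the split itself: everything the coarse model can resolve stays analytic, and learning is confined to the single closure term.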