Paper Title


Transformers for Modeling Physical Systems

Paper Authors

Nicholas Geneva, Nicholas Zabaras

Paper Abstract


Transformers are widely used in natural language processing due to their ability to model longer-term dependencies in text. Although these models achieve state-of-the-art performance on many language-related tasks, their applicability outside of the natural language processing field has been minimal. In this work, we propose the use of transformer models for the prediction of dynamical systems representative of physical phenomena. The use of Koopman-based embeddings provides a unique and powerful method for projecting any dynamical system into a vector representation, which can then be predicted by a transformer. The proposed model is able to accurately predict various dynamical systems and outperforms classical methods commonly used in the scientific machine learning literature.
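The core idea behind the Koopman-based embedding can be illustrated with a minimal sketch: a nonlinear system is lifted into a vector of observables in which the dynamics evolve linearly. The toy system, the coefficients `a` and `b`, and the observables below are a standard textbook Koopman example chosen for illustration, not the paper's learned embedding; in the paper a learned embedding plus a transformer takes the place of the fixed observables and linear operator `K`.

```python
import numpy as np

# Illustrative only: a classic discrete-time nonlinear system that admits an
# exact finite-dimensional Koopman embedding (textbook example, not from the paper):
#   x_{k+1} = a * x_k
#   y_{k+1} = b * y_k + (a**2 - b) * x_k**2
a, b = 0.9, 0.5

def step(state):
    """Advance the nonlinear system one time step."""
    x, y = state
    return np.array([a * x, b * y + (a**2 - b) * x**2])

def embed(state):
    """Koopman observables g(x, y) = [x, y, x^2]: the 'vector representation'."""
    x, y = state
    return np.array([x, y, x**2])

# In the embedded space the dynamics are exactly linear: z_{k+1} = K @ z_k.
# (In the paper, a transformer predicts the embedded sequence instead.)
K = np.array([[a, 0.0, 0.0],
              [0.0, b, a**2 - b],
              [0.0, 0.0, a**2]])

state = np.array([1.0, 2.0])
z = embed(state)
for _ in range(10):
    state = step(state)  # nonlinear rollout in the original state space
    z = K @ z            # linear rollout in the embedding space
print(np.allclose(embed(state), z))  # prints True: the two rollouts agree
```

For general systems no such exact finite embedding exists, which is why the paper learns the embedding from data and replaces the fixed linear operator with a transformer over the embedded sequence.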
