Paper Title
On the application of Physically-Guided Neural Networks with Internal Variables to Continuum Problems
Paper Authors
Paper Abstract
Predictive Physics has historically been based upon the development of mathematical models that describe the evolution of a system under certain external stimuli and constraints. The structure of such mathematical models relies on a set of physical hypotheses that are assumed to be fulfilled by the system within a certain range of environmental conditions. A new perspective is now arising that uses physical knowledge to inform the data prediction capability of artificial neural networks. A particular extension of this data-driven approach is Physically-Guided Neural Networks with Internal Variables (PGNNIV): universal physical laws are used as constraints in the neural network, in such a way that some neuron values can be interpreted as internal state variables of the system. This endows the network with unraveling capacity, as well as better predictive properties such as faster convergence, lower data requirements and additional noise filtering. Moreover, only observable data are used to train the network, and the internal state equations may be extracted as a result of the training process, so there is no need to make explicit the particular structure of the internal state model. We extend this new methodology to continuum physical problems, showing again its predictive and explanatory capacities when using only measurable values in the training set. We show that the mathematical operators developed for image analysis in deep learning approaches can be used and extended to consider standard functional operators in continuum Physics, thus establishing a common framework for both. The methodology presented demonstrates its ability to discover the internal constitutive state equation for some problems, including heterogeneous and nonlinear features, while maintaining its predictive ability for the whole dataset coverage, at the cost of a single evaluation.
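The core PGNNIV idea described in the abstract (universal physical laws imposed as constraints so that some hidden neurons behave as internal state variables, with image-style convolution stencils standing in for continuum differential operators) can be illustrated with a short, hypothetical PyTorch sketch. This is not the authors' implementation: the diffusion-type balance law, the network sizes, and the spatial_gradient / divergence helpers below are illustrative assumptions chosen only to make the two-branch structure and the physics penalty concrete.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def spatial_gradient(u, h=1.0):
    """Central-difference gradient of a scalar field u of shape (N, 1, H, W),
    built from fixed convolution stencils (image-analysis operators reused as
    continuum functional operators). Returns (N, 2, H, W) with (d/dx, d/dy)."""
    kx = torch.tensor([[[[-0.5, 0.0, 0.5]]]], dtype=u.dtype, device=u.device) / h
    ky = kx.transpose(2, 3)
    gx = F.conv2d(u, kx, padding=(0, 1))
    gy = F.conv2d(u, ky, padding=(1, 0))
    return torch.cat([gx, gy], dim=1)

def divergence(q, h=1.0):
    """Discrete divergence of a flux field q of shape (N, 2, H, W)."""
    kx = torch.tensor([[[[-0.5, 0.0, 0.5]]]], dtype=q.dtype, device=q.device) / h
    ky = kx.transpose(2, 3)
    return (F.conv2d(q[:, 0:1], kx, padding=(0, 1))
            + F.conv2d(q[:, 1:2], ky, padding=(1, 0)))

class PGNNIVSketch(nn.Module):
    """Hypothetical two-branch network: a predictive branch maps the measurable
    source field f to the measurable solution field u, while an internal branch
    maps grad(u) to a flux q whose values are interpreted as internal state
    variables (the hidden constitutive relation to be discovered)."""
    def __init__(self, width=16):
        super().__init__()
        self.predictive = nn.Sequential(
            nn.Conv2d(1, width, 3, padding=1), nn.Tanh(),
            nn.Conv2d(width, 1, 3, padding=1),
        )
        self.internal = nn.Sequential(
            nn.Conv2d(2, width, 3, padding=1), nn.Tanh(),
            nn.Conv2d(width, 2, 3, padding=1),
        )

    def forward(self, f):
        u = self.predictive(f)
        q = self.internal(spatial_gradient(u))
        return u, q

def training_step(model, optimizer, f_src, u_obs, penalty=1.0):
    """One step using only measurable fields (f_src, u_obs). The universal
    balance law div(q) + f = 0 is imposed as a penalty, so the internal branch
    is driven to act like the (unknown) state equation without ever observing q."""
    u_pred, q = model(f_src)
    data_loss = F.mse_loss(u_pred, u_obs)
    physics_loss = (divergence(q) + f_src).pow(2).mean()
    loss = data_loss + penalty * physics_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

After training, the internal branch can be evaluated on its own to read off the discovered relation between grad(u) and q, which is one way to realize the "unraveling" capacity the abstract refers to; a single forward pass then yields predictions over new inputs within the dataset coverage.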