Paper Title

Accelerating Atomistic Simulations with Piecewise Machine Learned Ab Initio Potentials at Classical Force Field-like Cost

Authors

Yaolong Zhang, Ce Hu, Bin Jiang

Abstract

Machine learning methods have become easy-to-use tools for constructing high-dimensional interatomic potentials with ab initio accuracy. Although machine learned interatomic potentials are generally orders of magnitude faster than first-principles calculations, they remain much slower than classical force fields, owing to their use of more complex structural descriptors. To bridge this efficiency gap, we propose an embedded atom neural network approach with simple descriptors based on piecewise switching functions, resulting in favorable linear scaling with the number of neighbor atoms. Numerical examples validate that this piecewise machine learning model can be over an order of magnitude faster than various popular machine learned potentials, with comparable accuracy for both metallic and covalent materials, approaching the speed of the fastest embedded atom method (i.e., several μs/atom per CPU core). The extreme efficiency of this approach makes it a promising tool for first-principles atomistic simulations of very large systems and/or over long timescales.
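The central technical point of the abstract is that descriptors built from simple, smoothly truncated switching functions can be accumulated neighbor by neighbor, so the per-atom feature cost scales linearly with the number of neighbors. The sketch below only illustrates that general idea and is not the paper's actual EANN descriptor: the smoothstep form of `switch`, the Gaussian radial basis in `density_descriptor`, and all parameter values are hypothetical choices for demonstration.

```python
import numpy as np

def switch(r, r_inner, r_cut):
    """Piecewise switching function (hypothetical smoothstep form):
    1 for r <= r_inner, a smooth ramp on [r_inner, r_cut], 0 beyond r_cut."""
    x = np.clip((r - r_inner) / (r_cut - r_inner), 0.0, 1.0)
    return 1.0 - x**2 * (3.0 - 2.0 * x)  # C^1-continuous at both ends

def density_descriptor(distances, centers, widths, r_cut):
    """Embedded-atom-like density features for one atom: a sum over
    neighbors of localized radial basis functions, each damped by the
    switching function. One pass over neighbors -> O(N_neighbors) cost."""
    d = np.asarray(distances)[:, None]             # shape (N_neighbors, 1)
    basis = np.exp(-widths * (d - centers) ** 2)   # hypothetical Gaussian basis
    return (basis * switch(d, 0.8 * r_cut, r_cut)).sum(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dists = rng.uniform(1.0, 6.0, size=40)     # hypothetical neighbor distances
    centers = np.linspace(1.0, 6.0, 8)         # hypothetical basis centers
    features = density_descriptor(dists, centers, widths=2.0, r_cut=6.0)
    print(features.shape)                      # (8,) per-atom feature vector
```

In a model of this kind, the resulting per-atom feature vector would then be fed to a neural network that outputs the atomic energy contribution; the switching function guarantees that every feature, and hence the predicted energy, goes smoothly to zero at the cutoff.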
