Paper Title

Machine Learning Diffusion Monte Carlo Forces

Authors

Cancan Huang and Brenda M. Rubenstein

Abstract

Diffusion Monte Carlo (DMC) is one of the most accurate techniques available for calculating the electronic properties of molecules and materials, yet it often remains a challenge to economically compute forces using this technique. As a result, ab initio molecular dynamics simulations and geometry optimizations that employ Diffusion Monte Carlo forces are often out of reach. One potential approach for accelerating the computation of "DMC forces" is to machine learn these forces from DMC energy calculations. In this work, we employ Behler-Parrinello Neural Networks to learn DMC forces from DMC energy calculations for geometry optimization and molecular dynamics simulations of small molecules. We illustrate the unique challenges that stem from learning forces without explicit force data and from noisy energy data by making rigorous comparisons of potential energy surface, dynamics, and optimization predictions among ab initio Density Functional Theory (DFT) simulations and machine learning models trained on DFT energies with forces, DFT energies without forces, and DMC energies without forces. We show for three small molecules - C2, H2O, and CH3Cl - that machine-learned DMC dynamics can reproduce average bond lengths and angles within a few percent of known experimental results at one hundredth of the typical cost. Our work describes a much-needed means of performing dynamics simulations on high-accuracy DMC PESs and of generating DMC-quality molecular geometries given current algorithmic constraints.
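The core idea of the abstract - obtaining forces from a model trained only on (noisy) energies, by differentiating the fitted energy surface - can be sketched in miniature. This is an illustrative toy, not the paper's method: a quartic polynomial stands in for the Behler-Parrinello network, a 1-D harmonic bond stretch stands in for the molecular PES, and the equilibrium distance, force constant, and noise level are all made-up values.

```python
import numpy as np

# Toy stand-in for learning forces from energy-only data: fit a surrogate
# to noisy "DMC-like" energies of a 1-D bond stretch, then take the force
# as the negative gradient of the fitted surface. No force labels are used.
rng = np.random.default_rng(0)

r = np.linspace(0.8, 1.6, 41)                 # bond lengths (illustrative units)
E_true = 0.5 * 10.0 * (r - 1.2) ** 2          # assumed harmonic PES, r_eq = 1.2
E_noisy = E_true + rng.normal(0.0, 1e-3, r.size)  # stochastic QMC-style noise

# Energy-only surrogate (polynomial here; a neural network in the paper)
E_fit = np.poly1d(np.polyfit(r, E_noisy, deg=4))
F_fit = -E_fit.deriv()                        # force = -dE/dr from the fit alone

# The fitted force vanishes near the true equilibrium bond length
roots = F_fit.roots
r_eq = roots[np.argmin(np.abs(roots - 1.2))].real
print(r_eq)  # close to 1.2 despite the noisy training energies
```

The same differentiation trick carries over to the neural-network case via automatic differentiation; the noise in the training energies is what makes the learned gradient, and hence the dynamics, nontrivial to get right.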
