Paper Title

Avoiding local minima in variational quantum eigensolvers with the natural gradient optimizer

Authors

David Wierichs, Christian Gogolin, Michael Kastoryano

Abstract

We compare the BFGS optimizer, ADAM, and Natural Gradient Descent (NatGrad) in the context of Variational Quantum Eigensolvers (VQEs). We systematically analyze their performance on the QAOA ansatz for the Transverse Field Ising Model (TFIM) as well as on overparametrized circuits with the ability to break the symmetry of the Hamiltonian. The BFGS algorithm is frequently unable to find a global minimum for systems beyond about 20 spins, and ADAM easily gets trapped in local minima. On the other hand, NatGrad shows stable performance on all considered system sizes, albeit at a significantly higher cost per epoch. In sharp contrast to most classical gradient-based learning, the performance of all optimizers is found to decrease upon seemingly benign overparametrization of the ansatz class, with BFGS and ADAM failing more often and more severely than NatGrad. Additional tests for the Heisenberg XXZ model corroborate the accuracy problems of BFGS in high dimensions, but they reveal some shortcomings of NatGrad as well. Our results suggest that great care needs to be taken in the choice of gradient-based optimizers and the parametrization for VQEs.
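To make the natural-gradient update concrete, below is a minimal sketch (not the authors' code) of a NatGrad VQE loop for a transverse-field Ising chain, written with PennyLane and its default.qubit simulator. The chain length, circuit depth, step size, and regularization are illustrative assumptions, and each gate is given its own angle, a deliberately overparametrized variant of the QAOA ansatz rather than the paper's per-layer parameter sharing, to keep the metric-tensor bookkeeping simple.

```python
# Minimal sketch of natural-gradient descent for a TFIM VQE (illustrative only).
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 4, 2
dev = qml.device("default.qubit", wires=n_qubits)

# TFIM Hamiltonian with open boundaries: H = -sum_i Z_i Z_{i+1} - sum_i X_i
coeffs = [-1.0] * (n_qubits - 1) + [-1.0] * n_qubits
obs = [qml.PauliZ(i) @ qml.PauliZ(i + 1) for i in range(n_qubits - 1)]
obs += [qml.PauliX(i) for i in range(n_qubits)]
hamiltonian = qml.Hamiltonian(coeffs, obs)

n_params = n_layers * (2 * n_qubits - 1)  # one angle per gate (overparametrized)

@qml.qnode(dev)
def energy(params):
    # |+>^n reference state, then alternating ZZ-coupling and X-rotation layers
    for w in range(n_qubits):
        qml.Hadamard(wires=w)
    k = 0
    for _ in range(n_layers):
        for w in range(n_qubits - 1):
            qml.IsingZZ(params[k], wires=[w, w + 1])
            k += 1
        for w in range(n_qubits):
            qml.RX(params[k], wires=w)
            k += 1
    return qml.expval(hamiltonian)

params = np.random.uniform(0, np.pi, n_params, requires_grad=True)
eta, lam = 0.1, 1e-6  # step size and Tikhonov regularizer (illustrative choices)

for step in range(200):
    grad = qml.grad(energy)(params)
    # Fubini-Study metric tensor in the block-diagonal approximation
    metric = qml.metric_tensor(energy, approx="block-diag")(params)
    # Natural-gradient update: theta <- theta - eta * F^{-1} grad
    params = params - eta * np.linalg.solve(metric + lam * np.eye(n_params), grad)

print("final energy:", energy(params))
```

PennyLane also ships a packaged version of this update as qml.QNGOptimizer; the explicit solve is shown here to make the extra per-epoch cost visible: each step requires estimating the metric tensor on top of the gradient, which is the overhead the abstract attributes to NatGrad.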
