Paper Title

Beyond Barren Plateaus: Quantum Variational Algorithms Are Swamped With Traps

Authors

Eric R. Anschuetz, Bobak T. Kiani

Abstract

One of the most important properties of classical neural networks is how surprisingly trainable they are, though their training algorithms typically rely on optimizing complicated, nonconvex loss functions. Previous results have shown that unlike the case in classical neural networks, variational quantum models are often not trainable. The most studied phenomenon is the onset of barren plateaus in the training landscape of these quantum models, typically when the models are very deep. This focus on barren plateaus has made the phenomenon almost synonymous with the trainability of quantum models. Here, we show that barren plateaus are only a part of the story. We prove that a wide class of variational quantum models -- which are shallow, and exhibit no barren plateaus -- have only a superpolynomially small fraction of local minima within any constant energy from the global minimum, rendering these models untrainable if no good initial guess of the optimal parameters is known. We also study the trainability of variational quantum algorithms from a statistical query framework, and show that noisy optimization of a wide variety of quantum models is impossible with a sub-exponential number of queries. Finally, we numerically confirm our results on a variety of problem instances. Though we exclude a wide variety of quantum algorithms here, we give reason for optimism for certain classes of variational algorithms and discuss potential ways forward in showing the practical utility of such algorithms.
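To make the abstract's claim concrete, the following is a minimal, self-contained sketch (not from the paper) of the kind of numerical experiment it describes: sample many random initializations of a shallow variational quantum circuit, run local optimization from each, and count how often the optimizer lands within a fixed energy of the global minimum. The 2-qubit ansatz, the Z⊗Z Hamiltonian, and all hyperparameters are illustrative assumptions chosen so the example runs with NumPy alone; a system this small is far too simple to exhibit the superpolynomial scarcity of good local minima proven in the paper, it only illustrates the protocol.

```python
# Illustrative random-restart experiment on a toy shallow variational circuit.
# All circuit/Hamiltonian/hyperparameter choices below are assumptions for the sketch.
import numpy as np

rng = np.random.default_rng(0)

# --- Tiny 2-qubit simulator pieces ------------------------------------------
Z = np.diag([1.0, -1.0])
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# Hamiltonian: H = Z (x) Z, whose ground-state energy is -1.
H = np.kron(Z, Z)
E_GROUND = -1.0

def energy(params):
    """Energy of a shallow ansatz: RY on each qubit, CNOT, RY on each qubit."""
    state = np.zeros(4)
    state[0] = 1.0                                   # start in |00>
    state = np.kron(ry(params[0]), ry(params[1])) @ state
    state = CNOT @ state
    state = np.kron(ry(params[2]), ry(params[3])) @ state
    return float(state @ H @ state)

def grad(params, eps=1e-4):
    """Finite-difference gradient of the energy."""
    g = np.zeros_like(params)
    for i in range(len(params)):
        shift = np.zeros_like(params)
        shift[i] = eps
        g[i] = (energy(params + shift) - energy(params - shift)) / (2 * eps)
    return g

def local_minimize(params, lr=0.1, steps=500):
    """Plain gradient descent from a given starting point."""
    for _ in range(steps):
        params = params - lr * grad(params)
    return params

# --- Random-restart experiment -----------------------------------------------
# Fraction of random initializations whose local optimum ends up within a
# constant energy of the global minimum (the quantity the abstract reasons about).
n_restarts, tolerance = 200, 0.1
good = 0
for _ in range(n_restarts):
    theta0 = rng.uniform(0, 2 * np.pi, size=4)
    theta_final = local_minimize(theta0)
    if energy(theta_final) - E_GROUND < tolerance:
        good += 1

print(f"{good}/{n_restarts} restarts reached within {tolerance} of the ground energy")
```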
