Paper Title

BEINIT: Avoiding Barren Plateaus in Variational Quantum Algorithms

Paper Authors

Ankit Kulshrestha, Ilya Safro

Paper Abstract


Barren plateaus are a notorious problem in the optimization of variational quantum algorithms and pose a critical obstacle in the quest for more efficient quantum machine learning algorithms. Many potential reasons for barren plateaus have been identified, but few solutions have been proposed to avoid them in practice. Existing solutions focus mainly on the initialization of unitary gate parameters without taking into account the changes induced by input data. In this paper, we propose an alternative strategy which initializes the parameters of a unitary gate by drawing from a beta distribution. The hyperparameters of the beta distribution are estimated from the data. To further prevent barren plateaus during training, we add a novel perturbation at every gradient descent step. Taking these ideas together, we empirically show that our proposed framework significantly reduces the possibility of a complex quantum neural network getting stuck in a barren plateau.
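The abstract describes two concrete ingredients: drawing initial gate parameters from a beta distribution whose hyperparameters are estimated from the data, and adding a perturbation at every gradient descent step. The following is a minimal NumPy sketch of how such a scheme might look; the method-of-moments estimator, the scaling of samples to the gate-angle range, and the Gaussian form of the perturbation are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def estimate_beta_hyperparams(data):
    """Method-of-moments estimate of Beta(alpha, beta) hyperparameters
    from data assumed to lie in (0, 1)."""
    m, v = data.mean(), data.var()
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

def beta_init(data, n_params, rng):
    """Draw initial gate parameters from the estimated beta distribution,
    scaled from (0, 1) to the rotation-angle range [0, 2*pi)."""
    alpha, beta = estimate_beta_hyperparams(data)
    return 2.0 * np.pi * rng.beta(alpha, beta, size=n_params)

def perturbed_gd_step(params, grad, rng, lr=0.01, sigma=1e-2):
    """One gradient-descent step with a small random perturbation added,
    intended to help the optimizer escape flat (barren) regions."""
    noise = rng.normal(scale=sigma, size=params.shape)
    return params - lr * grad + noise

# Example: initialize 4 gate parameters from synthetic data in (0, 1),
# then take one perturbed gradient step.
rng = np.random.default_rng(0)
data = rng.uniform(0.2, 0.8, size=1000)
params = beta_init(data, n_params=4, rng=rng)
params = perturbed_gd_step(params, grad=np.ones(4), rng=rng)
```

In practice, `grad` would come from a quantum gradient rule (e.g. parameter shift) evaluated on the circuit, and the perturbation scale `sigma` would be a tunable hyperparameter.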
