Paper Title

Accelerating the training of single-layer binary neural networks using the HHL quantum algorithm

Authors

Sonia Lopez Alarcon, Cory Merkel, Martin Hoffnagle, Sabrina Ly, Alejandro Pozas-Kerstjens

Abstract

Binary neural networks are a promising technique for implementing efficient deep models with reduced storage and computational requirements. Training them is, however, still a compute-intensive problem that grows drastically with layer size and input data. At the core of this computation is a linear regression problem. The Harrow-Hassidim-Lloyd (HHL) quantum algorithm has gained relevance thanks to its promise of providing a quantum state containing the solution of a linear system of equations, encoded in superposition at the output of a quantum circuit. Although this seems to answer the linear regression problem underlying the training of neural networks, it also comes with multiple, difficult-to-avoid hurdles. This paper shows, however, that useful information can be extracted from the quantum-mechanical implementation of HHL and used to reduce the complexity of finding the solution on the classical side.
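To make the abstract's central claim concrete, the following is a minimal classical sketch (not the paper's code) of the linear regression problem it refers to: training the weights of a single-layer network reduces to solving a linear system \(A w = b\), which is exactly the kind of system whose solution HHL encodes in the amplitudes of its output state. The data shapes, variable names, and the use of NumPy's least-squares solver are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 8 samples, 4 binary input features, binary targets.
X = rng.integers(0, 2, size=(8, 4)).astype(float)
y = rng.integers(0, 2, size=8).astype(float)

# Normal-equations form of the regression: (X^T X) w = X^T y,
# i.e. a linear system A w = b of the kind HHL addresses.
A = X.T @ X
b = X.T @ y

# Classical solution of the linear system; HHL would instead prepare a
# quantum state whose amplitudes are proportional to this vector.
w, *_ = np.linalg.lstsq(A, b, rcond=None)

# A binary neural network would then binarize the learned weights.
w_bin = np.sign(w)
```

Because \(b = X^\top y\) always lies in the range of \(A = X^\top X\), the system is consistent and the least-squares solution satisfies it exactly; the point of the paper is that the quantum side can cheapen parts of this classical solve, not replace it wholesale.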
