Paper Title

Classical-to-quantum convolutional neural network transfer learning

Authors

Juhyeon Kim, Joonsuk Huh, and Daniel K. Park

Abstract


Machine learning using quantum convolutional neural networks (QCNNs) has demonstrated success in both quantum and classical data classification. In previous studies, QCNNs attained a higher classification accuracy than their classical counterparts under the same training conditions in the few-parameter regime. However, the general performance of large-scale quantum models is difficult to examine because of the limited size of quantum circuits, which can be reliably implemented in the near future. We propose transfer learning as an effective strategy for utilizing small QCNNs in the noisy intermediate-scale quantum era to the full extent. In the classical-to-quantum transfer learning framework, a QCNN can solve complex classification problems without requiring a large-scale quantum circuit by utilizing a pre-trained classical convolutional neural network (CNN). We perform numerical simulations of QCNN models with various sets of quantum convolution and pooling operations for MNIST data classification under transfer learning, in which a classical CNN is trained with Fashion-MNIST data. The results show that transfer learning from classical to quantum CNN performs considerably better than purely classical transfer learning models under similar training conditions.
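The classical-to-quantum pipeline described above can be illustrated with a toy simulation: features from a (here mocked) pre-trained classical CNN are angle-encoded into a small quantum register, passed through a trainable "convolution" layer of rotations and entangling gates, and "pooled" by reading out a single qubit. This is a minimal sketch in plain numpy, not the authors' implementation; the 4-qubit circuit, the random stand-in features, and the function names (`qcnn_head`, `apply_cnot`) are illustrative assumptions.

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def kron_all(ops):
    """Tensor product of a list of single-qubit operators (qubit 0 = leftmost)."""
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full

def apply_single(state, gate, qubit, n):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
    ops = [np.eye(2)] * n
    ops[qubit] = gate
    return kron_all(ops) @ state

def apply_cnot(state, control, target, n):
    """Apply CNOT via the projector decomposition |0><0| x I + |1><1| x X."""
    P0 = np.array([[1.0, 0.0], [0.0, 0.0]])
    P1 = np.array([[0.0, 0.0], [0.0, 1.0]])
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    ops0 = [np.eye(2)] * n; ops0[control] = P0
    ops1 = [np.eye(2)] * n; ops1[control] = P1; ops1[target] = X
    return (kron_all(ops0) + kron_all(ops1)) @ state

def qcnn_head(features, params):
    """Tiny QCNN-style head: angle encoding -> trainable layer -> Z readout."""
    n = len(features)
    state = np.zeros(2 ** n); state[0] = 1.0
    for q, f in enumerate(features):          # encode classical CNN features
        state = apply_single(state, ry(f), q, n)
    for q in range(n):                        # "convolution": trainable rotations
        state = apply_single(state, ry(params[q]), q, n)
    for q in range(n - 1):                    # entangle neighbouring qubits
        state = apply_cnot(state, q, q + 1, n)
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])   # "pooling": read out last qubit
    obs = [np.eye(2)] * n; obs[-1] = Z
    return float(state @ kron_all(obs) @ state)

rng = np.random.default_rng(0)
features = rng.uniform(0, np.pi, 4)  # stand-in for pre-trained CNN outputs
params = rng.uniform(0, np.pi, 4)    # trainable quantum parameters
score = qcnn_head(features, params)  # class score, guaranteed to lie in [-1, 1]
```

In the transfer-learning setting, the classical feature extractor stays frozen and only `params` would be optimized, which is what keeps the required quantum circuit small.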
