Paper Title
Foundations for Near-Term Quantum Natural Language Processing
Paper Authors
Paper Abstract
We provide conceptual and mathematical foundations for near-term quantum natural language processing (QNLP), and do so in terms accessible to quantum computer scientists. We opted for an expository presentation style, and provide references for supporting empirical evidence and formal statements concerning mathematical generality. We recall how the quantum model for natural language that we employ canonically combines linguistic meanings with rich linguistic structure, most notably grammar. In particular, the fact that it takes a quantum-like model to combine meaning and structure establishes QNLP as quantum-native, on par with the simulation of quantum systems. Moreover, the now-leading Noisy Intermediate-Scale Quantum (NISQ) paradigm for encoding classical data on quantum hardware, variational quantum circuits, makes NISQ exceptionally QNLP-friendly: linguistic structure can be encoded as a free lunch, in contrast to the apparently exponentially expensive classical encoding of grammar. Quantum speed-up for QNLP tasks has already been established in previous work with Will Zeng. Here we provide a broader range of tasks which all enjoy the same advantage. Diagrammatic reasoning is at the heart of QNLP. Firstly, the quantum model interprets language as quantum processes via the diagrammatic formalism of categorical quantum mechanics. Secondly, these diagrams are translated into quantum circuits via the ZX-calculus. Parameterisations of meanings then become the circuit variables to be learned. Our encoding of linguistic structure within quantum circuits also embodies a novel approach for establishing word meanings that goes beyond the current standards in mainstream AI, by placing linguistic structure at the heart of Wittgenstein's meaning-is-context.
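The pipeline the abstract describes (word meanings as parameterised quantum states, composed along grammatical wiring, with the parameters left as circuit variables to learn) can be illustrated with a minimal sketch. This is not the paper's implementation, just a toy illustration in plain NumPy under simplifying assumptions: each word is a single qubit prepared by an Rz·Rx ansatz (a hypothetical choice of ansatz), and the grammatical contraction is a single "cup", i.e. a projection onto a Bell state.

```python
import numpy as np

# Standard single-qubit rotation gates.
def rx(theta):
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

def rz(theta):
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

def word_state(params):
    """Encode a word meaning as a parameterised one-qubit state
    Rz(b) Rx(a) |0>; (a, b) are the circuit variables to be learned."""
    a, b = params
    ket0 = np.array([1, 0], dtype=complex)
    return rz(b) @ rx(a) @ ket0

def sentence_amplitude(params1, params2):
    """Contract two word states along a grammatical 'cup':
    project the pair onto the Bell state (|00> + |11>)/sqrt(2)."""
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
    return bell.conj() @ np.kron(word_state(params1), word_state(params2))

# Example: two words with (arbitrary, illustrative) parameter values.
amp = sentence_amplitude((0.3, 1.2), (0.7, -0.4))
prob = abs(amp) ** 2  # the quantity estimated by repeated measurement
```

In a variational (NISQ-style) workflow, `prob` would be estimated by sampling the hardware, and the word parameters updated by a classical optimiser against a task loss; only the parameters change, while the grammatical wiring of the circuit stays fixed.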