Paper Title

Two quantum algorithms for communication between spacelike separated locations

Authors

Datta, Amitava

Abstract

The `no communication' theorem prohibits superluminal communication by showing that any measurement by Alice on an entangled system cannot change the reduced density matrix of Bob's state, and hence the expectation value of any measurement operator that Bob uses remains the same. We argue that the proof of the `no communication' theorem is incomplete, and that superluminal communication is possible through state discrimination in a higher-dimensional Hilbert space using ancilla qubits. We propose two quantum algorithms based on state discrimination for communication between two observers, Alice and Bob, situated at spacelike separated locations. Alice and Bob share one qubit each of a Bell state $\frac{1}{\sqrt 2}(\ket{00}+\ket{11})$. To send classical information, Alice measures her qubit and collapses the state of Bob's qubit in two different ways depending on whether she wants to send $0$ or $1$. Alice's first measurement is in the computational basis; the second is again in the computational basis, after applying the Hadamard transform to her qubit. Bob's first algorithm detects the classical bit with error probability $<\frac{1}{2^k}$, but Alice and Bob need to share $k$ Bell states to communicate a single classical bit. Bob's second algorithm is more complex, but Bob can detect the classical bit deterministically using four ancilla qubits. We also discuss possible applications of our algorithms.
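The shared-state setup the abstract describes can be sketched numerically. The following minimal numpy simulation (variable and function names are ours, not from the paper) prepares the Bell state $\frac{1}{\sqrt 2}(\ket{00}+\ket{11})$ and shows how Alice's two measurement choices collapse Bob's qubit: the computational basis leaves Bob with $\ket{0}$ or $\ket{1}$, while the Hadamard basis leaves him with $\ket{+}$ or $\ket{-}$. It does not implement the paper's detection algorithms; it only illustrates the collapse behavior the two algorithms would have to distinguish.

```python
import numpy as np

# Computational basis states and the Hadamard gate.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# Shared Bell state (|00> + |11>)/sqrt(2); qubit 0 is Alice's, qubit 1 is Bob's.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

def alice_measures(state, basis):
    """Project Alice's qubit onto each `basis` outcome; return that outcome's
    probability together with Bob's collapsed (normalized) qubit state."""
    outcomes = []
    for b in basis:
        # Projector |b><b| on Alice's qubit, identity on Bob's.
        P = np.kron(np.outer(b, b), np.eye(2))
        post = P @ state
        p = np.vdot(post, post).real
        if p > 1e-12:
            post = post / np.sqrt(p)
            # Amplitudes of Bob's |0>, |1> given Alice's outcome b.
            bob = b @ post.reshape(2, 2)
            outcomes.append((p, bob))
    return outcomes

# Bit 0: Alice measures in the computational basis -> Bob holds |0> or |1>.
comp = alice_measures(bell, [ket0, ket1])
# Bit 1: Alice measures in the Hadamard basis -> Bob holds |+> or |->.
had = alice_measures(bell, [H @ ket0, H @ ket1])

for label, outcomes in [("computational", comp), ("Hadamard", had)]:
    for p, bob in outcomes:
        print(f"{label}: p={p:.2f}, Bob's state={np.round(bob, 3)}")
```

Note that averaging over Alice's outcomes gives Bob the maximally mixed density matrix $I/2$ in both cases; that invariance is exactly what the standard no-communication argument relies on, and what the paper's state-discrimination algorithms claim to circumvent.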
