Thursday, September 15, 2022

Fusion-based quantum computation

A brief summary of the great talks by Mercedes Gimeno-Segovia and Terry Rudolph at the 6th Quantum Africa Conference on PsiQ's approach to building a large-scale, fault-tolerant photonic quantum computer using entangling measurements: fusion-based quantum computation. See also an earlier perspective by Rudolph published in APL Photonics.

Fusion-based quantum computation can be seen as a middle ground between the gate model (employed in most other platforms for quantum computing) and measurement-based quantum computation, which replaces nonlinear gate operations (hard to do with quantum states of light) with a suitable sequence of single-qubit measurements on an entangled resource state.
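
To make the measurement-based idea concrete, here is a toy two-qubit sketch in plain NumPy (my own illustration, not anything shown in the talks): the input qubit is entangled with a |+> ancilla by a CZ gate, and a single-qubit measurement of the input in the X basis then enacts a Hadamard on the ancilla; the other measurement outcome gives the same state up to a known Pauli correction.

```python
import numpy as np

# Toy illustration of measurement-based computation (my own sketch, not from the talks):
# the input qubit is entangled with an ancilla, and a single-qubit measurement on the
# input then enacts a Hadamard on the ancilla, instead of applying a unitary gate.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CZ = np.diag([1.0, 1.0, 1.0, -1.0])

psi = np.array([0.6, 0.8])                 # arbitrary input state a|0> + b|1>
plus = np.array([1.0, 1.0]) / np.sqrt(2)   # ancilla |+>

state = CZ @ np.kron(psi, plus)            # two-qubit entangled (cluster-like) state

# Measure qubit 0 in the X basis and keep the |+> outcome branch.
bra_plus = np.array([1.0, 1.0]) / np.sqrt(2)
remaining = bra_plus @ state.reshape(2, 2)   # contract the bra against qubit 0
remaining /= np.linalg.norm(remaining)       # renormalise the post-measurement state

print("ancilla after measurement:", np.round(remaining, 4))
print("H applied to the input   :", np.round(H @ psi, 4))   # identical
```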

The first step in measurement-based quantum computation is to prepare a large-scale entangled resource state (typically a cluster state) that is big enough to perform the desired computation. This is hard.
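
As a rough picture of what such a resource state is, here is a minimal NumPy sketch of a linear cluster state (my own toy construction, not PsiQ's hardware recipe): every qubit starts in |+> and a controlled-Z is applied between nearest neighbours. On paper this takes a few lines; with photons, each entangling step is exactly the hard part, and the state has to be large enough to host the whole computation.

```python
import numpy as np
from functools import reduce

def plus_states(n):
    """|+> on every one of n qubits, as a 2**n state vector."""
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    return reduce(np.kron, [plus] * n)

def cz(n, a, b):
    """Controlled-Z between qubits a and b (0-indexed) acting on n qubits."""
    U = np.eye(2 ** n)
    for idx in range(2 ** n):
        # flip the sign when both qubit a and qubit b are |1> in this basis state
        if (idx >> (n - 1 - a)) & 1 and (idx >> (n - 1 - b)) & 1:
            U[idx, idx] = -1.0
    return U

def linear_cluster_state(n):
    """Linear cluster state: |+>^n followed by CZ between nearest neighbours."""
    psi = plus_states(n)
    for i in range(n - 1):
        psi = cz(n, i, i + 1) @ psi
    return psi

psi = linear_cluster_state(4)
print(np.round(psi * 2 ** 2, 1))   # 16 equal-magnitude amplitudes; the sign pattern carries the entanglement
```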

Fusion-based quantum computation, on the other hand, employs two-qubit (entangling) measurements, called fusions, as its basic building block. This allows the calculation to be performed using a steady stream of constant-size resource states. These resource states also require multiple photons to be entangled, which is hard, but at least their size is bounded. Creating constant-size resource states with sufficient fidelity and speed is the challenge that PsiQ believes it can solve.
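
The effect of such a fusion can be illustrated with an idealised Bell measurement in plain NumPy (again my own toy example: real photonic fusions are probabilistic linear-optics circuits, and the actual resource states are larger than Bell pairs). Starting from two independent Bell pairs, measuring one qubit from each pair in the Bell basis leaves the two untouched qubits entangled with each other, which is how local fusions knit a stream of small resource states into the correlations of one large entangled state.

```python
import numpy as np

# Two independent Bell pairs |Phi+> on qubits (0,1) and (2,3).
phi_plus = np.array([1.0, 0, 0, 1.0]) / np.sqrt(2)
state = np.kron(phi_plus, phi_plus).reshape(2, 2, 2, 2)   # axes = (q0, q1, q2, q3)

# The four Bell states, used as the measurement basis for the "fusion" on qubits 1 and 2.
bell_basis = {
    "Phi+": np.array([1.0, 0, 0, 1.0]) / np.sqrt(2),
    "Phi-": np.array([1.0, 0, 0, -1.0]) / np.sqrt(2),
    "Psi+": np.array([0, 1.0, 1.0, 0]) / np.sqrt(2),
    "Psi-": np.array([0, 1.0, -1.0, 0]) / np.sqrt(2),
}

for name, bell in bell_basis.items():
    bra = bell.conj().reshape(2, 2)                   # bra acting on (q1, q2)
    outer = np.einsum("abcd,bc->ad", state, bra)      # unnormalised state of (q0, q3)
    prob = np.vdot(outer, outer).real                 # probability of this outcome
    outer = outer.flatten() / np.sqrt(prob)
    print(f"{name}: p = {prob:.2f}, qubits 0 and 3 -> {np.round(outer, 3)}")
# Every outcome leaves qubits 0 and 3 in a (known) Bell state: the fusion has
# entangled qubits that never interacted directly.
```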

What distinguishes PsiQ from their competitors is that their platform is easily scalable to billions of logical qubits and gates using the commercial silicon lithography employed to make all our conventional electronic computers. I think the only other platform that can make such a claim is the silicon-qubit approach being pursued by a few groups, including the UNSW spin-off Silicon Quantum Computing. By contrast, the quantum processors currently available on the cloud (superconducting circuits and trapped ions) are (relatively) large and clunky devices requiring painstaking optimisation and babysitting, and yet they might support only a few logical qubits even if quantum error correction can be implemented.

An interesting point raised in the talks was that error-corrected devices will be very different from any of the currently available quantum processors; once you have working error correction, many of the details of the physical qubits become unimportant. Thus, PsiQ's approach is to create billions of "good-enough" physical qubits rather than a smaller number of perfect qubits.
