As someone who previously worked on NISQ processors and is interested in trying out the latest generation of quantum processors, I was excited to learn more about the new capabilities that are available:
- Real-time qubit measurement and reset, combined with conditional operations, open new opportunities for circuit design, such as using probabilistically generated magic states to reduce circuit depth (see the sketch after this list).
- Gate fidelities have improved by an order of magnitude!
- It seems like quantum error correction can actually work on real hardware!
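To make the first point concrete, here is a minimal sketch of a heralded, repeat-until-success style preparation, assuming Qiskit's dynamic-circuit interface (mid-circuit measure, reset, and the if_test context manager): a shallow non-deterministic attempt is heralded by a flag measurement, and a real-time conditional block retries on failure instead of paying for a deep deterministic circuit. The gate sequence is purely illustrative, not an actual magic-state protocol.

```python
from qiskit import QuantumCircuit, QuantumRegister, ClassicalRegister

# Hypothetical repeat-until-success sketch: a shallow preparation attempt is
# heralded by a flag measurement; on failure, a real-time conditional block
# resets and retries. Gates here are illustrative only.
data = QuantumRegister(1, "data")
flag = QuantumRegister(1, "flag")
herald = ClassicalRegister(1, "herald")
qc = QuantumCircuit(data, flag, herald)

def attempt(circ):
    """One shallow, heralded preparation attempt (illustrative gates only)."""
    circ.h(data[0])
    circ.t(data[0])
    circ.cx(data[0], flag[0])       # flag qubit witnesses a failed attempt
    circ.measure(flag[0], herald[0])

attempt(qc)

# Real-time branch: on a bad herald, reset both qubits and try again,
# rather than compiling a much deeper deterministic preparation circuit.
with qc.if_test((herald, 1)):
    qc.reset(data[0])
    qc.reset(flag[0])
    attempt(qc)
```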
At the same time, some of the big challenges we struggled with before remain open problems:
- Trapped-ion systems are slow. Current "error correction" capabilities are practically limited to error detection and post-selection - full error correction requires too large an overhead in circuit depth. The error correction code needs to be carefully tailored to the specific problem and hardware.
- Quantum chemistry use cases seem to have hit a wall in the complexity of implementing second-quantized Hamiltonians - circuit depths and the required number of measurements become intractable well before one can use all of the available physical qubits (a rough scaling illustration follows this list). Switching from variational algorithms to subspace methods only partially addresses this.
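To give a sense of the scaling behind the second point, the toy calculation below (plain Python, with rough illustrative counts rather than numbers from the talk) tallies the one-body (~N^2) and two-body (~N^4) fermionic terms of a second-quantized Hamiltonian as the number of spin orbitals N grows; each term ultimately maps to Pauli strings that must be measured.

```python
# Back-of-the-envelope illustration: the number of terms in a second-quantized
# molecular Hamiltonian grows roughly as N^4 in the number of spin orbitals N
# (before exploiting symmetries), so measurement counts blow up long before
# all available physical qubits can be used.
def num_hamiltonian_terms(n_spin_orbitals: int) -> int:
    """Rough count of one-body (N^2) plus two-body (N^4) fermionic terms."""
    n = n_spin_orbitals
    return n**2 + n**4

for n in (8, 20, 50, 100):
    print(f"N = {n:4d} spin orbitals -> ~{num_hamiltonian_terms(n):,} terms")
```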
And some other thorny issues came up during the discussion breaks:
- With superconducting quantum processors, or other platforms with fixed qubit positions, one often has the luxury of choosing the best set of qubits on the device and avoiding the bad ones. This isn't possible on the Quantinuum processors because of the qubit shuttling - you have to use whatever you're given and can't keep track of which ions are the best to use, which might make the performance less predictable from day to day. One suggested approach was to instead perform tomography on the different quantum logic regions of the device (where the gates are actually performed), to see if there is substantial variation in their fidelities.
- Because the gates are so slow, zero-noise extrapolation (a simple and effective error mitigation scheme) has limited noise data to work with. Methods to generate more data points (by probabilistically expanding only some of the two-qubit gates) need to execute many more circuits, adding a big overhead in compilation time (a sketch follows this list).
- Conditional circuit operations such as post-selection can substantially increase the number of shots required and the expenses incurred by the end-user.
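For the zero-noise extrapolation point, here is a minimal sketch of the idea using made-up placeholder expectation values: each noise-scale factor is realised by expanding some or all two-qubit gates G into G G† G, the noisy expectation value is measured at each scale, and a simple fit is extrapolated back to zero noise. Only NumPy is assumed; the numbers are not real measurements.

```python
import numpy as np

# Zero-noise extrapolation by gate expansion (unitary folding), sketched with
# placeholder data. Each scale factor corresponds to replacing two-qubit gates
# G with G G^dagger G to amplify the noise; slow gates limit how many such
# scaled circuits one can afford to run.
scale_factors = np.array([1.0, 2.0, 3.0])     # amplified noise levels
noisy_values = np.array([0.82, 0.71, 0.62])   # placeholder expectation values

# Simple linear fit extrapolated to zero noise; richer models are also used.
slope, intercept = np.polyfit(scale_factors, noisy_values, deg=1)
print(f"extrapolated zero-noise value ~ {intercept:.3f}")
```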
All in all, it was great to see the broad interest in the hardware and insightful technical questions from the audience. Looking forward to seeing what Singaporean researchers get up to with the new Helios processor once it is installed and operational in Singapore late next year!