Monday, September 26, 2022

More on quantum topological data analysis

Two preprints on quantum topological data analysis (TDA) were posted to arXiv last week.

From the IBM team, Exponential advantage on noisy quantum computers. This is a follow-up to their proposal from last year, arXiv:2108.02811, implementing their linear-depth NISQ-TDA algorithm (my earlier synopsis) using a trapped-ion quantum processor.

The introduction nicely frames the key ingredients required for demonstrating an end-to-end quantum advantage using a near-term device, and then explains how NISQ-TDA meets all of the requirements, for example by having small input and output data sizes. Another advantage of NISQ-TDA is that the output observables, the Betti numbers, are quantized topological invariants, making them robust to shot noise and reducing the number of measurements required to obtain an accurate result.
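To make the output concrete: the k-th Betti number counts the independent k-dimensional holes of a simplicial complex, and for small examples it can be read off from the ranks of boundary matrices. Here is a minimal classical numpy sketch - just the definition, not the NISQ-TDA algorithm - for a hollow triangle, which has one connected component and one loop.

```python
import numpy as np

# Classical illustration of Betti numbers (the definition, not NISQ-TDA):
# a hollow triangle with vertices {0, 1, 2}, edges {01, 02, 12}, and no
# filled-in face. The boundary matrix d1 maps edges to signed vertex sums.
d1 = np.array([
    [-1, -1,  0],   # vertex 0: tail of edges 01 and 02
    [ 1,  0, -1],   # vertex 1: head of 01, tail of 12
    [ 0,  1,  1],   # vertex 2: head of 02 and 12
])
n_vertices, n_edges = d1.shape
rank_d1 = np.linalg.matrix_rank(d1)
rank_d2 = 0  # no 2-simplices, so the boundary map d2 is trivial

betti_0 = n_vertices - rank_d1            # number of connected components
betti_1 = (n_edges - rank_d1) - rank_d2   # number of independent loops
print(betti_0, betti_1)                   # -> 1 1: one component, one hole
```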

The article ends with the optimistic outlook that "a 96-qubit quantum computer with a two-qubit gate and measurement fidelity of ∼ 99.99% suffices to achieve quantum advantage on the Betti number estimation problem." This really seems to me to be the strongest candidate for demonstrating a quantum advantage. Anyone interested in near-term quantum algorithms for machine learning applications should read this paper!

------

A second preprint was posted by researchers from Deloitte, Understanding the Mapping of Encode Data Through An Implementation of Quantum Topological Analysis. This work offers a more pessimistic viewpoint, stating that "the empirical results show the noise within the data is intensified with each encoding method as there is a clear change in the geometric structure of the original data, exhibiting information loss." However, their approach appears to rely on the original quantum TDA algorithm, which uses Grover search and quantum phase estimation and is therefore not suited to near-term devices.
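I haven't reproduced their pipeline, but the underlying effect - embedding classical data into quantum states distorts pairwise geometry - is easy to see in a toy example. The sketch below uses simple angle encoding (an assumption on my part; the paper compares several encoding methods): the encoded distance works out to |sin(Δx/2)|, which compresses large separations between data points.

```python
import numpy as np

# Toy illustration (not the paper's pipeline): angle-encode 1-D data points
# x -> |psi(x)> = cos(x/2)|0> + sin(x/2)|1> and compare original Euclidean
# distances to distances between the encoded quantum states.
def encode(x):
    return np.array([np.cos(x / 2), np.sin(x / 2)])

data = [0.0, 0.5, 1.0, 3.0]  # hypothetical 1-D dataset

for i in range(len(data)):
    for j in range(i + 1, len(data)):
        d_orig = abs(data[i] - data[j])
        # for pure qubit states the trace distance is sqrt(1 - fidelity)
        fidelity = (encode(data[i]) @ encode(data[j])) ** 2
        d_enc = np.sqrt(1 - fidelity)  # equals |sin((x_i - x_j)/2)|
        print(f"|x{i} - x{j}| = {d_orig:.2f}  ->  encoded distance {d_enc:.2f}")
```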

Thursday, September 15, 2022

Fusion-based quantum computation

A brief summary of the great talks by Mercedes Gimeno-Segovia and Terry Rudolph at the 6th Quantum Africa Conference on PsiQuantum's approach for building a large-scale photonic fault-tolerant quantum computer using entangling measurements: fusion-based quantum computation. See also an earlier perspective by Rudolph published in APL Photonics.

Fusion-based quantum computation can be seen as a middle ground between the gate model (employed in most other platforms for quantum computing) and measurement-based quantum computation, which replaces nonlinear operations (hard to implement using quantum states of light) with a suitable sequence of single-qubit measurements.

The first step in measurement-based quantum computation is to prepare an entangled resource state large enough to perform the desired computation. This is hard.

Fusion-based quantum computation, on the other hand, employs two-qubit (entangling) measurements as its basic building block. This allows the calculation to be performed using a steady stream of constant-size resource states. These resource states also require multiple photons to be entangled, which is hard, but at least their size is bounded. Creating constant-size resource states with sufficient fidelity and speed is the challenge that PsiQuantum believes it can solve.
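To see how an entangling measurement knits small resource states into larger entanglement, here is a minimal numpy sketch of one fusion outcome: projecting one qubit from each of two Bell pairs onto a Bell state leaves the two surviving qubits maximally entangled. (This is entanglement swapping - a simplified stand-in for real fusion networks, which must also handle failure outcomes and photon loss.)

```python
import numpy as np

# Minimal sketch of the idea behind a fusion: a two-qubit Bell-basis
# measurement "fuses" two small entangled states into one larger entangled
# state. Here: entanglement swapping between two Bell pairs (qubits 0,1
# and qubits 2,3), fusing qubits 1 and 2.
bell = np.array([1.0, 0, 0, 1.0]) / np.sqrt(2)    # |Phi+> = (|00> + |11>)/sqrt(2)
state = np.kron(bell, bell).reshape(2, 2, 2, 2)   # indices: qubits 0, 1, 2, 3

# Project qubits 1 and 2 onto |Phi+> (one possible fusion outcome).
proj = bell.reshape(2, 2)
fused = np.einsum('abcd,bc->ad', state, proj.conj())
fused /= np.linalg.norm(fused)                    # renormalise after measurement

print(np.round(fused, 3))          # amplitudes of qubits 0,3: (|00> + |11>)/sqrt(2)
rho0 = fused @ fused.conj().T      # reduced state of qubit 0
print(np.trace(rho0 @ rho0).real)  # purity 0.5 -> qubits 0,3 maximally entangled
```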

What distinguishes PsiQuantum from its competitors is that their platform is easily scalable to billions of logical qubits and gates using the commercial silicon lithography employed to make all our conventional electronic computers. I think the only other platform that can make such a claim is the silicon qubits being pursued by a few groups, including the UNSW spin-off Silicon Quantum Computing. By contrast, the quantum processors currently available on the cloud (superconducting circuits and trapped ions) are (relatively) large and clunky devices requiring painstaking optimisation and babysitting, yet they might only support a few logical qubits even if quantum error correction can be implemented.

An interesting point raised in the talks was that error-corrected devices will be very different from any of the currently available quantum processors; once you have working error correction, many of the details of the physical qubits become unimportant. Thus, PsiQuantum's approach is to create billions of "good-enough" physical qubits rather than a smaller number of perfect qubits.

Friday, September 9, 2022

Quantum Africa Conference (QA6)

Quantum Africa is a conference covering quantum information and quantum computing running next week (Monday 12th to Friday 16th of September 2022).

It will be held in hybrid mode with a mix of in-person and online talks.

The talks should be quite pedagogical and will be given by outstanding researchers from academia and industry, including Steve Girvin (Yale, USA), Francesco Petruccione (UKZN, South Africa), Pedram Roushan (Google Quantum), and Terry Rudolph (PsiQuantum).

Talks will be recorded for later on-demand viewing for registered participants. Registration is free for online attendance!

Wednesday, September 7, 2022

Quantum computing debated in The Financial Times

Criticism of quantum computing hype and a rebuttal recently appeared in The Financial Times. The first article argues that even "well-established" applications of future quantum computers - breaking encryption and efficient quantum chemistry calculations - may not be useful in practice. The second article notes that even though there is tremendous hype, there is also slow but steady progress in scaling up quantum processors and understanding which quantum algorithms might provide value and which will not.

It is worth emphasizing that quantum technologies are much broader than quantum computing. For example, quantum research in Singapore also encompasses quantum communications and quantum sensing. While these areas are seen as being closer to useful commercial applications, there are still some important caveats:

Quantum communication technologies are often marketed as the solution to the problem of future quantum computers being able to break widely-used public-key cryptography schemes, with quantum key distribution (QKD) providing unbreakable encryption protected by the laws of physics. The reality is that the sharing of encryption keys is just one part of a secure communications network; a far bigger problem is authentication - how can you prove the other party is who they claim to be? Indeed, the vast majority of data breaches and online scams are not due to encryption protocols being broken or passwords being hacked, but are the result of phishing attacks in which the victim is tricked into believing the attacker is someone else. The UK's National Cyber Security Centre's position on quantum communication technologies is:

"Given the specialised hardware requirements of QKD over classical cryptographic key agreement mechanisms and the requirement for authentication in all use cases, the NCSC does not endorse the use of QKD for any government or military applications, and cautions against sole reliance on QKD for business-critical networks, especially in Critical National Infrastructure sectors.

"In addition, we advise that any other organisations considering the use of QKD as a key agreement mechanism ensure that robust quantum-safe cryptographic mechanisms for authentication are implemented alongside them."
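To make the gap concrete, here is a toy sketch of BB84 key sifting (idealised: perfect channel, no eavesdropper). The protocol lets two parties agree on a shared secret key, but notice that nothing in the exchange establishes who is at the other end - authentication has to be layered on top.

```python
import secrets

# Toy BB84 key sifting (idealised: perfect channel, no eavesdropper).
# Alice sends qubits prepared in random bases; Bob measures in random bases;
# they publicly compare BASES (never bits) and keep the matching rounds.
# Note: nothing below proves WHO is at the other end of the channel.
n = 32
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0 = Z basis, 1 = X basis
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# If the bases match, Bob recovers Alice's bit; otherwise his outcome is random.
bob_bits = [bit if ab == bb else secrets.randbelow(2)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: discard the rounds where the bases differed.
key_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
assert key_alice == key_bob
print(f"shared {len(key_alice)}-bit key - but with an unauthenticated channel,")
print("'Bob' could just as well be an attacker running the same protocol.")
```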

Quantum sensing promises the ability to perform measurements with precision unattainable using classical devices. This encompasses many well-established approaches based on quantum coherence, including SQUIDs, atomic clocks, atomic gravimeters, and squeezed-light interferometers, as well as more speculative ideas based on large-scale quantum entanglement. The latter, entanglement-based approaches have, however, attracted criticism (see for example this preprint).
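For a rough sense of the promised gain: estimating a phase with N independent probe photons gives an uncertainty scaling as the standard quantum limit 1/√N, while entangled probes can in principle approach the Heisenberg limit 1/N. A back-of-the-envelope comparison:

```python
import numpy as np

# Back-of-the-envelope scaling: phase-estimation uncertainty with N
# unentangled probes (standard quantum limit, i.e. shot noise) versus
# the best allowed by quantum mechanics with entangled probes (Heisenberg).
for N in [100, 10_000, 1_000_000]:
    sql = 1 / np.sqrt(N)   # standard quantum limit ~ 1/sqrt(N)
    heisenberg = 1 / N     # Heisenberg limit ~ 1/N
    print(f"N = {N:>9,}:  SQL ~ {sql:.1e}   Heisenberg ~ {heisenberg:.1e}")
```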

In all these examples - quantum computing, quantum communications, and quantum sensing - useful technologies will not emerge from quantum researchers working in isolation. Collaboration with researchers in other disciplines and with industry is essential to keep quantum "solutions" honest, to ensure that we are solving problems that need to be solved, and to establish that quantum techniques provide a better solution than well-established classical methods.