Monday, February 27, 2023

Academic working conditions

Comments on a Comment published a few weeks ago in Nature Human Behaviour: Quality research needs good working conditions.

There is "a lack of permanent positions with dedicated research funding, leading to an overreliance on project-based funding with short-term research positions." Unless one lands a tenured position, the longest term of reliable employment one is likely to encounter is during the PhD studies (3+ years). Postdocs depend on periodic renewal of short-term 1 or 2-year contracts and face the prospect of probably needing to move to a different city or country if they need to take up a new position. It's hard.

The relation between staff turnover and research quality is nuanced. Yes, in certain circumstances pressure can foster creativity and productivity, but the threat of non-renewal is not the only form of pressure that can be applied: internal project deadlines ("if we don't have a promising result by x, let's try a different line of research"), performance bonuses, conference submission deadlines. There are many possibilities.

On the flip side, there are many examples of stagnant research institutions dominated by faculty members nearing retirement who have no incentive to undertake quality new research. The trouble is that research isn't always successful, and it is hard for an outsider to distinguish hard but unfruitful work from slacking off.

Unfortunately, the terms of employment contracts are typically controlled by upper university management, so even those tenured faculty who would prefer a more stable team composition have little power to effect change.

 

Thursday, February 23, 2023

Quantum error correction - now published

The paper "Suppressing quantum errors by scaling a surface code logical qubit" by the Google Quantum AI team was published yesterday in Nature. I previously wrote about this work when the preprint was posted to arXiv last year.

The peer review file accompanying the paper is an interesting read. The authors discuss the challenges involved in making superconducting quantum computers robust to the catastrophic noise induced by cosmic rays - radiation shielding alone will not be sufficient:

"Given the current error scale of an impact event is essentially unsurvivable, the event rate needs to be at least on the order of the computation time. Reasonable estimates for fault tolerant computations are generally measured in hours (c.f. [5] which proposes 8 hours for Shor’s algorithm), so current event rates should need to improve by around 3000x from current event rates. Additionally, if chip areas grow approximately proportional to number of qubits, given that the event rate is directly proportional to chip area, the event rate will also increase by around 100,000x from this effect, which will need to be overcome as well. The 10x reduction offered by the use of lead shielding and a low-radiation laboratory is nowhere near sufficient to solve this problem at scale."

Solving this problem will require new superconducting qubit and quantum processor designs that are more robust to cosmic rays, e.g. by localizing their disruptive effects to a small fraction of the qubits so that quantum error correction can still be applied.
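To put the quoted factors in perspective, here is a quick back-of-envelope calculation in Python (my own arithmetic, working backwards from the figures quoted above rather than from the paper itself):

```python
# Working backwards from the numbers quoted in the peer review file:
# if a fault-tolerant computation takes ~8 hours and event rates must
# improve ~3000x to match it, today's chips see roughly one
# cosmic-ray impact every ~10 seconds.
target_runtime_s = 8 * 3600        # ~8 h estimate for Shor's algorithm
rate_improvement = 3000            # quoted required improvement factor
implied_event_interval_s = target_runtime_s / rate_improvement
print(f"Implied current event interval: ~{implied_event_interval_s:.0f} s")

# The quoted ~100,000x chip-area penalty stacks on top of this, so the
# total required rate improvement is on the order of 3000 * 100,000.
total_improvement = rate_improvement * 100_000
print(f"Total required rate improvement: ~{total_improvement:.0e}x")
```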

The authors also write in their reply to the referees that the performance fine-tuning required for the distance-5 code to match the performance of the distance-3 code took place over six weeks.

As I wrote before - this is an impressive achievement, but in the race to build a working, fault-tolerant quantum computer it should be seen as the end of the beginning, not the beginning of the end!

Monday, February 13, 2023

Snippets from the topological data analysis workshop

I had the pleasure of visiting KIAS last week to attend their Workshop on Topological Data Analysis: Mathematics, Physics, and Beyond.

The programme, featuring so many talks focused on pure mathematics, was daunting at first, but ultimately I learnt a lot more than I would have by presenting the same work at a physics conference. It was a nice reminder: when we become experts in a highly specialised topic during our PhD studies (and beyond), it is easy to lose sight of the bigger picture. Changing fields, or even just attending a wide range of seminars outside our own expertise, sparks new ideas.

Some useful tidbits from the talks:

  • The Vietoris-Rips complex is not stable with respect to outliers. For example, adding a few points to the middle of a circular point cloud completely changes the form of the 1D persistence diagram, replacing the single high-persistence cycle with several low-persistence cycles (see the sketch after this list). Stability theorems for persistence diagrams refer to small perturbations of a fixed set of points.
  • Similar persistence diagrams do not imply similar data; in fact, datasets with an arbitrarily large Hausdorff distance between them can have identical persistence diagrams.
  • Mathematicians like TDA because it involves interesting problems with elegant solutions. But not all these elegant methods end up being useful in practice.
  • The chemical and biological sciences benefit from having large datasets publicly available for benchmarking and standardised scoring of new machine learning / data analysis methods, including TDA. This makes demonstrations of new heuristic methods much more convincing.
  • TDA can be used to estimate geometric features of data, including their (fractal) dimension, which is useful for understanding the performance of neural networks.
  • The generators of 0-dimensional persistence diagrams give the minimum spanning tree of the data. This widely used piece of "folklore" was only rigorously proved in 2020!
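Here is a minimal sketch of the outlier instability from the first bullet, using the ripser package as a stand-in (the speakers' own tooling wasn't specified). A few points scattered inside a circular point cloud destroy its single high-persistence 1-cycle:

```python
import numpy as np
from ripser import ripser  # pip install ripser

rng = np.random.default_rng(0)

# Noisy circle of radius 1: expect a single long bar in the H1 diagram.
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
circle += 0.02 * rng.normal(size=circle.shape)

def max_h1_persistence(points):
    """Lifetime (death - birth) of the most persistent 1-cycle
    in the Vietoris-Rips filtration."""
    h1 = ripser(points, maxdim=1)["dgms"][1]
    return float((h1[:, 1] - h1[:, 0]).max())

print("circle alone:      ", max_h1_persistence(circle))

# Scatter a handful of outliers inside the disc; the single long bar
# is replaced by several short-lived cycles.
outliers = rng.uniform(-0.6, 0.6, size=(8, 2))
print("circle + outliers: ", max_h1_persistence(np.vstack([circle, outliers])))
```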

Friday, February 3, 2023

Entanglement-enhanced quantum sensing: fact or fiction?

Quantum sensing, the use of quantum systems to perform precision measurements, attracts enormous interest as a potential application of the engineered quantum systems now under development. According to this review article, there are three classes of quantum sensing:

1. Sensing based on systems with quantized energy levels such as superconducting qubits, including SQUID magnetometers (already commercialized).

2. Sensing based on quantum coherence or wave-like properties, including noise suppression using squeezing (e.g. in gravitational wave detectors) and macroscopic quantum states of atoms for precision gravimetry and inertial navigation (under development / being commercialized).

3. Entanglement-enhanced sensing to achieve precision beyond what is attainable classically (research in progress). 

It is believed that only entanglement-enhanced sensing makes use of the full power of quantum mechanics, i.e. many-body entangled states that are intractable for classical computers.
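To make "beyond what is attainable classically" concrete, the standard textbook scaling (a well-known result, not specific to the review) for estimating a phase φ with N probes is:

```latex
\Delta\varphi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}}
\quad \text{(standard quantum limit, uncorrelated probes)}
\qquad \longrightarrow \qquad
\Delta\varphi_{\mathrm{HL}} \sim \frac{1}{N}
\quad \text{(Heisenberg limit, entangled probes)}
```

For large N this quadratic improvement in precision is what makes entanglement-enhanced schemes so attractive.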

The growing availability of large controlled quantum systems has led to huge interest in entanglement-enhanced sensing schemes, with many publications in high impact journals, but not everyone is convinced.

Critiques of recent high-profile experiments on entanglement-enhanced sensing published in Nature and Nature Physics have been posted to arXiv: arXiv:2208.14816, arXiv:2301.04396, and (today) arXiv:2302.00733. The first is a particularly interesting read, since it has been updated to include correspondence with the paper authors and Nature Physics editors, who declined to publish it.

I do not work in this field, and I do not have the expertise to judge whether the criticism is valid. But the comments appear to come from a knowledgeable expert, are written in a scientific style, and are of a reviewable standard. Moreover, the claims in the critiqued articles (unprecedented sensitivity in measuring some quantity) are quantitative, and can thus be unambiguously confirmed or refuted. So it should be concerning that, while one of the articles claiming entanglement-enhanced sensitivity has already been cited 30 times according to Google Scholar, the criticism seems to be ignored: not cited, not responded to, not even upvoted on SciRate.

Several years ago there was a similar controversy in photonics, with many researchers racing to be the first to claim lasing in a variety of exotic materials. In response, Nature Photonics introduced a "laser checklist" to ensure that all submissions claiming lasing provide a standardized set of measurements and experimental details that can be scrutinized and easily compared across platforms and research groups. Perhaps something similar could be done for entanglement-enhanced sensing papers?