Wednesday, March 27, 2024

CQT Colloquium on strongly interacting photons in superconducting qubit arrays

Yesterday at CQT, Jonathan Simon from Stanford University gave a wonderful colloquium talk on "Many-body Ramsey Spectroscopy in the Bose Hubbard Model," covering experimental studies of strongly interacting quantum fluids of photons in arrays of superconducting qubits. The work spanned from the 2019 preparation of photonic Mott insulating states to ongoing studies of entangled many-body states of light.

A good colloquium talk should be understandable to a broad audience (ideally including undergraduates) while still going into enough depth to keep specialists in the topic interested. If you cannot frame your research in terms of some simplified model, chances are you do not yet fully understand it.

Simon did this using the neat example of emergence in 2D point clouds. Observing non-trivial emergent properties requires three key ingredients: many particles, interactions between the particles, and dissipation (in this case, friction) to allow the system to relax to some ordered state. When all three are included, the cloud self-organizes into a triangular lattice with properties qualitatively different from those of the individual constituent particles, such as support for low-energy vibrational modes (phonons).
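
This toy example is easy to reproduce numerically. Below is a minimal sketch (my own, not code from the talk) with all three ingredients: many particles, a Lennard-Jones pair interaction, and overdamped (friction-dominated) dynamics. Starting from a random cloud, the particles relax toward locally triangular packing.

```python
import numpy as np

rng = np.random.default_rng(0)
N, steps, dt = 100, 4000, 0.002
pos = rng.uniform(-4.0, 4.0, size=(N, 2))  # random initial point cloud

def lj_forces(pos):
    """Pairwise Lennard-Jones forces; potential minimum at r = 2**(1/6)."""
    diff = pos[:, None, :] - pos[None, :, :]      # (N, N, 2) displacements
    r2 = np.sum(diff**2, axis=-1)
    np.fill_diagonal(r2, 1.0)                     # dummy value, zeroed below
    inv6 = 1.0 / r2**3
    fmag = 24.0 * inv6 * (2.0 * inv6 - 1.0) / r2  # from U(r) = 4(r^-12 - r^-6)
    np.fill_diagonal(fmag, 0.0)                   # no self-interaction
    return np.sum(fmag[:, :, None] * diff, axis=1)

for _ in range(steps):
    # Overdamped dynamics: velocity proportional to force (friction dominates),
    # so the cloud flows downhill in energy instead of oscillating forever.
    f = np.clip(lj_forces(pos), -50.0, 50.0)      # clip for early-time stability
    pos += dt * f

# In a triangular packing, nearest-neighbour distances cluster near the
# potential minimum 2**(1/6) ~ 1.12.
d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
print("median nearest-neighbour distance:", np.median(d.min(axis=1)))
```

Drop any one ingredient and the picture breaks: a single particle cannot order, non-interacting particles never arrange themselves, and without friction the energy of the random initial state is never dissipated, so no ordered state is reached.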

Typically, a colloquium talk will cover research spanning several years, so it is important to have some clear common motivation. In this case, it was the question: how can quantum states of light be made to exhibit similar emergent properties? Three ingredients are required: give photons an effective mass, achieve strong photon-photon interactions, and introduce a suitable form of dissipation that allows the system to relax to some interesting equilibrium state while preserving non-trivial many-particle effects.
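
For concreteness (standard textbook notation; the talk's specific parameter values are not reproduced here), the Bose-Hubbard Hamiltonian that these arrays realize is

$$H = -J \sum_{\langle i,j \rangle} \left( a_i^\dagger a_j + \mathrm{h.c.} \right) + \frac{U}{2} \sum_i n_i (n_i - 1),$$

where the nearest-neighbour hopping $J$ gives photons an effective mass (the first ingredient), the on-site interaction $U$, inherited from the qubit anharmonicity, provides the photon-photon interactions (the second), and photon loss at rate $\kappa$ plays the role of dissipation (the third). Reaching the interesting strongly correlated regime requires both $U$ and $J$ to comfortably exceed $\kappa$.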

After this framing, the talk went deep into how these ingredients can be realized using arrays of superconducting qubits, and how the relevant dimensionless quantities (interaction strength vs hopping strength vs photon lifetime) compare to other platforms, such as cold atoms (handy, given the mix of expertise in the audience).

The talk finished with a vision for the future - to connect this "photonic quantum simulator" to a small-scale quantum processor to test NISQ-friendly algorithms, such as shadow tomography of many-body quantum states.

A recording will probably be uploaded to the CQT YouTube page later. In the meantime, related talks given at JQI and Munich are already available online!

Wednesday, March 20, 2024

ChatGPT, write my article introduction! And editors versus referees

This paper, with an introduction brazenly written by ChatGPT, attracted a lot of attention last week. How could the tell-tale first line of the introduction remain in the final published version without anyone (authors, editors, referees, proofing staff) noticing?

Some said this was no big deal - aren't paper introductions boilerplate junk that nobody reads anyway? Yes and no. While an expert in the field might not expect to learn anything new from reading a paper introduction, it is nevertheless important as a means for the authors to convince the reader that they sufficiently understand the context of the research and are in a position to make a novel and significant contribution.

Others argued this was an example of the failure of peer review and the current scientific publishing system - junk papers that no one (not even the authors!) read.

Who exactly is at fault here (apart from the authors, obviously) - the journal editors or the referees?

Actually, it is not the referees' job to proofread manuscripts! Many referees will not bother to laboriously point out every obvious typo and will focus purely on the scientific content in their reports. But sloppiness that the authors fail to notice themselves detracts from the credibility of the reported science, and it can be more damning than scathing technical criticism from the referees, which might in any case never be adequately addressed in the final paper!

The editors should have caught this in their initial screening. One of the roles of an editor is to curate content and ensure that the valuable time of the volunteer referees is not wasted on obviously incorrect, unconvincing, or "not even wrong" manuscripts. At the same time, we don't want to waste the authors' time by agreeing to send the manuscript out for review and then being unable to secure willing referees!

At Physical Review A we desk reject about half of the manuscripts we receive, without sending them out for peer review. While this might sound like a lot, these manuscripts tend to be of much lower quality than those that are eventually published. Several red flags make us lean towards desk rejection:

Out of journal scope. Does the manuscript report results that are of interest to the readers of the journal? One simple way to gauge this is to check the reference list of the finished manuscript - if you are only referring to works from other disciplines, this is not by itself grounds for rejection, but it is a hint that you need to be particularly careful with explaining the relevance of your work to the journal's specific audience.

Poor presentation. Obvious typos. Ugly figures. No figures (passable in rare cases). Too many figures. Illegible axis markers. Incorrectly formatted equations and symbols. Basic stuff, but many authors sadly cannot be bothered.

Transfer after rejection from a sister journal. This one is surprisingly common, particularly for research topics that fall within the scope of multiple APS journals. Most often we see transfers from PR Applied and PRB, which have higher impact factors, so the authors decide to try their luck with PRA. But the standards of all these journals are the same, regardless of impact factors that fluctuate from year to year. This means that rejection from PR Applied or PRB generally precludes publication in PRA, except in special cases.

No significant new physics. This is the most controversial criterion. Who is the editor to decide what is significant - isn't that the job of the referees? We do lean towards giving the benefit of the doubt and sending the manuscript out to referees for this one. The manuscripts that fail this test generally lack the "so what?" factor: assuming all the claims are correct, have we learned anything new? It is always possible to tweak models, change terms, make them a bit more complicated, and then apply analysis tools that are standard for the field to get something that is technically correct. But the impact of such technically correct works will be limited unless they open up something new - a novel experimental platform, a way to push the limits of existing theory, and so on.

It is never pleasant to have one of your articles rejected without review, but it is actually the second-best response you can receive! The likely alternative is to wait months before receiving a similar rejection, this time on the basis of anonymous referee reports!

Tuesday, March 12, 2024

Postdoc Opening at Tohoku University: Condensed Matter and AMO Theory

Tomoki Ozawa's group at the Advanced Institute for Materials Research, Tohoku University, has a postdoc opening (the official title is Specially Appointed Assistant Professor, or Tokunin-Jokyo in Japanese), which can start as soon as the decision is made, or from April 1, 2025 at the latest. The position lasts for three years.
The group works on theoretical condensed matter physics and AMO (atomic, molecular, and optical) physics, in particular on topological phases and/or many-body physics in these systems. Part of the salary for this position will come from the KAKENHI Kiban-B grant "Geometrical effects in non-Hermitian quantum systems."
The application deadline is April 30, 2024, and applications should be sent through Academic Jobs Online via the following link:

Friday, March 8, 2024

Vanishing Papers, Vanishing Journals

A highlight in Nature this week: "Millions of research papers at risk of disappearing from the Internet"

What happens when a publisher goes bust? Are their journal articles lost forever? 

The digital object identifier (DOI) system used by academic journals, among others, is supposed to be robust to this; the URL to which a DOI points can be updated when the original source is no longer available, provided another source exists. Dark archives such as LOCKSS were developed to preserve scholarly articles and keep them available after the original publisher is no longer around. 
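
To make the indirection concrete, here is a small sketch (my own illustration, not from the Nature piece) that checks where a DOI currently points by following the doi.org redirect chain. The example identifier, 10.1000/182, is the DOI of the DOI Handbook itself.

```python
import urllib.request

def resolve_doi(doi: str) -> str:
    """Follow the doi.org redirect chain and return the current landing URL."""
    req = urllib.request.Request(
        f"https://doi.org/{doi}",
        method="HEAD",  # we only need the redirect target, not the page body
        headers={"User-Agent": "doi-resolver-sketch/0.1"},
    )
    # Some publisher sites block non-browser clients, so expect the occasional
    # HTTPError in practice; this is only an illustration.
    with urllib.request.urlopen(req) as resp:
        return resp.geturl()  # final URL after all redirects

if __name__ == "__main__":
    print(resolve_doi("10.1000/182"))  # the DOI Handbook's own DOI
```

The point is that doi.org only provides a layer of indirection: when a publisher's server disappears, the DOI can be re-pointed to an archival copy, but only if such a copy was deposited somewhere in the first place.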

However, according to M. P. Eve, writing in the Journal of Librarianship and Scholarly Communication, a substantial fraction (27%) of journal articles linked to a DOI are not preserved in any centralised archive, leaving them at risk of being lost forever!

This is a particularly pressing problem for the growing number of for-profit open access journals, which make their money at the point of publication. Who will pay for the preservation of their articles? Under the subscription model, where the journal holds the article copyright, the back archive is an asset that remains valuable even after the journal has ceased publishing new articles. This is not the case for open access journals: their archives generate no revenue, so they remain valuable only as long as they maintain a steady stream of submissions and published articles.

Preservation of the scientific record is important. The American Physical Society maintains and sells access to their archive of publications dating all the way back to 1893. How many of today's open access journals will remain accessible a hundred years from now?