
Wednesday, December 6, 2023

Dark horse papers

A journal's impact factor - the number of citations received in a year by the articles it published in the two preceding years, divided by the number of those articles - is often used as a proxy for the importance of the papers it publishes. But the impact factor is a poor predictor of the impact of individual articles: citation distributions are heavy-tailed, so the mean is strongly skewed by rare outliers that receive many more citations than a typical article.
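In symbols, the standard two-year definition is

$$\mathrm{IF}_y = \frac{C_y}{P_{y-1} + P_{y-2}},$$

where C_y is the number of citations received in year y by items the journal published in the two preceding years, and P_{y-1}, P_{y-2} are the numbers of citable items published in those years.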

It is difficult to estimate the impact a paper will have before it is published. Thus, one can find papers published in top journals that after several years have attracted only a handful of citations - the editors and referees overestimated the impact the article would have. Conversely, there are papers that were only published in a specialized (e.g. local society-run) journal and ended up having a big impact. Some examples:

S. Aubry and G. Andre, Analyticity breaking and Anderson localization in incommensurate lattices, Ann. Israel Phys. Soc. 3, 133 (1980). A conference proceedings article with more than a thousand citations and even its own Wikipedia page, influential as a simple, analytically solvable toy model of a localization transition.
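As a quick illustration (a sketch of my own, not from the paper): the model is a 1D tight-binding chain with hopping J and a quasiperiodic on-site potential of strength lambda, and for irrational beta all eigenstates localize when lambda > 2J. The transition shows up as a jump in the inverse participation ratio:

```python
# Minimal sketch of the Aubry-Andre model: all eigenstates localize
# for lambda > 2J. The inverse participation ratio (IPR) is ~1/L for
# extended states and O(1) for localized ones.
import numpy as np

def mean_ipr(L=500, lam=1.0, J=1.0, beta=(np.sqrt(5) - 1) / 2):
    n = np.arange(L)
    H = np.diag(lam * np.cos(2 * np.pi * beta * n))  # quasiperiodic potential
    H += J * (np.diag(np.ones(L - 1), 1) + np.diag(np.ones(L - 1), -1))  # hopping
    _, vecs = np.linalg.eigh(H)
    return np.sum(np.abs(vecs) ** 4, axis=0).mean()  # IPR of each eigenstate

for lam in [1.0, 2.0, 3.0]:  # below, at, and above the transition
    print(f"lambda/J = {lam:.1f}: mean IPR = {mean_ipr(lam=lam):.4f}")
```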

T. Fukui, Y. Hatsugai, and H. Suzuki, Chern Numbers in Discretized Brillouin Zone: Efficient Method of Computing (Spin) Hall Conductances, J. Phys. Soc. Japan 75, 074716 (2006). This is essential reading for anyone who wants to numerically compute Berry curvatures, since it solves the problem of fixing a smooth gauge for the k-space derivatives of the Bloch functions. It also has more than a thousand citations.
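To make this concrete, here is a minimal numpy sketch of the method (the two-band Qi-Wu-Zhang model used as a test case is my choice, not from the paper). Because the plaquette product of link variables is gauge invariant, the arbitrary phases returned by the diagonalization at each k-point drop out, and the result is an integer even on a coarse k-mesh:

```python
# Fukui-Hatsugai-Suzuki lattice Chern number, tested on the Qi-Wu-Zhang model.
import numpy as np

SX = np.array([[0, 1], [1, 0]], complex)
SY = np.array([[0, -1j], [1j, 0]])
SZ = np.array([[1, 0], [0, -1]], complex)

def h_qwz(kx, ky, u):
    """Two-band Chern insulator: d(k) . sigma."""
    return np.sin(kx) * SX + np.sin(ky) * SY + (u + np.cos(kx) + np.cos(ky)) * SZ

def chern_number(u, N=20, band=0):
    ks = 2 * np.pi * np.arange(N) / N
    # Bloch eigenvector of the chosen band on an N x N k-mesh
    u_nk = np.empty((N, N, 2), complex)
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            u_nk[i, j] = np.linalg.eigh(h_qwz(kx, ky, u))[1][:, band]
    c = 0.0
    for i in range(N):
        for j in range(N):
            ip, jp = (i + 1) % N, (j + 1) % N
            # Gauge-invariant product of the four link variables around a
            # plaquette; its phase (principal branch) is the field strength.
            plaq = (np.vdot(u_nk[i, j], u_nk[ip, j])
                    * np.vdot(u_nk[ip, j], u_nk[ip, jp])
                    * np.vdot(u_nk[ip, jp], u_nk[i, jp])
                    * np.vdot(u_nk[i, jp], u_nk[i, j]))
            c += np.angle(plaq)
    return c / (2 * np.pi)

print(f"C(u=1) = {chern_number(1.0):+.2f}")  # |C| = 1 in the topological phase
print(f"C(u=3) = {chern_number(3.0):+.2f}")  # C = 0 in the trivial phase
```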

M. Fujita, K. Wakabayashi, K. Nakada, and K. Kusakabe, Peculiar Localized State at Zigzag Graphite Edge, J. Phys. Soc. Jpn. 65, 1920 (1996). This paper was ahead of its time, showing that certain edge configurations of graphene host strongly localized edge states. The result remained more of a theoretical curiosity until samples of graphene were isolated in 2004, as you can see in the time series of citing articles:
[Figure: time series of articles citing Fujita et al. (1996), taking off after the isolation of graphene.]
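A quick way to see these states numerically (my own sketch, not from the paper): at fixed Bloch momentum k along the edge, a zigzag ribbon maps onto an SSH-like chain with alternating hoppings 2t cos(k/2) and t, so near-zero-energy edge modes appear for 2pi/3 < k < 4pi/3:

```python
# Zigzag graphene ribbon at fixed edge momentum k: an SSH-like chain
# with alternating hoppings 2t*cos(k/2) (intra-row) and t (inter-row).
import numpy as np

def ribbon_energies(k, n_rows=20, t=1.0):
    hops = np.empty(2 * n_rows - 1)
    hops[0::2] = 2 * t * np.cos(k / 2)  # intra-row bonds
    hops[1::2] = t                      # inter-row bonds
    H = np.diag(hops, 1) + np.diag(hops, -1)
    return np.linalg.eigvalsh(H)

for k in [0.0, 2 * np.pi / 3, np.pi]:
    print(f"k = {k:.2f}: smallest |E| = {np.abs(ribbon_energies(k)).min():.4f}")
# Edge states pin the smallest |E| to ~0 between the projected Dirac points.
```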

These are just a few examples I've come across in my own research. There are many more out there! Do you have your own favourite example?

Wednesday, July 26, 2023

The test of time: photonics

As we all (should) know, the journal impact factor is a terrible measure of the quality of an individual article. More important than where an article is published is whether it has long-lasting impact, and the only way to determine this for sure is to wait!

I used Web of Science to look at the most highly cited original research articles in photonics from ten years ago. Here are the top ten:

1. Photonic Floquet topological insulators (2142 citations)

This was the first work to experimentally demonstrate topological edge states in two-dimensional waveguide arrays. It was this paper (and the related works below) that really popularized the now-booming field of topological photonics. While I'm not surprised to see it among the top articles from 2013, I wasn't expecting it to be number one!

2. Terahertz Metamaterials for Linear Polarization Conversion and Anomalous Refraction (1443 citations)

This work falls within two highly active fields: terahertz photonics and metasurfaces. Like other early works on metasurfaces (a few more appear below), the concepts were demonstrated using metallic structures. Today's ongoing commercialization was enabled by the development of low-loss all-dielectric metasurfaces over the following decade.

3. Photonic topological insulators (1306 citations)

This paper showed theoretically that photonic systems could be used to emulate quantum spin Hall topological phases, using metamaterials with a judiciously-engineered magneto-electric coupling to mimic a fermionic time-reversal symmetry.

4. Imaging topological edge states in silicon photonics (1113 citations)

Experimental demonstration of a two-dimensional Chern insulator topological phase using ring resonator lattices. This platform is now used extensively for topological laser experiments and for exploring other exotic topological tight-binding models, including higher-order topological phases.

5. Metasurface holograms for visible light (1094 citations)

6. Three-dimensional optical holography using a plasmonic metasurface (976 citations)

These articles, published back-to-back in Nature Communications, use the metasurface concept to create holograms from metallic films of subwavelength thickness.

7. Wireless sub-THz communication system with high data rate (970 citations)

This work sagely foresaw that viral TikTok memes would drive demand for higher-bandwidth mobile data. Higher bandwidth requires higher carrier frequencies, so here the authors experimentally demonstrate wireless data transmission at frequencies approaching the THz band.

8. Photonic spin Hall effect at metasurfaces (917 citations)

Polarization-controlled beam deflection using a metallic metasurface.

9. Highly efficient gate-tunable photocurrent generation in vertical heterostructures of layered materials (912 citations)

10. Chip-integrated ultrafast graphene photodetector with high responsivity (889 citations)

The top ten is rounded out by two papers demonstrating that 2D materials including graphene can form the basis for highly efficient photodetectors.

----

tl;dr: The most highly cited photonics papers of 2013 were on topological photonics, metasurfaces, terahertz technology, and graphene.

Friday, May 6, 2022

What's in a name?

Giving your model or result a catchy name greatly increases the impact of your research. 

Compare the citations of "Two-dimensional massless electrons in an inverted contact" with those of "Quantum spin Hall effect".

The title you give your paper is important. Don't rush it.

On a related note, those working on soliton theory will be familiar with the nonlinear wave equation

$$\partial_t^2 \phi - \partial_x^2 \phi  + m^2 \sin \phi = 0,$$

which is called the sine-Gordon equation "for obvious reasons" (the name puns on the Klein-Gordon equation) in Rubinstein's original analysis of its soliton solutions. A footnote in this paper credits this brilliant name to Professor Martin Kruskal, who has an impressive list of scientific achievements spanning nonlinear waves, surreal numbers, and wormholes.
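For reference (textbook material rather than anything specific to Rubinstein's paper), the single-kink soliton solution is

$$\phi(x,t) = 4\arctan\left[\exp\left(\frac{m(x - vt)}{\sqrt{1 - v^2}}\right)\right],$$

a localized twist of the field by $2\pi$ that propagates without changing shape (in units with c = 1).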

Thursday, April 28, 2022

Pessimal quantum algorithms

I was reading about sorting algorithms the other day and stumbled upon the amusing topic of pessimal algorithms, which are horribly slow algorithms that "[do] indeed progress steadily towards [their] stated goal even though [they] may have very little enthusiasm for (or even a manifest aversion to) actually getting there."

Two examples of pessimal sorting algorithms are slowsort and bogosort, which have non-polynomial scaling (factorial scaling in the case of bogosort). Slowsort uses a multiply-and-surrender approach, while bogosort chooses random permutations until it obtains the correctly-sorted list.
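For reference, bogosort fits in a few lines of Python:

```python
# Bogosort: shuffle until sorted. The expected number of shuffles is
# ~n! for a list of n distinct elements.
import random

def bogosort(xs):
    while any(a > b for a, b in zip(xs, xs[1:])):
        random.shuffle(xs)
    return xs

print(bogosort([3, 1, 4, 1, 5]))
```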

A simple quantum version of bogosort might be a circuit composed of a single Hadamard gate applied to each qubit, followed by measurement; this similarly samples uniformly over the solution space, repeating until the correct answer is measured.
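Here is a classical simulation of that circuit (assuming the "correct answer" is just some target bitstring): Hadamards on every qubit prepare a uniform superposition, so each shot samples a bitstring uniformly at random:

```python
# Quantum bogosearch (simulated): Hadamard every qubit, measure, and
# repeat until the target bitstring appears. Expected shots: 2^n.
import numpy as np

rng = np.random.default_rng(42)

def quantum_bogosearch(n_qubits, target):
    shots = 0
    while True:
        shots += 1
        outcome = rng.integers(0, 2 ** n_qubits)  # uniform measurement outcome
        if outcome == target:
            return shots

print(quantum_bogosearch(8, target=0))  # ~256 shots on average
```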

More generally, pessimal quantum algorithms might exploit maliciously-tuned quantum interference to obtain scaling worse than the most pessimal classical algorithm, for example by engineering destructive interference to suppress the probability of measuring the correct solution.
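As a toy example of what such malicious interference could look like (my own construction, not a known algorithm): sandwiching a phase oracle between two layers of Hadamards suppresses the target's measurement probability from 1/N to 4/N^2 for any nonzero target bitstring, making the circuit strictly worse than random guessing:

```python
# Destructive interference against the correct answer: the circuit
# H^n -> phase oracle -> H^n reduces P(target) from 1/N to 4/N^2
# (for any target other than the all-zeros string).
import numpy as np

n, target = 6, 37
N = 2 ** n

psi = np.full(N, 1 / np.sqrt(N))  # state after the first Hadamard layer
psi[target] *= -1                 # phase oracle flips the target's sign

H1 = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
Hn = H1
for _ in range(n - 1):            # n-qubit Hadamard via Kronecker products
    Hn = np.kron(Hn, H1)
psi = Hn @ psi                    # second layer interferes the amplitudes

print(f"random guessing:    P = {1 / N:.5f}")
print(f"after interference: P = {abs(psi[target]) ** 2:.5f}")  # = 4/N^2
```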

Are there any known examples of pessimal quantum algorithms exhibiting a rigorously-proven quantum slow-down? Pessimists might point to certain variational quantum algorithms, or to approaches that map efficiently solvable problems onto harder ones (such as finding ground states of quantum spin networks), as potential candidates.

Pessimal quantum algorithms could be one of the first killer applications of quantum computers, satisfying clients who want to find spooky quantum solutions to their problems (efficiency be damned!) and hardware developers seeking to maximise the utilisation of their shiny new quantum processors.


Monday, November 15, 2021

Orthogonality catastrophe on quantum devices

I recently read Walter Kohn's Nobel Lecture on density functional theory. It's an accessible introduction to a method that is hugely important for materials science. Sec. IIC makes some remarks on the orthogonality catastrophe, arguing that many-body wave functions are not meaningful for more than ~1000 particles: the exponential growth of the Hilbert space means that any approximation to a desired state has an exponentially vanishing overlap with that state.
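To put numbers on the argument: if a trial wavefunction captures the true state with overlap p per particle, the total overlap decays as

$$|\langle \tilde{\Psi}_N | \Psi_N \rangle| \sim p^N,$$

so even an excellent per-particle overlap of p = 0.99 gives a total overlap of only about 4 x 10^-5 for N = 1000 particles.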

Quantum computational chemistry is promoted as one of the most promising potential applications of quantum computers, offering an exponential speed-up compared to classical algorithms. But there is a big caveat hidden behind these claims: algorithms exhibiting rigorous speed-ups, such as quantum phase estimation, require as an input an approximation to the ground state wavefunction, and the overlap with the true ground state determines the success probability of the algorithm. Thus, computing the ground state energy of a large many-body system will require an exponentially good approximation to the corresponding wavefunction, potentially killing the quantum speed-up.
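In the simplest setting, each run of phase estimation projects onto the ground state with probability

$$p_{\mathrm{success}} = |\langle \Psi_{\mathrm{trial}} | \Psi_0 \rangle|^2,$$

so the expected number of repetitions grows as $1/p_{\mathrm{success}}$ - exponentially in the particle number if the overlap decays like $p^N$.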

Some experts in quantum chemistry considered this issue in an arXiv preprint a few years ago: Postponing the orthogonality catastrophe: efficient state preparation for electronic structure simulations on quantum devices. There the authors estimated that efficiently-computable approximations such as Hartree-Fock wavefunctions may provide a sufficiently good overlap for moderate system sizes of up to 40 electrons, which are already beyond the reach of classical computations. But it seems quantum computers won't be a magic bullet for larger systems involving hundreds of electrons; it will still be necessary to use physical insight to decompose large systems into smaller tractable subsystems.
 
It is curious that this study has only received 18 citations to date and was not published in a journal - there must be a story behind that...

Wednesday, November 3, 2021

Squeezed Spin States

Today I read the classic paper Squeezed Spin States, which was highlighted as a Milestone paper by Physical Review A last year.

In this paper the authors generalize squeezing transformations (usually considered in the context of bosonic modes) to spin-S systems. Squeezing reduces the fluctuations along one spin axis at the expense of increased fluctuations along another, enabling precision measurements with noise below the standard quantum limit.
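Quantitatively, the transverse spin components obey the uncertainty relation

$$\Delta S_x \, \Delta S_y \geq \tfrac{1}{2} |\langle S_z \rangle|,$$

and (in one common convention) a state is spin squeezed when the minimal variance transverse to the mean spin direction drops below the coherent-state value S/2:

$$\xi^2 = \frac{(\Delta S_\perp)^2_{\min}}{S/2} < 1.$$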

The ability to use squeezing to reduce quantum noise is particularly important for near-term quantum processors, potentially enabling calculations to be carried out using fewer measurements. Writing in PRX Quantum last month, Pezze and Smerzi proposed an improved quantum phase estimation algorithm employing spin squeezed states. The key advantage of spin squeezed states over an earlier algorithm employing GHZ states is their enhanced robustness to noise and losses.

It is interesting to consider how spin squeezing might be useful for improving the robustness to noise and performance of noisy intermediate-scale quantum algorithms...