Thursday, April 28, 2022

Pessimal quantum algorithms

I was reading about sorting algorithms the other day and stumbled upon the amusing topic of pessimal algorithms, which are horribly slow algorithms that "[do] indeed progress steadily towards [their] stated goal even though [they] may have very little enthusiasm for (or even a manifest aversion to) actually getting there."

Two examples of pessimal sorting algorithms are slowsort and bogosort, which have non-polynomial scaling (factorial scaling in the case of bogosort). Slowsort uses a multiply-and-surrender approach, while bogosort chooses random permutations until it obtains the correctly-sorted list.

A simple quantum version of bogosort might be a circuit composed of a single Hadamard gate applied to each qubit, followed by measurement; this similarly samples uniformly over the solution space, repeating until the correct answer is measured.
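This circuit is easy to simulate classically, since Hadamards on every qubit followed by measurement just sample bitstrings uniformly at random. A minimal sketch (the register size and target bitstring are arbitrary choices for illustration):

```python
import random

def quantum_bogosearch(n, target, rng=random.Random(0)):
    """Classically simulate the 'quantum bogosort' circuit: Hadamards on
    all n qubits give a uniform distribution over the 2**n bitstrings,
    and we repeat the prepare-and-measure cycle until the target shows up."""
    shots = 0
    while True:
        shots += 1
        outcome = rng.getrandbits(n)  # uniform sample over 2**n outcomes
        if outcome == target:
            return shots

# e.g. searching a 4-qubit register for the state |0110> = 6
print(quantum_bogosearch(4, 6))
```

The expected number of repetitions is 2^n, so this "algorithm" inherits bogosort's charmingly hopeless scaling.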

More generally, pessimal quantum algorithms might exploit maliciously-tuned quantum interference to obtain scaling worse than the most pessimal classical algorithm, for example by engineering destructive interference to suppress the probability of measuring the correct solution.
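The interference mechanism can be illustrated on a single qubit: a Hadamard splits the state into two paths, a phase gate sets their relative phase, and a second Hadamard recombines them. Tuning the phase to pi makes the two paths to the "correct" outcome cancel exactly (a toy sketch of the knob being turned, not a full pessimal algorithm):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def measure_probs(phi):
    """Apply H, a phase gate S(phi), then H to |0>. The two paths back
    to |0> pick up relative phase phi, giving amplitude (1 + e^{i phi})/2."""
    S = np.diag([1, np.exp(1j * phi)])
    psi = H @ S @ H @ np.array([1, 0], dtype=complex)
    return np.abs(psi) ** 2

print(measure_probs(0))      # constructive interference: P(|0>) = 1
print(measure_probs(np.pi))  # destructive interference: P(|0>) = 0
```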

Are there any known examples of pessimal quantum algorithms exhibiting a rigorously-proven quantum slow-down? Pessimists might point to certain variational quantum algorithms, or to approaches that map efficiently-solvable problems onto harder ones (such as finding ground states of quantum spin networks), as potential candidates.

Pessimal quantum algorithms could be one of the first killer applications of quantum computers, satisfying clients who want to find spooky quantum solutions to their problems (efficiency be damned!) and hardware developers seeking to maximise the utilisation of their shiny new quantum processors.


Tuesday, April 26, 2022

The changing face of science philanthropy

Science published an editorial earlier this month on the changing priorities and methods of science philanthropy: New goals for science philanthropy.

The article discusses two growing priorities among donors: working with public institutes and funding agencies to identify research areas where philanthropy can make the most impact, and supporting researchers underrepresented in funding allocations from traditional sources. It is estimated that a staggering 42% of funding for basic science at US institutes comes from philanthropy.

Each month NUS circulates a list of grant opportunities. Most private sources of funding are focused on the life sciences, particularly research into various diseases. 

How can the physical sciences attract more philanthropic funding? Two observations:

First, donors want to see meaningful impact in their lifetime. This is particularly challenging for physics, where most basic research does not lead to real-world applications, and when it does it may take decades. For example, fusion power has been only 10-20 years away since the 1950s. Unfortunately, it is impossible to predict the impact of a specific research project.

Second, donors prefer to fund areas to which they have a personal connection. The biggest donors to the physical sciences tend to have a background in the field. For example, after obtaining a PhD in mathematics Jim Simons spent a decade in academia before turning to finance and making his fortune. Similarly, Fred Kavli had a background in engineering, and Yuri Milner (founder of the Breakthrough Prize) originally trained as a physicist.

One way forward discussed in the article is to provide opportunities for research beyond traditional venues, for example by setting up independent research centres such as the Flatiron Institute. This not only avoids the entrenched bureaucracy and overheads of high-profile universities, but also gives the donor more control over the culture and expenditure of the centre. This seems a better alternative to funding lavish prizes, which are more about recognising existing research impact than bringing about tomorrow's research breakthroughs.


Friday, April 22, 2022

Australia's scientific brain drain

This opinion piece published in the Sydney Morning Herald the other day outlines a common experience for many researchers in Australia - the lack of a coherent government research and innovation strategy forces PhD graduates overseas for postdoctoral research. Many never return.

I don't want to repeat what is already said in this article, so I will instead add a few more examples of mismanagement I've seen in recent years:

  • Multiple recipients of highly competitive Australian Research Council grants have had to wait for more than a year for their Australian work visas to be approved. Your project was ranked in the top 10% of submissions and "enhances the scale and focus of research in Australian Government priority areas", but you'll have to wait at the back of the visa queue, thanks!
  • During the height of the covid pandemic, foreigners were effectively barred from entering Australia (requiring an approved visa, a travel ban exemption, and one of the few seats on incoming flights). At the same time, many Australians working in postdoctoral positions overseas decided to return (at great cost) and were desperate to find a local research job. Yet inflexible funding rules meant that certain research positions were restricted to candidates based outside Australia, who could not actually travel there.
  • The current government is prioritising funding for projects that can establish links between research and local industry. In many promising research areas there is no local industry to speak of, leading to world-class scientific proposals not being funded. And I'm not talking about the ARC Linkage grants (which are aimed at funding these kinds of partnerships); this is in relation to the Discovery Program, which is supposed to fund pure science.

To sum up, I can respect that different governments may have different priorities for funding (or not funding) certain research. But it's appalling to see so many examples of policy being implemented so poorly.

Wednesday, April 20, 2022

PsiQuantum's take on quantum chemistry using quantum computers

PsiQuantum takes a contrarian view on quantum computing: forget NISQ, focus on fault-tolerant systems with millions of qubits and error correction.

A few weeks ago PsiQuantum published a paper analyzing the resources required to carry out commercially-relevant simulations of battery chemistry: Fault-tolerant resource estimate for quantum chemical simulations: Case study on Li-ion battery electrolyte molecules. They also posted a summary on their website.

So far I have only had a chance to skim the paper, but as someone focusing on NISQ algorithms I found the different approaches required for fault-tolerant algorithms quite interesting. For example, I had thought that state preparation might be a bottleneck in applying quantum phase estimation to classically-intractable quantum systems, but this doesn't appear to be a problem for high precision quantum chemistry applications:

Concerns over the viability of preparing such an ansatz efficiently are encapsulated by the “orthogonality catastrophe,” the observation that the overlap between the true ground state and some ansatz wave function decreases exponentially as the system size increases. However, it has been shown that there are sophisticated classical methods for preparing trial wavefunctions with sufficient overlap, for states of up to O(100) orbitals using simple-to-prepare states such as the single Slater determinant obtained from the Hartree-Fock method (alternatively, methods for multideterminant state preparation can be used) [arXiv:1809.05523]
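To put a rough number on the catastrophe: if the trial wavefunction matches the true ground state with some fidelity per orbital, the total overlap decays exponentially in the number of orbitals, and the cost of projective phase estimation grows as the inverse square of that overlap. A toy model (the 0.99 per-orbital fidelity is an arbitrary illustrative value, and the product form assumes uncorrelated per-orbital errors):

```python
# Toy illustration of the orthogonality catastrophe: a trial state with
# fidelity f per orbital has total overlap f**n_orbitals with the true
# ground state, and the expected number of quantum phase estimation
# repetitions grows as 1 / overlap**2.
per_orbital_fidelity = 0.99  # hypothetical value for illustration
for n_orbitals in (10, 100, 1000):
    overlap = per_orbital_fidelity ** n_orbitals
    repetitions = 1 / overlap ** 2
    print(f"{n_orbitals:5d} orbitals: overlap {overlap:.3g}, "
          f"~{repetitions:.3g} QPE repetitions")
```

At O(100) orbitals the overlap is still a respectable ~0.37 in this toy model, consistent with the quoted claim that Hartree-Fock-like states can suffice at that scale.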

Another part of the paper I found interesting was Appendix C, which presents a logarithmic-depth circuit for state preparation using Givens rotations. From what I understand, this construction is limited to the fault-tolerant regime, since it requires multi-qubit Pauli operations.
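For reference, the elementary building block is simple even if the paper's logarithmic-depth arrangement of them is not: a Givens rotation mixes the |01> and |10> states of a qubit pair while leaving |00> and |11> alone, so it conserves particle number. A minimal numpy sketch (not the paper's construction, just the single rotation):

```python
import numpy as np

def givens(theta):
    """Two-qubit Givens rotation in the basis (|00>, |01>, |10>, |11>):
    rotates within span{|01>, |10>}, conserving Hamming weight."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([
        [1, 0,  0, 0],
        [0, c, -s, 0],
        [0, s,  c, 0],
        [0, 0,  0, 1],
    ])

G = givens(0.3)
assert np.allclose(G @ G.T, np.eye(4))  # unitary
state = G @ np.array([0, 1, 0, 0])      # rotate |01>
print(state)  # amplitude cos(0.3) on |01>, sin(0.3) on |10>
```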

Thursday, April 14, 2022

PhD position openings in topological photonics and turbulence in fluids of light

Two ERC-funded PhD position openings in the group of Alberto Amo in Lille, France on the following subjects:

1. Nonlinear topological phases in lattices of coupled micropillars (experimental)

The goal of this thesis is to study experimentally novel topological phases in photonic lattices. We will employ coupled semiconductor micropillars in which photons can hop from site to site. Taking advantage of the extraordinary photon-photon interactions in this system we aim at observing nonlinear topological effects. This PhD thesis is part of the ERC Consolidator grant EmergenTopo, and will be realized in collaboration with French and international theoreticians.


2. Turbulence in polariton fluids (experimental)

The goal of this thesis is to experimentally investigate the turbulent properties of a polariton fluid passing an obstacle engineered in the microcavity. In a polariton superfluid, not only is friction strongly reduced, but vortices have very special properties: their circulation is quantised and can only take integer values. Up to now, the main difficulty in studying turbulent phenomena has been the need for ultrafast single-shot measurements with time resolution on the picosecond scale. Using cutting-edge time-resolved spectroscopic techniques developed in our group, we will study the emergence of vortices and other turbulent phenomena in a polariton fluid. We will investigate turbulence around obstacles of different shapes, including obstacles in the form of a plane wing, to answer the question of whether a plane can fly in a superfluid of light.

Application deadline: 10th May 2022
Starting date: 1st October 2022 (negotiable)

Monday, April 11, 2022

Generalised coherent states for variational quantum circuits

One challenge with designing variational quantum algorithms is choosing a good ansatz for the parameterised quantum circuit. One needs to balance multiple competing requirements: 

1. The circuit needs to be complex enough to be able to express a good solution to the problem at hand.

2. The parameterisation should be simple enough to admit analytical calculation of gradients, so that the solution can be found efficiently using gradient descent.

3. The circuit should generate non-trivial quantum entanglement, which is a necessary (but not sufficient) condition for a classical computer to be inefficient at obtaining the same solution.

The most common choices for parameterised circuits either do not meet all three requirements, or give rise to very deep circuits that cannot be run on current quantum processors.
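Requirement 2 usually means choosing gates of the form exp(-i*theta*P/2) with P a Pauli operator, for which the parameter-shift rule gives exact gradients from two extra energy evaluations. A single-qubit sketch:

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation R_y(theta) = exp(-i theta Y / 2)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def energy(theta):
    """<psi| Z |psi> for |psi> = R_y(theta)|0>; analytically cos(theta)."""
    psi = ry(theta) @ np.array([1.0, 0.0])
    Z = np.diag([1.0, -1.0])
    return psi @ Z @ psi

def parameter_shift_grad(theta):
    """Exact gradient dE/dtheta from two shifted energy evaluations."""
    return 0.5 * (energy(theta + np.pi / 2) - energy(theta - np.pi / 2))

theta = 0.7
print(parameter_shift_grad(theta), -np.sin(theta))  # both ≈ -0.6442
```

On hardware each `energy` call is itself estimated from repeated measurements, which is where the polynomial measurement scaling of requirement 2 matters.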

Last week I saw an interesting preprint posted to arXiv:

A Variational Ansatz for the Ground State of the Quantum Sherrington-Kirkpatrick Model

This work builds on an earlier paper by the same group: Generalization of group-theoretic coherent states for variational calculations.

The authors consider variational quantum states based on generalised coherent states, which are formed by applying single-qubit and two-qubit (entangling) rotations. Crucially, the form of the ansatz enables expectation values of Pauli operators (required to obtain energies and gradients) to be measured efficiently, with the required number of measurements scaling polynomially in the system size. At the same time, the non-trivial entanglement of the generalised coherent states means that evaluating the overlap between different parameterised states is a classically hard task.

These properties seem quite promising for quantum kernel methods: the analytical gradients would allow one to optimise key properties of the quantum feature map, while the model output (quantum state overlaps with the training data) remains hard to simulate classically.

It is interesting to speculate on whether these parameterised quantum states could be adapted to form the basis for a quantum perceptron; they are simple and efficiently trainable, but I am not sure whether they are also modular and can be stacked to form a deep quantum neural network.

Thursday, April 7, 2022

Quantum computing from a business perspective

A refreshingly measured and hype-avoiding review article on quantum computing targeted at non-specialists and without any equations was posted to arXiv last week: Quantum computing at the quantum advantage threshold: a down-to-business review

Some quotes and my additional comments:

"Quantum computers should be seen not as competitors to classical machines, but rather as a supplementary class of devices aimed to tackle a distinct class of problems."

Think of a future quantum computer as being like a GPU or ASIC rather than a faster version of a general purpose PC.

[on NISQ algorithms] "For example, the estimation of the energy of a relatively simple molecule Fe2S2 would require as many as 10^13 measurements. Assuming (optimistically) that each quantum circuit run takes 10 ns, the single iteration would require about 24 hours."

Quantum computers won't solve hard problems instantly; they will solve (slowly) problems intractable using conventional computers. They probably won't be useful for optimizing the routes a delivery driver takes each day.
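The back-of-envelope arithmetic behind the quoted runtime, using only the numbers from the quote:

```python
shots = 1e13        # measurements quoted for the Fe2S2 energy estimate
t_per_shot = 10e-9  # optimistic 10 ns per circuit repetition

total_seconds = shots * t_per_shot
print(total_seconds / 3600)  # ~28 hours: roughly the quoted day per iteration
```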

"In some cases, hopes to achieve quantum advantage by reducing an optimization problem to QUBO appear unviable altogether because of an exponential overhead associated with such reduction."

Another issue with some of the quadratic unconstrained binary optimisation (QUBO) mappings proposed in the scientific literature is that they end up converting problems that are efficiently solvable on a classical computer into NP-hard problems, just to map them onto a quantum circuit; in effect, taking one step forward and many steps backwards.
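To make the QUBO formalism concrete: a QUBO instance asks for the binary vector x minimising x^T Q x, and a generic solver faces a search over all 2^n assignments. A brute-force sketch on a toy max-cut instance (my own illustrative example, not one from the article):

```python
import itertools
import numpy as np

def solve_qubo(Q):
    """Brute-force minimiser of x^T Q x over binary vectors x -- the
    generic exponential-time baseline a QUBO mapping hands a solver."""
    n = Q.shape[0]
    best_x, best_val = None, np.inf
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        val = x @ Q @ x
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Max-cut on a 3-node path (edges 0-1 and 1-2), encoded as a QUBO:
# each edge (i, j) contributes 2*x_i*x_j - x_i - x_j, so minimising
# x^T Q x maximises the number of edges cut.
Q = np.array([[-1,  2,  0],
              [ 0, -2,  2],
              [ 0,  0, -1]])
print(solve_qubo(Q))  # → x = [0, 1, 0], value -2 (both edges cut)
```

Even this trivially-solvable problem becomes a generic 2^n search once encoded this way, which is exactly the step-backwards issue described above.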


Tuesday, April 5, 2022

The perfect postdoc

It's natural for graduating PhD students to agonise over their next position.

Are my career prospects doomed if I stay close to home? Do I need to do a postdoc at a top US university to guarantee a faculty position? Would I be better off taking a position in industry? What if I screw up and it doesn't work out?

No matter where you end up, you will gain valuable experience and create new opportunities for the future.

An infamous example of this is Nick Leeson, whose fraudulent derivatives trading brought down Barings Bank and led to his imprisonment in Singapore. He now makes money speaking and consulting on risk management and fraud detection!