Tuesday, September 17, 2013

Quantum theory reloaded

I have finally published a long-gestated piece in Nature (501, p154; 12 September) on quantum reconstructions. It has been one of the most interesting features I can remember working on, but was necessarily reduced drastically from the unwieldy first draft. Here (long post alert) is an intermediate version that contains a fair bit more than the final article could accommodate.


Quantum theory works. It allows us to calculate the shapes of molecules, the behaviour of semiconductor devices, the trajectories of light, with stunning accuracy. But nagging inconsistencies, paradoxes and counter-intuitive effects play around the margins: entanglement, collapse of the wave function, the effect of the observer. Can Schrödinger’s cat really be alive and dead at once? Does reality correspond to a superposition of all possible quantum states, as the “many worlds” interpretation insists?

Most users don’t worry too much about these nagging puzzles. In the words of the physicist David Mermin of Cornell University, they “shut up and calculate”. That is, after all, one way of interpreting the famous Copenhagen interpretation of quantum theory developed in the 1920s by Niels Bohr, Werner Heisenberg and their collaborators, which states that the theory tells us all we can meaningfully know about the world and that the apparent weirdness, such as wave-particle duality, is just how things are.

But there have always been some researchers who aren’t content with this. They want to know what quantum theory means – what it really tells us about the world it describes with such precision. Ever since Bohr argued with Einstein, who could not accept his “get over it” attitude to quantum theory’s seeming refusal to assign objective properties, there has been continual and sometimes furious debate over the interpretations or “foundations” of quantum theory. The basic question, says physicist Maximilian Schlosshauer of the University of Portland in Oregon, is this: “What is it about this world that forces us to navigate it with the help of such an abstract entity as quantum theory?”

A small community of physicists and philosophers has now come to suspect that these arguments are doomed to remain unresolved so long as we cling to quantum theory as it currently stands, with its exotic paraphernalia of wavefunctions, superpositions, entangled states and the uncertainty principle. They suspect that we’re stuck with seemingly irreconcilable disputes about interpretation because we don’t really have the right form of the theory in the first place. We’re looking at it from the wrong angle, making its shadow odd, spiky, hard to decode. If we could only find the right perspective, all would be clear.

But to find it, they say, we will have to rebuild quantum theory from scratch: to tear up the work of Bohr, Heisenberg and Schrödinger and start again. This is the project known as quantum reconstruction. “The program of reconstructions starts with some fundamental physical principles – hopefully only a small number of them, and with principles that are physically meaningful and reasonable and that we all can agree on – and then shows the structure of quantum theory emerges as a consequence of these principles”, says Schlosshauer. He adds that this approach, which began in earnest over a decade ago, “has gained a lot of momentum in the past years and has already helped us understand why we have a theory as strange as quantum theory to begin with.”

One hundred years ago the Bohr atom placed the quantum hypothesis advanced by Max Planck and Einstein at the heart of the structure of the physical universe. Attempts to derive the structure of the quantum atom from first principles produced Erwin Schrödinger’s quantum mechanics and the Copenhagen interpretation. Now the time seems ripe for asking if all this was just an ad hoc heuristic tool that is due for replacement with something better. Quantum reconstructionists are a diverse bunch, each with a different view of what the project should entail. But one thing they share is that, in seeking to resolve the outstanding foundational ‘problems’ of quantum theory, they respond much as the proverbial Irishman when asked for directions to Dublin: “I wouldn’t start from here.”

That’s at the core of the discontent evinced by one of the key reconstructionists, Christopher Fuchs of the Perimeter Institute for Theoretical Physics in Waterloo, Canada [now moved to Raytheon], at most physicists’ efforts to grapple with quantum foundations. He points out that the fundamental axioms of special relativity can be expressed in a form anyone can understand: in any moving frame, the speed of light stays constant and the laws of physics stay the same. In contrast, efforts to write down the axioms of quantum theory rapidly degenerate into a welter of arcane symbols. Fuchs suspects that, if we find the right axioms, they will be as transparent as those of relativity [1].

“The very best quantum-foundational effort”, he says, “will be the one that can write a story – literally a story, all in plain words – so compelling and so masterful in its imagery that the mathematics of quantum mechanics in all its exact technical detail will fall out as a matter of course.” Fuchs takes inspiration from quantum pioneer John Wheeler, who once claimed that if we really understood the central point of quantum theory, we ought to be able to state it in one simple sentence.

“Despite all the posturing and grimacing over the paradoxes and mysteries, none of them ask in any serious way, ‘Why do we have this theory in the first place?’” says Fuchs. “They see the task as one of patching a leaking boat, not one of seeking the principle that has kept the boat floating this long. My guess is that if we can understand what has kept the theory afloat, we’ll understand that it was never leaky to begin with.”

We can rebuild it

One of the earliest attempts at reconstruction came in 2001, when Lucien Hardy, then at Oxford University, proposed that quantum theory might be derived from a small set of “very reasonable” axioms [2]. These axioms specify how states are characterized by probabilities, and how states may be combined and interconverted. Hardy assumes that any state may be specified by the number K of probabilities needed to describe it uniquely, and that there are N ‘pure’ states that can be reliably distinguished in a single measurement. For example, for either a coin toss or a quantum bit (qubit), N = 2.

A key (if seemingly innocuous) axiom is that for a composite system we get K and N by multiplying those parameters for each of the components: K_ab = K_a × K_b, say. It follows from this that K and N must be related according to K = N^r, where r = 1, 2, 3… For a classical system each state has a single probability (50 percent for heads, say), so that K = N. But that possibility is ruled out by a so-called ‘continuity axiom’, which describes how states are transformed one to another. For a classical system this happens discontinuously – a head is flipped to a tail – whereas for quantum systems the transformation can be continuous: the two pure states of a qubit can be mixed together in any degree. (That is not, Hardy stresses, the same as assuming a quantum superposition – so ‘quantumness’ isn’t being inserted by fiat.) The simplest relationship consistent with the continuity axiom is therefore K = N^2, which corresponds to the quantum picture.
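The counting behind Hardy's composite-system axiom can be sketched in a few lines of code. This is an illustration of the article's description, not Hardy's own formalism; the function names are my own, and the classical and quantum cases are just the r = 1 and r = 2 instances of K = N^r.

```python
# Illustrative sketch of Hardy's state-counting (names are mine, not Hardy's).
# K(N) = number of probabilities needed to specify a state that has N
# reliably distinguishable pure states.

def K_classical(N):
    return N          # r = 1: one probability per distinguishable outcome

def K_quantum(N):
    return N ** 2     # r = 2: a qubit (N = 2) needs 4 fiducial probabilities

# The composite-system axiom: N_ab = N_a * N_b and K_ab = K_a * K_b.
# Any K = N**r satisfies it, since (Na*Nb)**r = Na**r * Nb**r:
for K in (K_classical, K_quantum):
    Na, Nb = 2, 2                          # two bits, or two qubits
    assert K(Na * Nb) == K(Na) * K(Nb)     # multiplicativity holds
```

The continuity axiom is what then rules out r = 1 and selects the quantum value r = 2.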

But as physicist Rafael Sorkin of Syracuse University in New York had previously pointed out [3], there seems to be no fundamental reason why higher-order theories (requiring N^3, N^4 measurements and so forth) should not also exist and have real effects. For example, Hardy says, the famous double-slit experiment for quantum particles adds a new behaviour (interference) where classical theory would just predict the outcome to be the sum of two single-slit experiments. But whereas quantum theory predicts nothing new on adding a third slit, a higher-order theory would introduce a new effect in that case – a testable prediction, albeit one that might be very hard to detect.
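Sorkin's observation can be verified directly from the Born rule. In the sketch below (the amplitude values are arbitrary illustrations), the two-slit interference term is nonzero, but the genuinely three-slit term vanishes identically for quantum probabilities; a higher-order theory is one in which it would not.

```python
# Sorkin's interference hierarchy: probabilities come from squared
# amplitude magnitudes (Born rule), so two-slit interference exists
# but no genuinely three-slit interference does.

def P(*amps):
    return abs(sum(amps)) ** 2   # probability = |total amplitude|^2

# Arbitrary complex amplitudes for slits a, b, c (illustration only):
a, b, c = 0.5 + 0.1j, 0.3 - 0.4j, -0.2 + 0.6j

I2 = P(a, b) - P(a) - P(b)                       # two-slit interference term
I3 = (P(a, b, c) - P(a, b) - P(a, c) - P(b, c)
      + P(a) + P(b) + P(c))                      # Sorkin's three-slit term

assert abs(I2) > 1e-12    # quantum theory: pairwise interference is real...
assert abs(I3) < 1e-12    # ...but the three-slit term cancels exactly
```

The cancellation of I3 follows algebraically from expanding |a + b + c|^2: every cross term already appears in one of the pairwise experiments.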

In this way Hardy claims to have begun to set up quantum theory as a general theory of probability, which he thinks could have been derived in principle by nineteenth-century mathematicians without any knowledge of the empirical motivations that led Planck and Einstein to initiate quantum mechanics at the start of the twentieth century.

Indeed, perhaps the most startling aspect of quantum reconstruction is that what seemed to the pioneers of quantum theory such as Planck, Einstein and Bohr to be revolutionary about it – the quantization rather than continuum of energy – may in fact be something of a sideshow. Quantization is not an axiomatic concept in quantum reconstructions, but emerges from them. “The historical development of quantum mechanics may have led us a little astray in our view of what it is all about”, says Schlosshauer. “The whole talk of waves versus particles, quantization and so on has made many people gravitate toward interpretations where wavefunctions represent some kind of actual physical wave property, creating a lot of confusion. Quantum mechanics is not a descriptive theory of nature, and to read it as such is to misunderstand its role.”

The new QBism

Fuchs says that Hardy’s paper “convinced me to pursue the idea that a quantum state is not just like a set of probability distributions, but very literally is a probability distribution itself – a quantification of partial belief, and nothing more.” He says “it hit me over the head like a hammer and has shaped my thinking ever since” – although he admits that Hardy does not draw the same lesson from the work himself.

Fuchs was particularly troubled by the concept of entanglement. According to Schrödinger, who coined the term in the first place, this “is the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought” [4]. In most common expositions of the theory, entanglement is depicted as seeming to permit the kind of instantaneous ‘action at a distance’ Einstein’s theory of relativity forbade. Entangled particles have interdependent states, such that a measurement on one of them is instantaneously ‘felt’ by the other. For example, two photons can be entangled such that they have opposed orientations of polarization (vertical or horizontal). Before a measurement is made on the photons, their polarization is indeterminate: all we know is that these are correlated. But if we measure one photon, collapsing the probabilities into a well-defined outcome, then we automatically and instantaneously determine the other’s polarization too, no matter how far apart the two photons are. In 1935 Einstein and coworkers presented this as a paradox intended to undermine the probabilistic Copenhagen interpretation; but experiments on photons in the 1980s showed that it really happens [5]. Entanglement, far from being a contrived quirk, is the key to quantum information theory and its associated technologies, such as quantum computers and cryptography.

But although quantum theory can predict the outcomes of entanglement experiments perfectly adequately, it still seems an odd way for the world to behave. We can write down the equations, but we can’t feel the physics behind them. That’s what prompted Fuchs to call for a fresh approach to quantum foundations [1]. His approach [6, 7] argues that quantum states themselves – the entangled state of two photons, say, or even just the spin state of a single photon – don’t exist as objective realities. Rather, “quantum states represent observers’ personal information, expectations and degrees of belief”, he says.

Fuchs calls this approach quantum Bayesianism or QBism (pronounced “cubism”), because he believes that, as standard Bayesian probability theory assumes, probabilities – including quantum probabilities – “are not real things out in the world; their only existence is in quantifying personal degrees of belief of what might happen.” This view, he says, “allows one to see all quantum measurement events as little ‘moments of creation’, rather than as revealing anything pre-existent.”

This idea that quantum theory is really about what we can and do know has always been somewhat in the picture. Schrödinger’s wavefunctions encode a probability distribution of measurement outcomes: what measurements on a quantum system might reveal. In the Copenhagen view, it is meaningless to talk about what we actually will measure until we do it. Likewise, Heisenberg’s uncertainty principle insists that we can’t know every observable property with arbitrary accuracy. In other words, quantum theory seemed to impose limits on our precise knowledge of the state of the world – or, better put, to expose a fundamental indeterminacy in our expectations of what measurement will show us. But Fuchs wants us to accept that this isn’t a question of generalized imprecision of knowledge, but a statement about what a specific individual can see and measure. We’re not just part of the painting: in a sense we are partially responsible for painting it.

Information is the key

The rise of quantum information theory over the past few decades has put a new spin on this consideration. One might say that it has replaced an impression of analog fuzziness (“I can’t see this clearly”) with digital error (“the answer might be this or that, but there’s such-and-such a chance of your prediction being wrong”). It is this focus on information – or rather, knowledge – that characterizes several of the current attempts to rebuild quantum theory from scratch. As physicists Caslav Brukner and Anton Zeilinger of the University of Vienna put it, “quantum physics is an elementary theory of information” [8].

Jeffrey Bub of the University of Maryland agrees: quantum mechanics, he says, is “fundamentally a theory about the representation and manipulation of information, not a theory about the mechanics of nonclassical waves or particles” – as clear a statement as you could wish for of why early quantum theory got distracted by the wrong things. His approach to reconstruction builds on the formal properties of how different sorts of information can be ordered and permuted, which lie at the heart of the uncertainty principle [9].

In the quantum picture, certain pairs of quantities do not commute, which means that it matters in which order they are considered: momentum times position is not the same as position times momentum, rather as kneading and baking dough do not commute when making bread. Bub believes that noncommutativity is what distinguishes quantum from classical mechanics, and that entanglement is one of the consequences. This property, he says, is a feature of the way information is fundamentally structured, and it might emerge from a principle called ‘information causality’ [10], introduced by Marcin Pawlowski of the University of Gdansk and colleagues. This postulate describes how much information one observer (call him Bob) can gain about a data set held by another (Alice). Classically the amount is limited by what Alice communicates to Bob. Quantum correlations such as entanglement can increase this limit, but only within bounds set by the information causality postulate. Pawlowski and colleagues suspect that this postulate might single out precisely what quantum correlations permit about information transfer. If so, they argue, “information causality might be one of the foundational properties of nature” – in other words, an axiom of quantum theory.
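The order-dependence Bub points to is easy to demonstrate with concrete matrices. The sketch below uses the Pauli X and Z matrices as a stand-in pair of noncommuting qubit observables (the article's position/momentum example involves infinite-dimensional operators, so this is an illustrative substitute, not Bub's own example).

```python
# Noncommutativity sketch: applying two qubit observables in different
# orders gives different results. Pauli X (bit flip) and Z (phase flip)
# stand in for the noncommuting position/momentum pair.

def matmul(A, B):
    """2x2 matrix product, computed entry by entry."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

X = [[0, 1], [1, 0]]     # Pauli X
Z = [[1, 0], [0, -1]]    # Pauli Z

XZ = matmul(X, Z)
ZX = matmul(Z, X)

assert XZ != ZX          # in fact XZ = -ZX: the order matters
```

For these two observables the products differ only by a sign, but that sign is exactly the kind of order-dependence that classical quantities, which always commute, cannot show.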

Ontic or epistemic?

At the root of the matter is the issue of whether quantum theory pronounces on the nature of reality (a so-called ontic theory) or merely on our allowed knowledge of it (an epistemic theory). Ontic theories, such as the Many Worlds interpretation, take the view that wavefunctions are real entities. The Copenhagen interpretation, on the other hand, is epistemic, insisting that it’s not physically meaningful to look for any layer of reality beneath what we can measure. In this view, says Fuchs, God plays dice and so “the future is not completely determined by the past.” QBism takes this even further: what we see depends on what we look for. “In both Copenhagen and QBism, the wave function is not something ‘out there’”, says Fuchs. “QBism should be seen as a modern variant and refinement of Copenhagen.”

His faith in epistemic approaches to reconstruction is boosted by the work of Robert Spekkens, his colleague at the Perimeter Institute. Spekkens has devised a ‘toy theory’ that restricts the amount of information an observer can have about discrete ontic states of the system: specifically, one’s knowledge about these states can never exceed the amount of knowledge one lacks about them. Spekkens calls this the ‘knowledge balance principle’. It might seem an arbitrary imposition, but he finds that it alone is sufficient to reproduce many (but not all) of the characteristics of quantum theory, such as superposition, entanglement and teleportation [11]. Related ideas involving other kinds of restriction on what can be known about a suite of states also find quantum-like behaviours emerging [12,13].
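The simplest instance of Spekkens' toy theory can be enumerated directly. In the sketch below (an illustration of the published model, with my own variable names), a single 'toy bit' has four ontic states, and the knowledge-balance principle permits an observer to know only which pair of states the system is in, never which member of the pair.

```python
from itertools import combinations

# Spekkens' toy bit: four ontic states; the knowledge-balance principle
# says a maximally informative epistemic state pins the system down to a
# pair of ontic states -- knowledge of one binary question, ignorance of
# the other, so knowledge never exceeds ignorance.

ontic = [1, 2, 3, 4]
epistemic = [set(pair) for pair in combinations(ontic, 2)]

# Six maximal epistemic states result -- the same count as the six
# cardinal states (+/-x, +/-y, +/-z) of a qubit on the Bloch sphere.
assert len(epistemic) == 6
assert all(len(state) == 2 for state in epistemic)   # balance holds
```

From this restricted-knowledge starting point, analogues of superposition, entanglement (for pairs of toy bits) and teleportation can all be built up, which is the force of Spekkens' result [11].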

Fuchs sees these insights as a necessary corrective to the way quantum information theory has tended to propagate the notion that information is something objective and real – which is to say, ontic. “It is amazing how many people talk about information as if it is simply some new kind of objective quantity in physics, like energy, but measured in bits instead of ergs”, he says. “You’ll often hear information spoken of as if it’s a new fluid that physics has only recently taken note of.” In contrast, he argues, what else can information possibly be except an expression of what we think we know?

“What quantum information gave us was a vast range of phenomena that nominally looked quite novel when they were first found”, Fuchs explains. For example, it seemed that quantum states, unlike classical states, can’t be ‘cloned’ to make identical copies. “But what Rob’s toy model showed was that so much of this vast range wasn’t really novel at all, so long as one understood these to be phenomena of epistemic states, not ontic ones.” Classical epistemic states can’t be cloned any more than quantum states can be, for much the same reason as you can’t be me.

What’s the use?

What’s striking about several of these attempts at quantum reconstruction is that they suggest that our universe is just one of many mathematical possibilities. “It turns out that many principles lead to a whole class of probabilistic theories, and not specifically quantum theory”, says Schlosshauer. “The problem has been to find principles that actually single out quantum theory”. But this is in itself a valuable insight: “a lot of the features we think of as uniquely quantum, like superpositions, interference and entanglement, are actually generic to many probabilistic theories. This allows us to focus on the question of what makes quantum theory unique.”

Hardy says that, after a hiatus following Fuchs’ call to arms and his own five-axiom proposal in the early 2000s, progress in reconstructions really began in 2009. “We’re now poised for some really significant breakthroughs, in a way that we weren’t ten years ago”, he says. While there’s still no consensus on what the basic axioms should look like, he is confident that “we’ll know them when we see them.” He suspects that ultimately the right description will prove to be ontic rather than epistemic: it will remove the human observer from the scene once more and return us to an objective view of reality. But he acknowledges that some, like Fuchs, disagree profoundly.

For Fuchs, the aim of reconstruction is not to rebuild the existing formalism of quantum theory from scratch, but to rewrite it totally. He says that approaches such as QBism are already motivating new experimental proposals, which might for example reveal a new, deep symmetry within quantum mechanics [14]. The existence of this symmetry, Fuchs says, would allow the quantum probability law to be re-expressed as a minor variation of the standard ‘law of total probability’ in probability theory, which relates the probability of an event to the conditional probabilities of all the ways it might come about. “That new view, if it proves valid, could change our understanding of how to build quantum computers and other quantum information kits,” he says.

Quantum reconstruction is gaining support. A recent poll of attitudes among quantum theorists showed that 60% think reconstructions give useful insights, and more than a quarter think they will lead to a new theory deeper than quantum mechanics [15]. That is a rare degree of consensus for matters connected to quantum foundations.

But how can we judge the success of these efforts? “Since the object is simply to reconstruct quantum theory as it stands, we could not prove that a particular reconstruction was correct since the experimental results are the same regardless”, Hardy admits. “However, we could attempt to do experiments that test that the given axioms are true.” For example, one might seek the ‘higher-order’ interference that his approach predicts.

“However, I would say that the real criteria for success are more theoretical”, he adds. “Do we have a better understanding of quantum theory, and do the axioms give us new ideas as to how to go beyond current-day physics?” He is hopeful that some of these principles might assist the development of a theory of quantum gravity – but says that in this regard it’s too early to say whether the approach has been successful.

Fuchs agrees that “the question is not one of testing the reconstructions in any kind of experimental way, but rather through any insight the different variations might give for furthering physical theory along. A good reconstruction is one that has some ‘leading power’ for the way a theorist might think.”

Some remain skeptical. “Reconstructing quantum theory from a set of basic principles seems like an idea with the odds greatly against it”, admits Daniel Greenberger of the City College of New York. “But it’s a worthy enterprise” [16]. Yet Schlosshauer argues that “even if no single reconstruction program can actually find a universally accepted set of principles that works, it’s not a wasted effort, because we will have learned so much along the way.”

He is cautiously optimistic. “I believe that once we have a set of simple and physically intuitive principles, and a convincing story to go with them, quantum mechanics will look a lot less mysterious”, he says. “And I think a lot of the outstanding questions will then go away. I’m probably not the only one who would love to be around to witness the discovery of these principles.” Fuchs feels that could be revolutionary. “My guess is, when the answer is in hand, physics will be ready to explore worlds the faulty preconception of quantum states couldn’t dream of.”

1. Fuchs, C. http://arxiv.org/abs/quant-ph/0106166 (2001).
2. Hardy, L. http://arxiv.org/abs/quant-ph/0101012 (2003).
3. Sorkin, R. http://arxiv.org/abs/gr-qc/9401003 (1994).
4. Schrödinger, E. Proc. Cambridge Phil. Soc. 31, 555–563 (1935).
5. Aspect, A. et al. Phys. Rev. Lett. 49, 91 (1982).
6. Fuchs, C. http://arxiv.org/abs/1003.5209 (2010).
7. Fuchs, C. http://arxiv.org/abs/1207.2141 (2012).
8. Brukner, C. & Zeilinger, A. http://arxiv.org/abs/quant-ph/0212084 (2008).
9. Bub, J. http://arxiv.org/abs/quant-ph/0408020 (2008).
10. Pawlowski, M. et al. Nature 461, 1101–1104 (2009).
11. Spekkens, R. W. http://arxiv.org/abs/quant-ph/0401052 (2004).
12. Kirkpatrick, K. A. Found. Phys. Lett. 16, 199 (2003).
13. Smolin, J. A. Quantum Inf. Comput. 5, 161 (2005).
14. Renes, J. M., Blume-Kohout, R., Scott, A. J. & Caves, C. M. J. Math. Phys. 45, 2171 (2004).
15. Schlosshauer, M., Kofler, J. & Zeilinger, A. Stud. Hist. Phil. Mod. Phys. 44, 222–230 (2013).
16. Greenberger, D., in Schlosshauer, M. (ed.) Elegance and Enigma: The Quantum Interviews (Springer, 2011).
