Wednesday, October 05, 2011

Quantum renaissance

Here’s a piece I wrote for the latest issue of Prospect, where it is published with a few small changes. (At least, it started out along these lines - there were numerous iterations, and I somewhat lost track.)

Quantum mechanics is more than a hundred years old, but we still don’t understand it. Recent years have, however, seen a fresh enthusiasm for exploring the questions about what quantum theory means that were swept under the rug by its founders. Advances in experimental methods make it possible to test ideas about weird and counter-intuitive quantum effects and how they give rise to an apparently different set of physical laws at the everyday scale—that is, to examine in what sense things exist.

In 1900 the German physicist Max Planck suggested that light – a form of electromagnetic waves – consists of tiny, indivisible packets of energy. These particles, called photons, are the “quanta” of light. Five years later Albert Einstein showed how this quantum hypothesis explained the way light kicks electrons out of metals – the photoelectric effect (it was for this, not the theory of relativity, that he won his Nobel). The early pioneers of quantum theory quickly discovered that the seemingly innocuous idea that energy is grainy has bizarre implications. Objects can be in many places at once. Particles behave like waves and vice versa. The mere act of witnessing an event alters what is witnessed. Perhaps the quantum world is constantly branching into multiple universes.
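Planck’s relation fixes the energy of each packet at E = hf, and Einstein’s photoelectric equation says an ejected electron carries away at most E − φ, where φ is the metal’s work function. A minimal numerical sketch (the work function here is an illustrative figure, roughly that of sodium):

```python
# Planck's relation: a photon of frequency f carries energy E = h*f.
# Einstein's photoelectric equation: E_kin = h*f - phi (work function).
h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
eV = 1.602e-19       # joules per electronvolt

wavelength = 400e-9              # violet light, 400 nm
f = c / wavelength               # frequency, Hz
E_photon = h * f / eV            # photon energy in eV

phi = 2.3                        # illustrative work function (~sodium), eV
E_kin = max(E_photon - phi, 0.0) # max kinetic energy of the ejected electron

print(f"photon energy: {E_photon:.2f} eV")       # → 3.10 eV
print(f"max electron energy: {E_kin:.2f} eV")    # → 0.80 eV
```

Note the quantum point: dimmer light of the same colour ejects fewer electrons, but never more energetic ones, because each electron absorbs one whole photon or none.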

As long as you just accept these paradoxes, quantum theory works fine, and so scientists routinely adopt the approach memorably described by Cornell physicist David Mermin as “shut up and calculate.” They use quantum mechanics to calculate everything from the strength of metal alloys to the shapes of molecules. Routine application of the theory underpins the miniaturization of electronics, medical MRI scanning and the development of solar cells, to name just a few burgeoning technologies. Quantum mechanics is one of the most reliable theories in all of science: its predictions of how light and matter interact match experimental measurements to the eighth decimal place.

But the question of how to interpret the theory — what it tells us about the physical universe—was never resolved by founders such as Niels Bohr, Werner Heisenberg and Erwin Schrödinger. Famously, Einstein himself was unhappy about how quantum theory leaves so much to chance: it pronounces only on the relative probabilities of how the world is arranged, not on how things fundamentally are. Most physicists now accept something like Bohr and Heisenberg’s Copenhagen interpretation: there is no essential reality beyond the quantum description, nothing more fundamental and definite than probabilities. Bohr coined the notion of “complementarity” to express this need to relinquish the expectation of a deeper reality beneath the equations. If you measure a quantum object, you might find it in a particular state. But it makes no sense to ask if it was in that state before you looked. All that can be said is that it had a particular probability of being so. It’s not that you don’t “know,” but rather that the question has no physical meaning.

Einstein attacked this idea in a thought experiment in which two quantum particles were arranged to have interdependent states, whereby if one were aligned in one direction, say, then the other had to be aligned in the opposite direction. Suppose these particles are allowed to move many light years apart, and then you measure the state of one of them. Quantum theory insists that this instantly determines the state of the other. Again, it’s not that you simply don’t know until you measure. It is that the state of the particles is literally undecided until then. But this implies that the effect of the measurement is transmitted instantly, and therefore faster than light, across cosmic distances to the other particle. Surely that’s absurd, Einstein argued. But it isn’t. Experiments have now established beyond doubt that this instantaneous action at a distance, called entanglement, is real—that’s just how quantum mechanics is.
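The same-axis correlation Einstein worried about can be caricatured in a few lines of code. This is a toy sketch, not quantum mechanics: a local rule like this one reproduces the perfect anticorrelation, and the genuinely quantum behaviour only shows up when the two particles are measured along different axes, which is what the Bell-test experiments mentioned above exploit.

```python
import random

# Toy model of Einstein's paired particles: each outcome is random,
# but the two members of a pair always give opposite results when
# measured along the same axis (as for a spin-singlet pair).
def measure_pair():
    a = random.choice([+1, -1])   # outcome at the first detector
    return a, -a                  # the distant partner is always opposite

results = [measure_pair() for _ in range(10_000)]
print(all(a == -b for a, b in results))   # → True: perfect anticorrelation
```

The quantum claim is stronger than this toy can show: the value of `a` is not merely unknown but undetermined until the measurement happens, however far apart the pair has travelled.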

This is not an abstruse oddity. Quantum entanglement is being exploited in quantum cryptography, where a message is encoded in entangled quantum particles so that it is impossible to intercept and read the message without the tampering being detected. Entanglement is also being used in quantum computing, where the ability of quantum particles to exist in many states at once allows huge numbers of calculations to be conducted simultaneously, greatly accelerating the solution of certain mathematical problems. Although these technologies are still in early development, already there are signs of commercial interest. Earlier this year the Canadian company D-Wave Systems announced the first sale of a quantum computer to Lockheed Martin, while fibre-optic-based quantum cryptography was used (admittedly more for publicity than for extra security) to transmit ballot information in the 2007 Swiss parliamentary elections. “Discussions of relations between information and physical reality are now of interest not just because of foundational motivation but because such questions can have practical implications,” says Wojciech Zurek, a quantum theorist at the Los Alamos National Laboratory in New Mexico, US.

The quantum renaissance hinges mostly on experimental innovations. Until the 1970s, experiments on quantum fundamentals relied mostly on indirect inference. But now it’s possible to make and probe individual quantum objects with great precision. Many technological advances have contributed to this, among them the advent of laser light composed of photons of identical, precise energy, the ability to make measurements with immense precision in time, space and mass, methods to hold individual atoms in electrical and magnetic traps (the subject of the 1997 Nobel prize in physics), and the manipulation of light with fibre optics (motivated by developments in optical telecommunications). These same techniques have made quantum information technology, such as quantum cryptography and computing, viable.

Even if you accept the paradoxical aspects of quantum theory and just use the maths, the fundamental questions won’t go away. For example, if the act of measurement turns probabilities into certainties, how exactly does it do that? Physicists have long spoken of measurements “collapsing the wavefunction,” which expresses how the smeared-out, wave-like mathematical entity encoding all possible quantum states (the wavefunction) becomes focused into a particular place or state. But this was seen largely as metaphor. The collapse had to be imposed by fiat, since it didn’t feature in the mathematical theory. Many physicists, such as Roger Penrose of Oxford University, now believe that this collapse is a real physical event, rather like the radioactive decay of an atom. If so, it requires an ingredient that lies outside current quantum theory. Penrose argues that the missing element is gravity, and that we’d understand wavefunction collapse if only we could marry quantum theory to general relativity, one of the major lacunae in contemporary physics.

Physicist Dirk Bouwmeester of the University of California at Santa Barbara and his coworkers hope to test that idea by placing tiny mirrors in quantum ‘superposition’ states, in which they are in several places at once, and then watching their wavefunction collapse into a single location, triggered by a ‘measurement’ in which photons are reflected from them. Ignacio Cirac and Oriol Romero-Isart at the Max Planck Institute for Quantum Optics in Garching, Germany, recently outlined an experimental method for placing nano-scale objects containing thousands or millions of atoms into superposition states, using light to trap and probe them, which would allow tests of such wavefunction-collapse theories.

Wavefunction collapse is part of the reason why the world doesn’t follow quantum rules all the way up. If it did, those rules wouldn’t seem counter-intuitive at all. It’s only because we’re used to our coffee cups being on our desk or in the dishwasher, but not both at once, that it seems so unreasonable when photons or electrons don’t behave that way. At some scale, the quantum-ness of the microscopic world gives way to classical, Newtonian physics. Why? The generally accepted answer is a process called decoherence: crudely speaking, interactions of a quantum entity with its teeming environment act rather like a measurement, collapsing superpositions into a well-defined state. In this view, large objects obey classical physics not because of their size as such but because they contain more particles and thus experience more interactions, and so decohere almost instantly.
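The scaling argument can be sketched numerically. The per-particle rate below is an arbitrary illustrative number, not a measured quantity: the point is only that if each particle’s environmental interaction independently damps the superposition, the surviving coherence falls off exponentially with particle number.

```python
import math

gamma = 1e3   # illustrative per-particle decoherence rate, 1/s (not measured)
t = 1e-3      # observation time, s

def coherence(n_particles):
    # Fraction of the superposition surviving after time t, assuming each
    # particle is independently "measured" by its environment.
    return math.exp(-n_particles * gamma * t)

for n in (1, 10, 100):
    print(f"{n:>4} particles: coherence {coherence(n):.2e}")
```

Even with this modest rate, a single particle keeps about a third of its coherence after a millisecond, while a hundred-particle object has lost essentially all of it, which is why superpositions are seen in photons and atoms but never in coffee cups.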

But that doesn’t fully resolve the issue—as shown by Schrödinger’s famous cat. In his thought experiment, Erwin Schrödinger imagined a cat that is poisoned, or not, depending on the outcome of a single quantum event, all of which is concealed inside a box. Since the outcome of the event is undetermined until observation collapses the wavefunction, quantum theory seemed to insist that, until the box is opened, the cat would be both alive and dead. Physicists used to evade that absurdity by insisting that somehow the bigness of the cat would bring about decoherence even without any observation, so that it would be either alive or dead but not both.

Yet one can imagine suppressing decoherence by creating a Schrödinger cat experiment that is well isolated from its surroundings. Then what? Ask old-school “shut up and calculate” physicists if the cat can be simultaneously alive and dead, and they are likely to assert that this will still be censored somehow or other. But younger physicists may well answer “why not?”

Perhaps we can simply do the experiment. The size of a cat makes it still nigh impossible to suppress decoherence, but a microscopic “cat” is more amenable to isolation. Cirac and Romero-Isart have proposed an experiment in which the cat is replaced by a virus, held in a light trap and coaxed by laser light into a quantum superposition of states. They say it might even work for tiny aquatic animals called tardigrades or water bears, which, unlike viruses, are unambiguously living or dead. It’s not obvious how to set up an experiment like Schrödinger’s, but simply placing a living creature in two places at once would be mind-boggling enough.

For whatever reason, the fact is that everyday objects are always in a single state and we can make measurements on them without altering that state: we have never sighted a Schrödinger cat. Physicists Anthony Leggett, a Nobel laureate at the University of Illinois, and Anupam Garg of Northwestern University, also in Illinois, call these conditions macrorealism. But is our classical world truly macrorealistic, or does it just look that way? Leggett and Garg showed in theory how to distinguish a macrorealistic world from one that isn’t. Such tests are even tougher to conduct than those on wavefunction collapse, says Romero-Isart, but he thinks that his proposed experiment on nano-objects could make a start.

Zurek, meanwhile, has developed a theory of how a fundamentally quantum world can look classical without really being macrorealistic. Whereas measuring a quantum system will alter it, classical systems can be probed without changing them: fifty people can read this text without thereby altering the words. But in Zurek’s scheme, this may be true of quantum systems too if they can leave many imprints on their environment (which is actually what we observe). Each observer comes along and sees an imprint, and because each is the same, they all agree on what properties the system has. But only certain quantum states have this ability to create many identical imprints—in a sense these robust states are thus “selected” in a quasi-Darwinian way, and so out of all the possible quantum attributes of the system, these are the ones we ascribe to the object. It’s as though a ripe apple can create lots of redness imprints, which enable us to agree that it is red, while also possessing other quantum attributes that can’t be assigned a definite value in this way.
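Zurek’s redundancy argument can be caricatured in a few lines (a toy sketch with made-up names; real models of this kind track fragments of an environment entangled with the system): a robust pointer state stamps many identical copies of itself onto the environment, and independent observers who each read a different copy agree without ever touching the system.

```python
import random

pointer_state = "red"   # the robust, environment-selected attribute

# The system imprints redundant copies of its pointer state onto many
# environment fragments (e.g. photons scattered off a ripe apple).
environment = [pointer_state for _ in range(1000)]

# Fifty observers each sample a different fragment; none disturbs the
# system itself, and all agree on what they saw.
observations = [random.choice(environment) for _ in range(50)]
print(set(observations))   # → {'red'}: consensus without contact
```

Fragile superposition states, by contrast, cannot broadcast identical copies of themselves, so no observer consensus ever forms about them.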

Zurek says this means that the environment of an object is not an “innocent bystander who simply reports what has happened”, but rather, “an accomplice in the ‘crime’, selecting and transforming some of the fragile quantum states into robust, objectively existing classical states.” Ideas like this, however strange they might sound at first, can be made consistent with current quantum theory precisely because that theory leaves so much unanswered. But perhaps not for much longer.

1 comment:

Paolo said...

Dear friends, I hope you will join the following Egocreanet initiative. Paolo, 25 Oct 2011
OPEN DEBATE ON: advances in nano-scale science, focusing on entanglement in nano- and bio-molecular systems.

<< Quantum entanglement becomes the structure that connects, as the concept of Yin/Yang did in the ancient philosophy of the Tao >>

The goal of EGOCREANET is to open a debate for sharing innovations and hypotheses about the future of nano-science, and to provide a forum where researchers and industry managers can exchange and discuss the latest and future designs and analyses for nano-scale science.

Specifically, we are interested in establishing the feasibility and characteristics of quantum entanglement at the nano-scale, opening up a network through trans-disciplinary applications of the concept of entanglement, extended to cultural and scientific design, modelling in art and science, and visual simulation of nano-scale innovation.
A particular focus of this open debate will be issues in quantum entanglement theory concerning molecular entanglement at the nano-scale, along with other relevant topics including quantum coherence in delocalized bond structures and quantum entanglement in nanoscale dot systems. Paolo Manzelli, 24 Oct 2011, Firenze

--

One of the most peculiar properties of quantum physics is entanglement, which makes it possible to build special shared quantum states based on a delocalized electron field.
Entanglement allows the degree of localization of quantum wave-particles to change: even when a pair of electrons is spatially separated, quantum entanglement generates a new, partially localized conjugate system of bonded atoms.
Entanglement can grow in strength and in the coherence of the simultaneity properties of mixed delocalized states, or decay into localized single states under the influence of noise (temperature and other interference) that disentangles, over time, the stability of the simultaneous coexistence of entangled states.

To investigate the properties of entanglement effects, I think good experimental information could be obtained by looking at laser-induced emission spectra measured on femtosecond timescales (femtochemistry).
Such fast methods of investigation can reveal exactly what happens at the molecular level during a chemical reaction. In future, ultrafast molecular dynamics may therefore allow a deeper understanding of the entangled hybridization of electron fields (in some ways similar to metallic bonding) caused by the superposition of electron orbitals that creates molecular bonds at the nano-scale.

Paolo Manzelli <>