Tuesday, January 14, 2014

The future of physics

Research Funding Insight recently asked me to write a piece on the “future of physics”, to accompany a critique of string theory and its offshoots by Jim Baggott (see below). I wanted to take the opportunity to explain that, whatever the shortcomings of string theory might be, they most certainly do not leave physics as a whole in crisis. It is doing very nicely, because it is much, much broader than both string theory in particular and what gets called “fundamental physics” in general. So here it is. The article first appeared in Funding Insight on 7 January 2014, and I’m reproducing it here with kind permission of Research Professional. For more articles like this (including Jim Baggott’s), visit www.researchprofessional.com.


Is physics at risk of disappearing up its own foundations? To read some of the recent criticisms of work on string theory, which seeks a fundamental explanation of all known forces and particles, you might think so. After about three decades of work, the theory is no closer to a solution – or rather, the number of possible solutions has mushroomed astronomically, while none of them is testable and they all rest on a base of untried speculations.

But while scepticism about the prospects for this alleged Theory of Everything may be justified, it would be mistaken to imagine that the difficulties are gnawing away at the roots of physics. They are the concern of only a tiny fraction of physicists, while many others consider them esoteric at best and perhaps totally irrelevant.

Don’t imagine either that the entire physics community has been on tenterhooks to see what the Large Hadron Collider at CERN in Geneva will come up with, or whether, now that it seems to have found the Higgs boson, the particle accelerator will open up a new chapter in fundamental physics that takes in such mysterious or speculative concepts as dark matter and supersymmetry (a hitherto unseen connection between different classes of particles).

Strings and the LHC are the usual media face of physics: what most non-physicists think physicists do. This sometimes frustrates other physicists intensely. “High-energy physics experiments are over-rated, and are not as significant as they were decades ago”, says one, based in the US. “Now it is tiny increments in knowledge, at excessive costs – yet these things dominate the science news.”

Given the jamboree that has surrounded the work at the LHC, especially after the award of the 2013 Nobel prize in physics to Peter Higgs (with François Englert, who also proposed the particle now known by Higgs’ name), it is tempting to dismiss this as sour grapes. But there’s more to it than the resentment of one group of researchers at seeing the limelight grabbed by another. For the perception that the centre of gravity of physics lies with fundamental particles and string theory reflects a deep misunderstanding about the whole nature of the discipline. The danger is that this misunderstanding might move beyond the general public and media and start to infect funders, policy-makers and educationalists.

The fact is that physics is not a quest for isolated explanations of this or that phenomenon (and string theory, for all its vaunted status as a Theory of Everything, is equally parochial in what it might ‘explain’). Physics attempts to discover how common principles apply to many different aspects of the physical world. It would be foolish to suppose that we know what all these principles are, but we certainly know some of them. In a recent article in Physics World, Peter Main and Charles Tracy from the Institute of Physics’ education section made a decent stab at compiling a list of what constitutes “physics thinking”. It included reductionism, causality, universality, mathematical modelling, conservation, equilibrium, the idea that differences cause change, dissipation and irreversibility, and symmetry and broken symmetry. There’s no space to explain all of these, but one might sum up many of them in the idea that things change for identifiable reasons; often those reasons are the same in different kinds of system; we can develop simplified maths-based descriptions of them; and when change occurs, some things (like total energy) stay the same before and after.

Many of these notions are older than is sometimes supposed. Particle physicists, for example, have been known to imply that the concept of symmetry-breaking – whereby a system with less symmetry emerges spontaneously from one with more – was devised in the 1950s and 60s to answer some problems in their field. The truth is that this principle was already inherent in the work of the Dutch scientist Johannes Diderik van der Waals in 1873. Van der Waals wasn’t thinking about particle physics, which didn’t even exist then; he was exploring the way that matter interconverts between liquid and gas states, in what is called a phase transition. Phase transitions and symmetry breaking have since proved to be fundamental to all areas of physics, ranging from the cosmological theory of the Big Bang to superconductivity. Looked at one way, the Higgs boson is the product of just another phase transition, and indeed some of the ideas found in Higgs’ theory were anticipated by earlier work on superconductivity, the low-temperature transition that leads to resistance-free electrical conduction in some materials.
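For readers who want the specifics: van der Waals’ 1873 equation of state (written here for one mole of gas, with a and b constants characteristic of the material) already contains the liquid–gas transition –

```latex
\left( p + \frac{a}{V^2} \right)\left( V - b \right) = RT ,
\qquad
T_c = \frac{8a}{27Rb}
```

Below the critical temperature $T_c$, the isotherms of this equation develop a fold, signalling the coexistence of two phases – the same mean-field structure that Landau’s theory of phase transitions, and later the Higgs mechanism, would generalize.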

Or take quantum theory, which began to acquire its modern form when in 1926 Erwin Schrödinger wrote down a ‘wavefunction’ to describe the behaviour of quantum particles. Schrödinger didn’t just pluck his equation from nowhere: he adapted it from the centuries-old discipline of wave mechanics, which describes what ordinary waves do.
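To make the kinship concrete, here are the two equations side by side (in modern notation): the classical wave equation for a disturbance $u$ travelling at speed $c$, and Schrödinger’s equation for the wavefunction $\psi$ of a particle of mass $m$ in a potential $V$ –

```latex
\frac{\partial^2 u}{\partial t^2} = c^2 \nabla^2 u
\qquad\longrightarrow\qquad
i\hbar \frac{\partial \psi}{\partial t} = -\frac{\hbar^2}{2m}\nabla^2 \psi + V\psi
```

Both belong to the same mathematical family of wave equations; the quantum version differs chiefly in being first-order in time and complex-valued, which is why $\psi$ must be interpreted statistically rather than as a physical ripple.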

This is not to say that physicists are always stealing old ideas without attribution. Quite the opposite: it is precisely because they were so thoroughly immersed in the traditions and ideas of classical physics, going back to Isaac Newton and Galileo, that early-twentieth-century physicists such as Einstein, Max Planck and Niels Bohr were able to instigate the revolutionary new ideas of quantum theory and relativity. The best physicists of more recent times, such as Richard Feynman and the Soviet physicist Lev Landau, have likewise had a deep appreciation of the connections between old and new ideas. Feynman’s so-called path-integral formulation of quantum electrodynamics, which supplied a quantum theory of how light interacts with matter, drew on the eighteenth-century classical mechanics of Joseph Louis Lagrange. It is partly because they point out these connections that Feynman’s famous Lectures on Physics are so revered; the links are also to be found, in more forbidding Soviet style, in the equally influential textbooks by Landau and his colleague Evgeny Lifshitz.

The truly profound aspect of central concepts like those proposed by Main and Tracy is that they don’t recognize any distinctions of time and space. They apply at all scales: to collisions of atoms and bumper cars, to nuclear reactions and solar cells. It seems absurd to imagine that the burst of ultrafast cosmic expansion called inflation, thought to be responsible for the large-scale structure of the universe we see today, has any connection with the condensation of water on the windowpane – but it does. Equally, that condensation is likely to find analogies in the appearance of dense knots and jams in moving traffic. Looked at this way, what is traditionally called fundamental physics – theories of the subatomic nature of matter – is no more fundamental than is the physics of sand or sound. It merely applies the same concepts at smaller scales.

This, then, is one important message for physics education: don’t teach it as a series of subdisciplines with their own unique set of concepts. Or if you must parcel it up in this way, keep the connections at the forefront. It’s also a message for students: always consider how the subject you’re working on finds analogues elsewhere.

All this remains true even while – one might even say especially as – physics ventures into applied fields. It’s possible (honestly) to see something almost sublime in the way quantum theory describes the behaviour of electrons in solids such as the semiconductors of transistors. On one level it’s obvious that it should: quantum theory describes very small things, and electrons are very small. But the beauty is that, under the auspices of quantum rules, electrons can get marshalled into states that mirror those in quite different and more exotic systems. They can acquire ‘orbits’ like those in atoms, so that blobs of semiconductor can act as artificial atoms. They can get bunched into pairs or other groups that travel in unison, giving us superconductivity, itself analogous to the weird frictionless superfluid behaviour of liquid helium. One of the most interesting features of the atom-thick carbon sheets called graphene is not that they will provide new kinds of touch-screen (we have those already) but that their electrons, partly by virtue of being trapped in two dimensions, can collectively behave like particles called Dirac fermions, which have no mass and move at the speed of light. The electrons don’t actually do this – they just ‘look’ like particles that do. In such ways, graphene enables experiments that seem to come from the nether reaches of particle physics, all in a flake of pencil lead on a desktop.

As graphene promises to show, these exotic properties can feed back into real applications. Other electronic ‘quasiparticles’ called excitons (a pairing of an electron with a gap or ‘hole’ in a pervasive electron ‘sea’) are responsible for the light emission from polymers that is bringing flexible plastics to screens and display technology. In one recent example, an exotic form of quantum-mechanical behaviour called Bose-Einstein condensation, which attracted Nobel prizes after being seen in clouds of electromagnetically trapped ultracold gases, has been achieved at room temperature in the electronic quasiparticles of an easily handled plastic material. That raises the prospect of this once arcane phenomenon being harnessed cheaply to make new kinds of laser and other light-based devices.

There is a clear corollary to all this for allocating research priorities in physics: you never know. However odd or recondite a phenomenon or the system required to produce it, you never know where else it might crop up and turn out to have uses. That of course is the cliché attached to the laser: the embodiment of a quirky idea of Einstein’s in 1917, it has come to be almost as central to information technology as the transistor.

Does this mean that physics, by virtue of its universality, can in fact have no priorities, but must let a thousand flowers bloom? Probably the truth is somewhere in between: it makes sense, in any field of science, to put some emphasis on areas that look particularly technologically promising or conceptually enriching, as well as curbing areas that seem to have run their course. But it would be a mistake to imagine that physics, any more than Darwinian evolution, has any direction – that somehow the objective is to work down from the largest scales towards the smaller and more ‘fundamental’.

Another reason to doubt the overly reductive approach is supplied by Michael Berry, a distinguished physicist at the University of Bristol whose influential work has ranged from classical optics and mechanics to quantum chaos. “There are different kinds of fundamentality”, says Berry. “As well as high-energy and cosmology, there are the asymptotic regimes of existing theories, where new phenomena emerge, or lurk as borderland phenomena between the theories.” Berry has pointed out that in an ‘asymptotic regime’ in which some parameter in a theory is shrunk to precisely zero (as opposed to being merely made very small), the outcomes of the theory can change discontinuously: you might find some entirely new, emergent behaviour.
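A toy illustration of Berry’s point (a standard textbook example, not one of his): every member of the family of functions below is perfectly smooth, yet as the parameter $\epsilon$ shrinks to zero the limit is a discontinuous step –

```latex
f_\epsilon(x) = \tanh\!\left(\frac{x}{\epsilon}\right)
\;\xrightarrow[\;\epsilon \to 0^+\;]{}\;
\operatorname{sgn}(x) =
\begin{cases} +1 & x > 0 \\ 0 & x = 0 \\ -1 & x < 0 \end{cases}
```

No amount of inspecting a single smooth member of the family reveals the sharp jump that appears in the limit; in physics, the analogous jumps are where qualitatively new behaviour lives.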

As a result, these ‘singular limits’ can lead to new physics, making it not just unwise but impossible to try to derive the behaviour of a system at one level from that at a more ‘fundamental’ level. That’s a reason to be careful about Main and Tracy’s emphasis on reductionism. Some problems can be solved by breaking them down into simpler ones, but sometimes that will lose the very behaviour you’re interested in. “If you don’t think emergence is important too, you won't get far as a condensed matter physicist”, says physicist Richard Jones, Pro-Vice-Chancellor for Research and Innovation at the University of Sheffield.

It’s important to recognize too that the biggest mysteries, however alluring they seem, may not be the most pressing, nor indeed the most intellectually demanding or enriching. The search for dark matter is certainly exciting, well motivated, and worth pursuing. But at present it is only rather tenuously linked to the mainstream of ideas in physics – we have so few clues, either observationally or theoretically, about how to look or what we hope to find, that it is largely a matter of blind empiricism. It is usually wise not to spend too much of your time stumbling around in the dark.

With all this in mind, here are a few suggestions for where what we might call ‘small physics’ might usefully devote some of its energies in the coming years:

- quantum information and quantum optics: even if quantum computers aren’t going to be a universal game-changer any time soon, the implications of pursuing quantum theory as an information science are vast, ranging from new secure communications technologies to deeper insights into the principles that really underpin the quantum world.

- the physics of biology: this can mean many things, from understanding how the mechanics of cells determine their fate (stem cells sometimes select their eventual tissue type from how the individual cells are pulled and tugged) to the question of whether phase transitions underpin cancer, brain activity and even natural selection. This one needs handling with care: physicists are likely to go badly astray unless they talk to biologists.

- materials physics: from new strong materials to energy generation and conversion, it is essential to develop an understanding of how materials systems behave over a wide range of size scales (and that’s not necessarily a problem to tackle from the bottom up). Such knowhow is likely to be central to a scientific basis for sustainability.

- new optical technologies: you’ve probably heard about invisibility cloaks, and while some of those claims need to be taken with a pinch of salt, the general idea that light can be moulded, manipulated and directed by controlling the microstructure of materials (such as so-called photonic band-gap materials and metamaterials) is already leading to new possibilities in display technologies, telecommunications and computing.

- electronics: this one kind of goes without saying, perhaps, but the breadth and depth of the topic is phenomenal, going way beyond ways to make transistors ever smaller. There is a wealth of weird and wonderful behaviour in new and unusual materials, including spintronics (electronics that uses the quantum spins of electrons), molecular and polymer electronics, and unusual electronic behaviour on the surfaces of insulators (check out “topological insulators”).

None of this is to deny the value of Big Physics: new accelerators, telescopes, satellites and particle detectors will surely continue to reveal profound insights into our universe. But they are only part of a bigger picture.

Most of all, it isn’t a matter of training physicists to be experts in any of these (or other) areas. Rather, they need to know how to adapt the powerful tools of physics to whatever problem is at hand. The common notion (or is it just in physics?) that a physicist can turn his or her hand to anything is a bit too complacent for comfort, but it is nonetheless true that a ‘physics way of thinking’ is a potential asset for any science.
