Fuelling the sceptics?
[Here’s the long version of my Lab Report column in the June issue of Prospect.]
Has the Intergovernmental Panel on Climate Change (IPCC) got its numbers wrong? That’s what a recent paper in Nature seems to be saying, to the delight of climate sceptics everywhere. Whereas the IPCC report forecast a rise in global mean temperature of around 0.2–0.3 °C per decade, researchers in Germany, using a sophisticated computer model of the climate, found that temperatures are likely to remain flat until around 2015, as they have done since about 1998.
The sceptics will argue that this shows we don’t have much of a clue about climate, and all the dire forecasts from models count for nothing. That, however, would be like saying that, because we took a wrong turn on the road from London to Edinburgh, we have no idea where Edinburgh is.
There is actually nothing in the new result that conflicts with the IPCC’s position, which has always acknowledged that the poorly understood natural variability of the climate system will superimpose its imprint on the global warming trend. The new findings are an attempt to forecast short-term, decade-scale temperature changes, rather than the longer-term changes usually considered by climate modellers. Over a decade or two, temperatures are much more susceptible to natural variations (which boosted warming in the late 1990s). The current cooling influence is due to weakening of heat-bearing ocean currents such as the Gulf Stream. This may persist for about a decade, but then the warming will resume, and by 2030 it should reconnect with the IPCC predictions.
No reason, then, to throw out all the climate models. Indeed, no climate scientist seems particularly surprised or perturbed by the findings, which simply flesh out the short-term picture. To climate sceptics, this is mere dissembling and backpedalling, although their own determined efforts to undermine the IPCC’s predicted trend never identified anything of the sort. It’s a curious logic that uses climate modelling to discredit climate models.
Science policy maven Roger Pielke Jr of the University of Colorado, a sceptic in the proper sense, has justifiably asked how the models can be validated when they seem to predict one thing one moment and the opposite the next. But the answer is that natural variability compromises any short-term prediction – a frustrating fact of life that demands great care in framing the right questions and drawing conclusions. Certainly, we should remain wary of claims that a few hot summers, or a few more hurricanes, prove that global catastrophe is imminent, just as we should be of suggestions that a few relatively cool years rubbish the IPCC’s forecasts.
****
We must be wary too of making global warming a deus ex machina that explains every environmental trend, especially if it’s bad. Droughts and storms worsened by climate change may be playing a small part in the global food crisis, but a far bigger problem comes from attempts to mitigate such change with biofuels. In 2006, a fifth of US maize was grown to make ethanol, not food. With the US providing 70 percent of global maize exports, grain prices worldwide were sure to feel the effect.
The rush towards an ill-considered biofuels market is a depressing reminder that the vicissitudes of climate science are as nothing compared with the lack of foresight in the economic system that rides on it. The passion for biofuels in the Bush administration is driven more by a wish for national energy independence than by concerns about climate, while farmers embrace them largely for profit motives. But science has played a part in condoning this shaky vision. It’s a little late now for some scientists to be explaining that of course the benefits will only be felt with next-generation biofuels, which will make much more efficient use of plant matter.
Biofuels aren’t the only reason for soaring food prices. Population rise is playing its ever baleful part, as is the increase in oil prices, which makes food costlier to produce and transport. This is a less simple equation than is often implied, because growing crops for energy introduces a new economic coupling between oil and food: escalating oil prices make it advantageous for farmers to switch to energy crops. The consequences of this new dependency in two vast sectors of the economy do not yet seem to have been carefully evaluated.
*****
Chinese geoscience blotted its record when its bold claim to be able to predict earthquakes was followed in 1976 by the devastating and unforeseen Tangshan quake, which killed several hundred thousand people. The death toll of the recent magnitude 7.9 quake in Sichuan province may ultimately approach a tenth of that. The basic laws of mechanics seem to preclude accurate forecasting by monitoring geological faults, no matter how closely, because the size and timing of slippage are inherently unpredictable from information available at the source. But researchers based at Edinburgh think the necessary information could be spread over a far wider area around the fault zone, in the pattern and evolution of stress in the surrounding rock. They propose using small human-made seismic waves to map out these stresses, and claim this could enable the time, size and perhaps location of earthquakes to be forecast days or even months in advance.
They say a stress-monitoring site consisting of three boreholes 1–2 km deep, fitted out with seismic sources and sensors, could have forecast an event as big as the Sichuan quake even from Beijing, 1,000 km away. A monitoring station’s likely price tag of several million dollars dwindles beside the cost of the damage inflicted by quakes this severe. Despite the notorious record of earthquake prediction, this one looks worth a shot.
Friday, May 30, 2008
Thursday, May 29, 2008
Why we should love logarithms
[More Musement from Nature News.]
The tendency of 'uneducated' people to compress the number scale for big numbers is actually an admirable way of measuring the world.
I'd never have guessed, in the days when I used to paw through my grubby book of logarithms in maths classes, that I'd come to look back with fondness on these tables of cryptic decimals. In those days the most basic of electronic calculators was the size of a laptop and about as expensive in real terms, so books of logarithms were the quickest way to multiply large numbers (see 'What are logarithms?').
Of course, logarithms remain central to any advanced study of mathematics. But as they are no longer a practical arithmetic tool, one can’t now assume general familiarity with them. And so, countless popular science books contain potted guides to using exponential notation and interpreting logarithmic axes on graphs. Why do they need to do this? Because logarithmic scaling is the natural system for magnitudes of quantities in the sciences.
That's why a new claim that logarithmic mapping of numbers is the natural, intuitive scheme for humans rings true. Stanislas Dehaene of the Federative Institute of Research in Gif-sur-Yvette, France, and his co-workers report in Science [1] that both adults and children of an Amazonian tribe called the Mundurucu, who have had almost no exposure to the linear counting scale of the industrialized world, judge magnitudes on a logarithmic basis.
Down the line
The researchers presented their subjects with a computerized task in which they were asked to locate on a line the points that best signified the number of various stimuli (dots, sequences of tones or spoken words) in the ranges from 1 to 10 and from 10 to 100. One end of the line corresponded to 1, say, and the other to 10; where on this line should 6 sit? The results showed that the Amazonians had a clear tendency to apportion the divisions logarithmically, which means that successive numbers get progressively closer together as they get bigger.
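To make the two mappings concrete, here is a minimal sketch (my own illustration in Python, not part of the study) of where a number such as 6 would sit on a line running from 1 to 10 under linear and under logarithmic spacing:

    import math

    def linear_position(n, lo=1, hi=10):
        """Fraction of the way along the line under linear spacing."""
        return (n - lo) / (hi - lo)

    def log_position(n, lo=1, hi=10):
        """Fraction of the way along the line under logarithmic spacing."""
        return (math.log(n) - math.log(lo)) / (math.log(hi) - math.log(lo))

    print(round(linear_position(6), 2))  # 0.56 -- just past the middle
    print(round(log_position(6), 2))     # 0.78 -- well towards the '10' end

    # Under the logarithmic mapping, the midpoint of the 1-to-10 line is
    # occupied not by 5 or 6 but by the geometric mean of the endpoints:
    print(round(math.sqrt(1 * 10), 1))   # ~3.2

On the logarithmic line, in other words, the small numbers are spread out and the large ones crowd together towards the far end – just the pattern the Mundurucu produced.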
The same behaviour has previously been seen in young children from the West [2]. But adults instead use a linear scaling, in which the distance between each number is the same irrespective of their magnitude. This could be because adults are taught that this is how numbers are 'really' distributed, or it could be that some intrinsic aspect of brain development creates a greater predisposition to linear scaling as we mature. To distinguish between these possibilities, Dehaene and his colleagues tested an adult population that was 'uncontaminated' by schooling.
The implication of their finding, they say, is that "the concept of a linear number line seems to be a cultural invention that fails to develop in the absence of formal education". Had this study been done in the nineteenth century (and aside from the computerized methodology, it could just as easily have been), we can feel pretty sure that it would have been accompanied by some patronizing comment about how 'primitive' people had failed to acquire the requisite mathematical sophistication.
Today's anthropology is more enlightened, and indeed Dehaene and his team have previously revealed the impressive subtlety of Mundurucu concepts of number and space, despite the culture having no words for numbers greater than five [3,4].
Everything in perspective
But in any event, the proper conclusion is surely that it is our own intuitive sense of number that is somehow awry. The notion of a decreasing distance between numbers makes perfect sense once we think about that difference in proportionate terms: 1,001 is clearly more akin to 1,000 than 2 is to 1. We can even quantify those degrees of likeness. If we space numbers along a scale such that the distances between them reflect the proportion by which they increment the previous number, then the distance of a number n from 1 is given by the harmonic series, the sum of 1 + 1/2 + 1/3 + 1/4 and so on up to 1/n. This distance is roughly proportional to the logarithm of n.
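A quick numerical check of that last claim (my own illustration, not from the column): the harmonic partial sums do indeed track the natural logarithm, differing from it only by a roughly constant offset.

    import math

    def harmonic(n):
        """Partial sum 1 + 1/2 + 1/3 + ... + 1/n."""
        return sum(1.0 / k for k in range(1, n + 1))

    for n in (10, 100, 1000, 10000):
        print(n, round(harmonic(n), 3), round(math.log(n), 3))
    # The gap between the two columns settles towards the
    # Euler-Mascheroni constant (about 0.577), so H(n) grows like log(n).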
This, it is often said, is why life seems to speed up as we get older: each passing year is a smaller proportion of our whole life. In perceptual terms, the clock ticks with an ever faster beat.
But wait, you might say – surely 'real' quantities are linear? A kilometre is a kilometre whether we have travelled 1 or 100 already, and it takes us the same time to traverse at constant speed. Well, yes and no. Many creatures execute random walks, or the curious punctuated random walks called Lévy flights [watch out for next week's issue of Nature on this...], in which covering each successive fixed increment of distance takes an ever longer time. Besides, we can usually assume that an animal capable of covering 100 kilometres could manage 101, but not necessarily that one capable of 1 kilometre could manage 2 kilometres (try it with a young child).
Yet the logarithmic character of nature goes deeper than that. For scientists, just about all magnitude scales are most meaningful when expressed logarithmically, a fact memorably demonstrated in the vision of the Universe depicted in the celebrated 1977 film Powers of Ten. The femtometre (10⁻¹⁵ metres) is the scale of the atomic nucleus, the nanometre (10⁻⁹ metres) that of molecular systems, the micrometre (10⁻⁶ metres) the scale of the living cell, and so on. Cosmological eras demand logarithmically fine time divisions as we move back towards the Big Bang. The immense variation in the size of earthquakes is tamed by the logarithmic magnitude scale, in which an increase of one unit of magnitude corresponds to a tenfold increase in the amplitude of ground motion (and roughly a thirtyfold increase in the energy released). The same is true of the decibel scale for sound intensity, and the pH scale of acidity.
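As a small worked example of how those scales tame huge ranges (the values here are illustrative ones of my own, not from the article): the decibel and pH scales are simply base-10 logarithms of a ratio or a concentration.

    import math

    def decibels(intensity_ratio):
        """Sound level: ten times the base-10 log of the intensity ratio."""
        return 10 * math.log10(intensity_ratio)

    def pH(hydrogen_ion_concentration):
        """Acidity: minus the base-10 log of [H+] in moles per litre."""
        return -math.log10(hydrogen_ion_concentration)

    print(decibels(1e12))  # 120.0 -- a trillion-fold intensity range compressed into 120
    print(pH(1e-7))        # 7.0  -- neutral water
    print(pH(1e-2))        # 2.0  -- roughly the acidity of lemon juice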
Law of the land
Indeed, the relationship between earthquake magnitude and frequency is one of the best known of the ubiquitous natural power laws, in which some quantity is proportional to the nth power of another. These relationships are best depicted with logarithmic scaling: on logarithmic axes, they look linear. Power laws have been discovered not only for landslides and solar flares but also for many aspects of human culture: word-use frequency, say, or the size–frequency relationships of wars, towns and website connections.
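Here is a minimal sketch (again my own, with arbitrary made-up constants) of why a power law looks linear on logarithmic axes: taking logs of y = c·xⁿ gives log y = log c + n·log x, a straight line whose slope is the exponent.

    import math

    # A made-up power law y = 3 * x**(-2); the constants are arbitrary.
    xs = [1, 2, 5, 10, 50, 100]
    ys = [3.0 * x ** -2 for x in xs]

    # On log-log axes the data lie on a straight line, so the slope
    # between any two points recovers the exponent.
    slope = (math.log(ys[-1]) - math.log(ys[0])) / (math.log(xs[-1]) - math.log(xs[0]))
    print(round(slope, 2))  # -2.0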
All these things could be understood much more readily if we could continue to use the logarithmic number scaling with which we are apparently endowed intuitively. So why do we devote so much energy to replacing it with linear scaling?
Linearity betrays an obsession with precision. That might incline us to expect an origin in engineering or surveying, but it isn't clear that this is true. The greater the number of units in a structure's dimensions, the less small errors matter: a temple intended to be 100 cubits long could probably accommodate 101 cubits, and in fact often did, because early surveying methods were far from perfect. And in any event, such dimensions were often determined by relative proportions rather than by absolute numbers. It seems more plausible that a linear mentality stemmed from trade: if you're paying for 100 sheep, you don't want to be given 99, and the seller wants to make sure he doesn't give you 101. And if traders want to balance their books, these exact numbers matter.
Yet logarithmic thinking doesn't go away entirely. Dehaene and his colleagues show that it remains even in Westerners for very large numbers, and it is implicit in the skill of numerical approximation. Counting that uses a base system, such as our base 10, also demands a kind of logarithmic terminology: you need a new word or symbol only for successive powers of ten (as found both in ancient Egypt and China).
All in all, there are good arguments why an ability to think logarithmically is valuable. Does a conventional education perhaps suppress it more than it should?
References
1. Dehaene, S., Izard, V., Spelke, E. & Pica, P. Science 320, 1217–1220 (2008).
2. Booth, J. L. & Siegler, R. S. Dev. Psychol. 42, 189–201 (2006).
3. Pica, P., Lemer, C., Izard, V. & Dehaene, S. Science 306, 499–503 (2004).
4. Dehaene, S., Izard, V., Pica, P. & Spelke, E. Science 311, 381–384 (2006).
Making Hay
It’s a rotten cliché of a title, but you can’t avoid the irony when the scene was pretty much like that above – I don’t know if this picture was taken this year or some previous year, but it sums up the situation at the Hay Literary Festival on Sunday and Monday this week. (Looking outside, things may not have got much better.) Strangely, this didn’t matter. One great thing about Hay is that it takes place in a complex of tents connected by covered walkways, so you can stroll around and stay dry even if it is pelting down. It was cold enough to see your breath at midday, but no one seemed to be complaining, and the crowds kept coming. These Guardian readers are hardier folk than you might think.
The mud was another matter. The authors’ car park was a lake, so there was no avoiding a trek through a reconstruction of the Somme. We came with a silver-grey car; now it has the colour and texture, if not quite the smell, of a farmyard. Note to self: take wellies next time.
Still, fun for all. Good food, no plague of cheap commercialism, and a fantastic setting even if you can’t see it through the driving rain. I was there with my family, so had limited opportunity to catch talks, but I was impressed by David King’s passionate determination to get beyond the rearranging-deckchairs approach to climate change. As David is in a position to make things happen, this is good news for us all. In particular, he advocates a massive increase in funding of research to make solar energy affordable enough to be a routine aspect of new building, enforced by legislation. David sometimes gets flak from environmental groups for not going far enough (not to mention his endorsement of nuclear power), but he is far more outspoken and committed than many, if not most, of the scientists in such positions of influence. I must admit that when David was appointed Chief Scientific Adviser, I took the simplistic view that he was a nice chap and a good scientist but that heterogeneous catalysis seemed an awfully long way from policy advising. Eat those words, lad – he’s shown exactly how a scientific adviser can make a real difference on important issues.
He was there primarily to talk about his book on climate change with Gabrielle Walker, The Hot Topic. But he and I, along with Steve Jones, sat on a discussion panel for Radio 4’s Material World, recorded in front of the Hay audience for broadcast on Thursday 29 May (listen out for the rain pelting on canvas). We talked about what happens to science when it intersects with broader culture – yes, vague huh? While my co-panellists are old hands at finding incisive responses to whatever is thrown at them, I sometimes felt that I was mouthing platitudes. No doubt the capable MW team will have edited it down to a model of eloquence.
But my main excuse for lounging in the artists’ luxurious Green Room (i.e. it had heating and a more or less dry carpet) was that I was talking about my book on Chartres cathedral, Universe of Stone – as it turned out, to an improbably large audience in the cinema tent. Perhaps they thought that ‘Universe of Stone’ was a blockbuster movie. (And maybe it should be – good title, no?) Anyway, they were very kind, and made me want to go back.
Wednesday, May 14, 2008
Me, me, me
Well, what do you expect in a blog, after all? Here, then, is some blatant advertising of forthcoming events at which I’m speaking or participating. I’ve been trying to rein back on this kind of thing, but seem to have acquired a cluster of bookings in the near future.
You’ve already missed the seminar on new materials at King’s College, London, on 12 May – a very interesting collection of people assembled by Mark Miodownik, whose Materials Library is a very fabulous thing to behold. I hope to post my talk on my website soon.
On 27 May I am talking about my book Universe of Stone at the Hay Festival. And it seems that I’ll be participating in a discussion about science books for Radio 4’s Material World, which will be recorded at Hay the previous day. The other panellists are Sir David King and Steve Jones.
On 28 May I will be chairing a public discussion on synthetic biology at the Science Museum’s Dana Centre, called ‘Making Life’.
In June I have what is looking ominously like a residency at the Royal Institution, starting with a discussion of my novel The Sun and Moon Corrupted at the newly launched RI book club on 9 June. I will be coming along to face the critics after the discussion – do come and be gentle with me.
The following Monday, 16 June, I will be attempting to persuade the RI audience why human spaceflight is seldom of any scientific worth and is best left to private entrepreneurs (see here). The counter-argument will be ably put by Kevin Fong of UCL.
Then on 10 July I’ll be talking at the RI about my book Elegant Solutions, published by the Royal Society of Chemistry, which looked at the issue of beauty in chemistry experiments (details here). This is an event organized to mark the book’s receipt of the 2007 Dingle Prize for communicating the history of science and technology from the British Society for the History of Science.
Then I’m having a holiday.
Friday, May 09, 2008
Mixed messages
Last night I drove into the traffic hell that is Canary Wharf to see a play by the marvellous Shifting Sands in a rather nice little theatre marooned on the Isle of Dogs. (I would advertise it, but this is the end of their run. I’m collaborating with Shifting Sands on a production early next year based on the life of Paracelsus and funded by the Wellcome Trust.)
I’ve never ventured into E14 by car before, for good reason. Here is a traffic system that radiates sneering contempt, confronting you with a morass of flyovers, tunnels and slip roads labelled only with signs saying things like ‘Canary Wharf Depot A’. One wrong turn and I was in a tunnel that offered no escape until it spat me back out at the Rotherhithe Tunnel.
My point is this. You emerge, dazed, anxious and disorientated, from some underground cavern to find yourself on a busy roundabout, and in the middle is the structure shown above. Traffic lights point in all directions; some beckon in green, some prohibit in red, some tantalize in amber.
‘You can’t be serious’, I muttered, and several moments passed before I twigged that indeed this is not a serious device for directing traffic, but, can you credit it, an art installation. At least, I could only assume so, but I decided to quiz the bar attendant at the theatre. She professed ignorance of the roads, but a local bloke sitting at the bar chipped in. The installation cost £140,000, he said, and he was living in a flat that overlooked it when it was first installed. ‘I’ve never seen so many accidents’, he said.
The stupidity of it is so breathtaking that it is almost a work of conceptual art itself. I try to picture the local council meeting at which the design was proposed. ‘Yes, I want to use real traffic lights. By utterly confusing and bewildering the driver, you see, it will make a comment on the complexity of everyday life.’ ‘Well, that sounds like a brilliant idea. Here’s 140 grand.’
This is all merely a slender excuse to advertise this nice preprint by Stefan Lämmer and Dirk Helbing on self-organized traffic lights that replace central control with local autonomy. A self-organized approach could in principle let traffic flow considerably more efficiently, as I discussed some time ago in a Nature article.
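For anyone curious what ‘self-organized’ means here, below is a deliberately crude toy sketch (in Python) of the general idea: each junction acts only on its own local queue lengths, with no central controller. It is emphatically not Lämmer and Helbing’s actual algorithm, which works with measured traffic flows and anticipation of incoming platoons; the rule and the numbers are invented purely for illustration.

    import random

    def choose_green(queues, green, green_time, min_green=3):
        """Toy local rule for one junction: once a minimum green period has
        elapsed, give the green to whichever approach has the longest queue.
        Decisions use only the junction's own queue lengths."""
        busiest = max(queues, key=queues.get)
        if busiest != green and green_time >= min_green:
            return busiest, 0
        return green, green_time + 1

    queues = {"north-south": 0, "east-west": 0}
    green, green_time = "north-south", 0
    for t in range(20):
        for road in queues:                 # cars arrive at random
            if random.random() < 0.4:
                queues[road] += 1
        if queues[green] > 0:               # the green road discharges one car
            queues[green] -= 1
        green, green_time = choose_green(queues, green, green_time)
        print(t, green, queues)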
But I fear that E14 is beyond any redemption that self-organization can offer.
Sunday, May 04, 2008
When worlds collide
[This is the pre-edited version of my latest Muse article for Nature News.]
Worries about an apocalypse unleashed by particle accelerators are not new. They have their source in old myths, which are hard to dispel.
When physicists dismiss as a myth the charge that the Large Hadron Collider (LHC) will trigger a process that might destroy the world, they are closer to the truth than they realise. In common parlance a myth has come to denote a story that isn’t true, but in fact it is a story that is psychologically true. A real myth is not a false story but an archetypal one. And the archetype for this current bout of scare stories is obvious: the Faust myth, in which a hubristic individual unleashes forces he or she cannot control.
The LHC is due to be switched on in July at CERN, the European centre for particle physics near Geneva. But some fear that the energies released by colliding subatomic particles will produce miniature black holes that will engulf the world. Walter Wagner, a resident of Hawaii, has even filed a lawsuit to prevent the experiments.
As high-energy physicist Joseph Kapusta points out in a new preprint [1], such dire forebodings have accompanied the advent of other particle accelerators in the past, including the Bevalac in California and the Relativistic Heavy Ion Collider (RHIC) on Long Island. In the latter case, newspapers seized on the notion of an apocalyptic event – the UK’s Sunday Times ran a story under the headline ‘The final experiment?’
The Bevalac, an amalgamation of two existing accelerators at the Lawrence Berkeley Laboratory, was created in the 1970s to investigate extremely dense states of nuclear matter – stuff made from the compact nuclei of atoms. In 1974 two physicists proposed that there might be a hitherto unseen and ultra-dense form of nuclear matter more stable than ordinary nuclei, which they rather alarmingly dubbed ‘abnormal’. If so, there was a small chance that even the tiniest lump of it could keep growing indefinitely by cannibalizing ordinary matter. Calculations implied that a speck of this pathological form of abnormal nuclear matter made in the Bevalac would sink to the centre of the Earth and then expand to swallow the planet, all in a matter of seconds.
No one, Kapusta says, expected that abnormal nuclear matter, if it existed at all, would really have this voracious character – but neither did anyone know enough about the properties of nuclear matter to rule it out absolutely. According to physicists Subal Das Gupta and Gary Westfall, who wrote about the motivations behind the Bevalac to mark its termination in 1993[2], “Meetings were held behind closed doors to decide whether or not the proposed experiments should be aborted.”
The RHIC, at the Brookhaven National Laboratory, began operating in 1999 primarily to create another predicted superdense form of matter called a quark-gluon plasma. This is thought to have been what the universe consisted of less than a millisecond after the Big Bang. Following an article about it in Scientific American, worries were raised about whether matter this dense might collapse into a mini-black hole that would again then grow to engulf the planet.
Physicist Frank Wilczek dismissed this idea as “incredible”, but at the same time he raised a new possibility: the creation of another super-dense, stable form of matter called a strangelet that could again be regarded as a potential Earth-eater. In a scholarly article published in 2000, Wilczek and several coworkers analysed all the putative risks posed by the RHIC, and concluded that none posed the slightest real danger[3].
But isn’t this just what we’d expect high-energy physicists to say? That objection was raised by Richard Posner, a distinguished professor of law at the University of Chicago[4]. He argued that scientific experiments that pose potentially catastrophic risks, however small, should be reviewed in advance by an independent board. He recognized that current legal training provides lawyers and judges with no expertise for making assessments about scientific phenomena “of which ordinary people have no intuitive sense whatsoever”, and asserted that such preparation is therefore urgently needed.
It seems reasonable to insist that, at the very least, such research projects commission their own expert assessment of risks, as is routinely done in some areas of bioscience. The LHC has followed the example of the RHIC in doing just that. A committee has examined the dangers posed by strangelets, black holes, and the effects of possible ‘hidden’ extra dimensions of space. In 2003 it declared that “we find no basis for any conceivable threat” from the accelerator’s high-energy collisions [5].
These scare stories are not unique to particle physics. When in the late 1960s Soviet scientists mistakenly believed they had found a new, waxy form of pure water called polywater, one scientist suggested that it could ‘seed’ the conversion of all the world’s oceans to gloop – a scenario memorably anticipated in Kurt Vonnegut’s 1963 novel Cat’s Cradle, where the culprit was instead a new form of ice. Superviruses leaked from research laboratories are a favourite source of rumour and fear – this was one suggestion for the origin of AIDS. And nanotechnology was accused of hastening doomsday thanks to one commentator’s fanciful vision of grey goo: replicating nanoscale robots that disassemble the world for raw materials from which to make copies of themselves.
In part, the appeal of these stories is simply the frisson of an eschatological tale, the currency of endless disaster movies. But it’s also significant that these are human-made apocalypses, triggered by the heedless quest for knowledge about the universe.
This is the template that became attached to the Faust legend. Initially a folk tale about an itinerant charlatan with roots that stretch back to the Bible, the Faust story was later blended with the myth of Prometheus, who paid a harsh price for daring to challenge the gods because of his thirst for knowledge. Goethe’s Faust embodied this fusion, and Mary Shelley popularized it in Frankenstein, which she explicitly subtitled ‘The Modern Prometheus’. Roslynn Haynes, a professor of English literature, has explored how the Faust myth shaped a common view of the scientist as an arrogant seeker of dangerous and powerful knowledge[6].
All this sometimes leaves scientists weary of the distrust they engender, but Kapusta points out that it is occasionally even worse than that. When Das Gupta and Westfall wrote about the concerns over abnormal nuclear matter raised by the Bevalac, they were placed on the FBI’s ‘at risk’ list of individuals thought to be potential targets of the Unabomber. Between 1978 and 1995 this former mathematician, living in a forest shack in Montana, sent bombs through the US mail to scientists and engineers he considered to be working on harmful technologies. A lawsuit by a disgruntled Hawaiian seems mild by comparison.
And yet… might there be anything in these fears? During the Manhattan Project that developed the atomic bomb, several of the scientists involved were a little unsure, until they saw the mushroom cloud of the Trinity test, whether the explosion might not trigger runaway combustion of the Earth’s atmosphere.
The RHIC and LHC have taken far less on trust. But of course the mere acknowledgement of the risks that is implied by commissioning studies to quantify them, along with the fact that it is rarely possible to assign any such risk a strictly zero probability, must itself fuel public concern. And it is well known to risk-perception experts that we lack the ability to make a proper rating of very rare but very extreme disasters, even to the simple extent that we feel mistakenly safer in our cars than in an aeroplane.
That’s why Kapusta’s conclusion that “physicists must learn how to communicate their exciting discoveries to nonscientists honestly and seriously”, commendable though it is, can never provide a complete answer. We need to recognize that these fears have a mythic dimension that rational argument can never wholly dispel.
References
1. Kapusta, J. I. Preprint http://xxx.arxiv.org/abs/0804.4806
2. Das Gupta, S. & Westfall, G. D. Physics Today 46 (May 1993), 34–40.
3. Jaffe, R. L. et al., Rev. Mod. Phys. 72, 1125-1140 (2000).
4. Posner, R. A. Catastrophe: Risk and Response (Oxford University Press, Oxford, 2004).
5. Blaizot, J.-P. et al., ‘Study of potentially dangerous events during heavy-ion collisions at the LHC: Report of the LHC Safety Study Group’, CERN Report 2003-001.
6. Haynes R.D., From Faust to Strangelove: Representations of the Scientist in Western Literature (Johns Hopkins University Press, Baltimore & London, 1994).
Friday, May 02, 2008
Talking about Chartres
There’s a gallery of images and a vodcast for my book Universe of Stone now up on the Bodley Head site: you can find it here.