Fuelling the sceptics?
[Here’s the long version of my Lab Report column in the June issue of Prospect.]
Has the Intergovernmental Panel on Climate Change (IPCC) got its numbers wrong? That’s what a recent paper in Nature seems to be saying, to the delight of climate sceptics everywhere. Whereas the IPCC report forecast a rise in global mean temperature of around 0.2–0.3 °C per decade, researchers in Germany found, using a sophisticated computer model of the climate, that temperatures are likely to remain flat until around 2015, as they have done since about 1998.
The sceptics will argue that this shows we don’t have much of a clue about climate, and all the dire forecasts from models count for nothing. That, however, would be like saying that, because we took a wrong turn on the road from London to Edinburgh, we have no idea where Edinburgh is.
There is actually nothing in the new result that conflicts with the IPCC’s position, which has always acknowledged that the poorly understood natural variability of the climate system will superimpose its imprint on the global warming trend. The new findings are an attempt to forecast short-term, decade-scale temperature changes, rather than the longer-term changes usually considered by climate modellers. Over a decade or two, temperatures are much more susceptible to natural variations (which boosted warming in the late 1990s). The current cooling influence is due to weakening of heat-bearing ocean currents such as the Gulf Stream. This may persist for about a decade, but then the warming will resume, and by 2030 it should reconnect with the IPCC predictions.
No reason, then, to throw out all the climate models. Indeed, no climate scientist seems particularly surprised or perturbed by the findings, which simply flesh out the short-term picture. To climate sceptics this smacks of dissembling and backpedalling, although their own efforts to undermine the IPCC’s predicted trend never turned up anything of the sort. It’s a curious logic, in any case, that uses climate modelling to discredit climate models.
Science policy maven Roger Pielke Jr of the University of Colorado, a sceptic in the proper sense, has justifiably asked how the models can be validated when they seem to predict one thing one moment and the opposite the next. But the answer is that natural variability compromises any short-term predictions – a frustrating fact of life that demands great care in framing the right questions and drawing conclusions. Certainly, we should remain wary of claims that a few hot summers, or a few more hurricanes, prove that global catastrophe is imminent, just as we should be of suggestions that a few relatively cool years rubbish the IPCC’s forecasts.
We must be wary too of making global warming a deus ex machina that explains every environmental trend, especially if it’s bad. Droughts and storms worsened by climate change may be playing a small part in the global food crisis, but a far bigger problem comes from attempts to mitigate such change with biofuels. In 2006, a fifth of US maize was grown to make ethanol, not food. With the US providing 70 percent of global maize exports, grain prices worldwide were sure to feel the effect.
The rush towards an ill-considered biofuels market is a depressing reminder that the vicissitudes of climate science are as nothing compared with the lack of foresight in the economic system that rides on it. The passion for biofuels in the Bush administration is driven more by a wish for national energy independence than by concerns about climate, while farmers embrace them largely for profit motives. But science has played a part in condoning this shaky vision. It’s a little late now for some scientists to be explaining that of course the benefits will only be felt with next-generation biofuels, which will make much more efficient use of plant matter.
Biofuels aren’t the only reason for soaring food prices. Population growth is playing its ever-baleful part, as is the increase in oil prices, which makes food costlier to produce and transport. This is a less simple equation than is often implied, because growing crops for energy introduces a new economic coupling between oil and food: escalating oil prices make it advantageous for farmers to switch to energy crops. The consequences of this new dependency between two vast sectors of the economy do not yet seem to have been carefully evaluated.
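The coupling can be seen in a toy calculation (every figure here is hypothetical and purely illustrative, not drawn from the column above): if the revenue from selling maize as ethanol feedstock tracks the oil price while food-market revenue does not, there is some oil price above which switching to energy crops pays, and beyond that point food and fuel prices move together.

```python
# Toy sketch of the oil-food coupling. All numbers are made up for
# illustration; they are not estimates of real prices or yields.

def best_market(oil_price_per_barrel, food_price_per_tonne=150.0,
                ethanol_value_factor=2.0):
    """Compare revenue per tonne of maize sold as food versus as
    ethanol feedstock, and return (chosen_market, revenue).

    ethanol_value_factor is a hypothetical conversion: dollars of
    feedstock revenue per tonne of maize, per dollar of oil price.
    """
    fuel_revenue = ethanol_value_factor * oil_price_per_barrel
    if fuel_revenue > food_price_per_tonne:
        return "fuel", fuel_revenue
    return "food", food_price_per_tonne

# Below the (hypothetical) breakeven of $75/barrel, maize goes to food;
# above it, maize flows to fuel, tightening food supply so that food
# prices are dragged upward by oil prices.
for oil in (50, 70, 90):
    market, revenue = best_market(oil)
    print(oil, market, revenue)
```

The point of the sketch is only the threshold behaviour: once the fuel market outbids the food market, the two prices become linked, which is the new dependency the paragraph above describes.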
Chinese geoscience blotted its record when its bold claim to be able to predict earthquakes was followed in 1976 by the devastating and unforeseen Tangshan quake, which killed several hundred thousand people. The death toll of the recent magnitude-7.9 quake in Sichuan province may ultimately approach a tenth of that. The basic laws of mechanics seem to preclude accurate forecasting by monitoring geological faults, no matter how closely, because the size and timing of slippage are inherently unpredictable from information available at the source. But researchers based at Edinburgh think the necessary information could be spread over a far wider area around the fault zone, in the pattern and evolution of stress in the surrounding rock. They propose using small human-made seismic waves to map out these stresses, and claim this could enable the time, size and perhaps location of earthquakes to be forecast days or even months in advance.
They say a stress-monitoring site consisting of three boreholes 1–2 km deep, fitted out with seismic sources and sensors, could have forecast an event as big as the Sichuan quake even from Beijing, 1,000 km away. A monitoring station’s likely price tag of several million dollars dwindles before the cost of damage inflicted by quakes this severe. Despite the notorious record of earthquake prediction, this one looks worth a shot.