With the fallout from the MMR scare still with us, this programme on BBC Radio 4 is a timely reminder of the issues. “Science betrayed” indeed, but by whom? The full story is societal as much as it is biomedical. Anyway, listen while you still can.
Friday, March 25, 2011
I have reviewed the National Theatre’s production of Frankenstein in the latest issue of Nature. Worth seeing (though if you haven’t got a ticket already, you don’t stand much chance), but I was slightly disappointed in the end, having seen some glowing reviews. There’s another perspective here.
Mary Shelley’s Frankenstein has been endlessly adapted and reinterpreted since it was first published, anonymously, in 1818. Aside from the iconic screen version by James Whale in 1931, there have been sequels, parodies (Mel Brooks’ Young Frankenstein, The Rocky Horror Picture Show), and postmodern interpolations (Brian Aldiss’s Frankenstein Unbound). Victor Frankenstein has become the archetypal mad scientist, unleashing powers he cannot control – in one recent remake, he became a female biologist experimenting on organ regeneration with stem cells. The ‘Franken’ label is attached to every new technology that appears to intervene in life, from genetic modification of crops to Craig Venter’s ‘synthetic’ microbe.
This reinvention is no recent phenomenon. Shelley’s book was little known until the first stage adaptations began in the 1820s, in which Frankenstein’s creature was already transformed into a mute, shambling brute based on the stock theatrical character of the Wild Man. This personification continued in the first film adaptation in 1910, simply called Frankenstein.
Some might lament how the original novel has been distorted and vulgarized. But literary critic Chris Baldick has a wiser perspective:
The truth of a myth… is not to be established by authorizing its earliest versions, but by considering all its versions… That series of adaptations, allusions, accretions, analogues, parodies and plain misreadings which follows up on Mary Shelley’s novel is not just a supplementary component of the myth; it is the myth.
After all, there isn’t even a definitive version of Shelley’s story. She made small but significant changes in the third edition (1831), in particular emphasizing the Faustian themes of presumption and retribution on which the early stage versions insisted.
Besides, critics still dispute what Shelley’s message was meant to be – probably she was not fully conscious of all the themes herself. Far from offering a simplistic critique of scientific hubris, the story might instead echo Shelley’s troubled family life. Her mother, the feminist and political radical Mary Wollstonecraft, died from complications after Mary’s birth, and her father William Godwin all but disowned her after she eloped to Europe with Percy Shelley in 1814. She lost her first child, a premature baby girl, the following year, subsequently describing a dream in which the infant was reanimated. There is ample reason to believe Percy Shelley’s statement of the central moral of Frankenstein: ‘Treat a person ill, and he becomes wicked’.
If so, Nick Dear’s adaptation of the story for the National Theatre in London, directed by Danny Boyle of Trainspotting and Slumdog Millionaire fame, has returned to the essence of the tale. For it focuses on the plight of the creature, whose lone and awkward ‘birth’ begins the play. We see how this mumbling wretch, spurned as a hideous thing by Victor, is reviled by society until finding refuge with the blind peasant De Lacey. The kindly old man teaches the creature how to speak and read using Milton’s Paradise Lost, the story of Satan’s Promethean challenge to heaven.
Eventually De Lacey’s son and daughter-in-law return from the fields and drive out the creature in horror, whereupon he burns them in their cottage. These scenes are the moral core of Shelley’s novel, and in placing them so early Dear signals that this is very much the monster’s show.
In fact, perhaps too much. For while the creature is the most fully realised, most sympathetic and inventive incarnation I have seen, Victor Frankenstein is left with little to do but recoil from him and neglect all his other duties, marital, filial and moral. It is very clear from the outset who is the real monster.
In this production the two lead actors – Benedict Cumberbatch and Jonny Lee Miller – alternate the roles of Victor and his creature. This Doppelgänger theme is not a new idea: in the stage adaptation by Peggy Webling that formed the basis of Whale’s movie, the creature appeared dressed like Victor (there renamed Henry), who foreshadows the later confusion of creator and creature by saying ‘I call him by my own name – he is Frankenstein.’ It motivates Dear’s decision to leave the duo locked in mutual torment at the end: a vision more true to their relationship than that of the novel itself.
The scientific elements of the tale are skated over. Mary Shelley provided just enough hints for the informed reader to make the connection with Luigi Galvani’s recent work on electrophysiology; Dear has Frankenstein mention galvanism and electrochemistry (somewhat anachronistically), but that is as far as it goes. There is no serious attempt, therefore, to make the play a comment on the ‘Promethean ambitions’ of modern science (as Pope John Paul II called them in 2002) – a relief not because modern science is unblemished but because the alchemical trope of a solitary experimenter exceeding the bounds of God and nature is no longer the relevant vehicle for a critique.
The staging of this production is spectacular, and intelligent choices were made in the structure (if not always in the dialogue). Miller was extraordinary as the creature on the night I saw it; by all accounts Cumberbatch is equally so. Whether Dear adds anything new to the legend – as Whale and even Mel Brooks did – is debatable. But it is well to be reminded that the novel may be read not so much as a Gothic tale of monstrosity and presumption but as a comment on the consequences of how we treat one another.
Wednesday, March 23, 2011
Here’s a little news story I wrote for Nature on the Abel Prize. This award presents a notoriously challenging subject for science reporters each year, because it is always the devil of a job to explain concisely what on earth the recipient has done to deserve the award. I can’t deny that the same challenge applied here, but in spades, because Milnor has done so much. But it was a challenge I enjoyed. Given the choice, I’d personally have kept in the edited version the fact that holomorphic dynamics involves numbers in the complex plane, because its omission is the kind of thing experts will sniffily point out. But I can understand the fear that the reader will be exhausted by then. Ah, mathematics – what a wonderful, strange game it is.
John Milnor wins the ‘Nobel of maths’ for his manifold works.
Awarding Albert Einstein a Nobel prize for his research on the photoelectric effect looks in retrospect like a somewhat arbitrary choice from among the galaxy of his contributions to all of physics.
In granting the 2011 Abel Prize in mathematics to John Milnor of Stony Brook University in New York, the committee of the Norwegian Academy of Science and Letters has wisely abandoned any such attempt to single out a particular achievement. The citation states merely that Milnor has made ‘pioneering discoveries in topology, geometry and algebra’: in effect a recognition that he has contributed to modern maths across the board.
In fact, Milnor’s work goes further: it also touches on dynamical systems, game theory, group theory and number theory. In awarding this equivalent of a Nobel prize, worth around $1m, the committee states that “All of Milnor’s works display marks of great research: profound insights, vivid imagination, elements of surprise, and supreme beauty.”
His breadth is unusual, says Professor Ragni Piene of the University of Oslo, the chair of the Abel Prize committee. “Though some of the fields he has worked in are related, he really has had to learn and develop new tools and new theory.”
Milnor “says he is mainly a problem solver”, adds Piene. “But in the solving process, in order to understand the problem deeply he ends up creating new theories and opening up new fields.”
Among the most surprising of Milnor’s discoveries was the existence of so-called exotic spheres, multidimensional objects with strange topological properties. In 1956 Milnor was studying the topological transformations of smooth-contoured high-dimensional shapes – that is, shapes with no sharp edges. A so-called continuous topological transformation converts one object smoothly – as though remoulding soft clay – into another, without any tears in the fabric.
He discovered that in seven dimensions there exist smooth objects that can be converted into the 7D equivalent of spheres only via intermediates that do have sharp kinks. In other words, the only way to get from one of these smooth objects to another is by making them not smooth. Kinks and corners in a surface are said to make it non-differentiable, which means that its curvature at the kinks has no well-defined value.
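A one-dimensional picture of non-differentiability may help (my illustration, nothing to do with Milnor’s constructions): the function

\[
f(x) = |x|
\]

is perfectly continuous, but at the kink $x = 0$ the slope jumps abruptly from $-1$ to $+1$, so the derivative $f'(0)$ is undefined. Smoothness demands that no such jump occurs anywhere on the surface.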
These counter-intuitive exotic spheres can exist in other dimensions too. With the French mathematician Michel Kervaire, Milnor calculated that the seven-dimensional sphere has precisely 28 distinct smooth structures. But there seems at first glance little rhyme or reason to the trend in other dimensions: there is just one smooth structure – the ordinary sphere, with no exotic companions – in 1, 2, 3, 5 and 6 dimensions, but there are 992 in 11 dimensions, 1 in 12 dimensions, 16,256 in 15 and 2 in 16. No one has yet figured out how many there are in four dimensions. This work spawned an entire new field of mathematics, called differential topology.
Some of Milnor’s other achievements are recognizably related to such topological conundrums, such as his work on the relationships between different triangulations (representations as networks of triangles) of mathematical surfaces called manifolds. Topology was also central to some of Milnor’s earliest work in 1950 on the curvature of knots.
But his work on group theory is quite different. Group theory was partly invented by the nineteenth-century Norwegian mathematician Niels Henrik Abel, after whom the award is named. In one standard formulation, a group can be represented as all non-equivalent combinations (‘words’) of a set of symbols. Milnor and the American mathematician Joseph Wolf clarified how the number of distinct words grows as their length increases, for a wide class of groups called solvable groups.
More recently, Milnor, now 80, has been working in the field of holomorphic dynamics, which concerns the trajectories generated in the complex plane – the plane of real and imaginary numbers – by iterating equations: the branch of maths that led to the discovery of fractal patterns such as the Mandelbrot and Julia sets.
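For a flavour of the kind of iteration involved, here is a minimal Python sketch (my own illustration, not anything from Milnor’s work) of the famous map z → z² + c that generates the Mandelbrot set: a point c belongs to the set if the iterated values never escape to infinity.

```python
def escapes(c, max_iter=100, radius=2.0):
    """Iterate z -> z*z + c starting from z = 0 in the complex plane.

    Returns the step at which |z| first exceeds the escape radius,
    or None if the orbit stays bounded for max_iter steps (in which
    case c is taken to lie in the Mandelbrot set)."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > radius:
            return n
    return None

print(escapes(0j))       # None: the orbit 0, 0, 0, ... never escapes
print(escapes(1 + 0j))   # 2: the orbit 0, 1, 2, 5, ... escapes quickly
```

Colouring each point of the plane by its escape step is precisely how the familiar fractal pictures are drawn.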
Milnor has already won just about every other key prize in mathematics, including the Fields medal (1962) and the Wolf prize (1989). But beyond his skills as a researcher, Milnor has been widely praised as a communicator. His books “have become legendary for their high quality”, according to mathematician Timothy Gowers of the University of Cambridge.
Friday, March 18, 2011
Here’s my latest news story for Nature. Eduardo Miranda is working very much at the experimental edge of electronic music – what I’ve heard has an intriguing ethereal quality which grows on you (well, it did on me).
A pianist plays a series of notes, and the woman echoes them on a computerized music system. Then she plays a simple improvised melody over a looped backing track. It doesn’t sound much of a musical challenge – except that the woman, a stroke victim, is paralysed except for eye, facial and slight head movements. She is making the music purely by thinking.
This is a trial of a computer-music system that interfaces directly with the user’s brain, via electrodes on the scalp that pick up the tiny electrical impulses of neurons. The device, developed by composer and computer-music specialist Eduardo Miranda of the University of Plymouth in England and computer scientists at the University of Essex, should eventually enable people with severe physical disabilities, caused for example by brain or spinal-cord injuries, to make music for recreation or therapeutic purposes.
“This is surely an interesting avenue, and might be very useful for patients”, says Rainer Goebel, a neuroscientist at the University of Maastricht in the Netherlands who works on brain-computer interfacing.
Quite aside from the pleasure that making music offers, its value in therapy – for example, its capacity to awaken atrophied mental and physical functions in neurodegenerative disease – is well attested. But people who have almost no muscle movement at all have generally been excluded from such benefits and can enjoy music only through passive listening.
The development of brain-computer interfaces (BCIs) that can enable users to control computer functions by mind alone offers new possibilities for such people. In general these interfaces rely on the user’s ability to learn how to self-induce particular mental states that can be detected by brain-scanning technologies.
Miranda and colleagues have used one of the oldest of these techniques: electroencephalography (EEG), in which electrodes on the scalp pick up faint neural signals. The EEG signal can be processed quickly, allowing fast response times. The instrumentation is cheap and portable in comparison to brain-scanning techniques such as magnetic resonance imaging (MRI) and positron-emission tomography (PET), and operating it requires no expert knowledge.
Whereas previous efforts on BCIs have tended to focus on simple tasks such as moving cursors or other screen icons, Miranda’s team sought to achieve something much more complex: to enable the user to play and compose music.
Miranda says he became aware of the then-emerging field of BCIs over a decade ago while researching how to make music using brainwaves. “When I realized the potential of a musical BCI for the well-being of severely disabled people”, he says, “I couldn’t leave the idea alone. Now I can’t separate this work from my activities as a composer – they are very integrated.”
The trick is to teach the user how to associate particular brain signals with specific tasks by presenting a repeating stimulus – auditory, visual or tactile, say – and getting the user to focus on it. This elicits a distinctive, detectable pattern in the EEG signal. Miranda and colleagues show several flashing ‘buttons’ on a computer screen, each one triggering a musical event. The users ‘push’ a button just by directing their attention to it.
For example, a button might be used to generate a melody from a pre-selected set of notes. The intensity of the control signal – how ‘hard’ the button is pressed, if you like – can be altered by the user by varying the intensity of attention, and the result is fed back to them visually as a change in the button’s size. In this way, any one of several notes can be selected by mentally altering the intensity of ‘pressing’.
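By way of illustration only – this is my own sketch, with hypothetical names and thresholds, not the Plymouth team’s code – the intensity-to-note mapping can be thought of as dividing the control signal’s range into bands, one per available note:

```python
def select_note(intensity, notes=("C4", "E4", "G4", "C5")):
    """Map a normalized attention intensity in [0, 1] -- a stand-in for
    how 'hard' the flashing button is mentally pressed -- to one note
    from a pre-selected pool, by splitting [0, 1] into equal bands.

    The four-note pool and the equal-band scheme are illustrative
    assumptions, not details from the study."""
    if not 0.0 <= intensity <= 1.0:
        raise ValueError("intensity must lie in [0, 1]")
    # Pick the band; clamp so that intensity == 1.0 maps to the top note.
    band = min(int(intensity * len(notes)), len(notes) - 1)
    return notes[band]

print(select_note(0.1))   # 'C4': a light mental press
print(select_note(0.95))  # 'C5': a firm one
```

The visual feedback described above – the button growing as it is ‘pressed’ harder – then lets the user steer the intensity into the band for the note they want.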
With a little practice, this allows users to create a melody just as if they were selecting keys on a piano. And as with learning an instrument, say the researchers, “the more one practices the better one becomes.” They describe it in a forthcoming paper in the journal Music and Medicine [1].
The researchers trialled their system with a female patient at the Royal Hospital for Neuro-disability in London, who is suffering from locked-in syndrome, a form of almost total paralysis caused by brain lesions. During a two-hour session, she got the hang of the system and was eventually playing along with a backing track. She reported that “it was great to be in control again.”
Goebel points out that the patients here still need to be able to control their gaze, which people suffering from total locked-in syndrome cannot. In such partial cases, he says, “one can usually use gaze directly for controlling devices, instead of an EEG system”. But Miranda points out that eye-gazing alone does not permit variations in the intensity of the signal. “Eye gazing is comparable to a mouse or joystick”, he says. “Our system adds another dimension, which is the intensity of the choice. That’s crucial for our musical system.”
Miranda says that, while increasing the complexity of the musical tasks is not a priority, music therapists have suggested it would be better if the system was more like a musical instrument – for instance, with an interface that looks like a piano keyboard. He admits that it’s not easy to increase the number of buttons or keys beyond four, but is confident that “we will get there eventually”.
“The flashing thing does not need to be on a computer screen”, he adds. It could, for example, be a physical electronic keyboard with LEDs on the keys. “You could play it by staring at the keys”, he says.
1. Miranda, E. R., Magee, W. L., Wilson, J., Eaton, J. & Palaniappan, R. Music and Medicine (published online), doi:10.1177/1943862111399290.
Thursday, March 17, 2011
Attending a planning meeting for the forthcoming Elements event at the Wellcome Institute on 8 April prompts me to advertise it. I can confidently say with no false modesty that I am among the very least of the attractions (I’ll be speaking, briefly, about mercury and arsenic in pigments). The artists’ suppliers Cornelissen will be bringing beautiful jarloads of the stuff. Nick Lane will be talking about oxygen, Andy Meharg about arsenic, and Andrew Szydlo will be demonstrating how Cornelius Drebbel made his submarine. Andrea Sella will be presenting his mercury show. There will be an oxygen bar where you can inhale the stuff, and an iodine ‘wet play’ area. And you can apparently be cured of syphilis with an inhalation of mercury. Sort of. Andrea has arranged it with Hugh Aldersey-Williams, whose book on the elements has acted as a catalyst for the event. It will be mad and fun (and free!), like real chemistry should be if it were allowed. If you live in or near London, get there early!
Monday, March 14, 2011
I have an Opinion piece in Saturday’s Times prompted by research being pursued by the Newcastle embryology team. No point in giving the link, as it is subscription-only. But here is the piece before editing.
The idea of having three parents – a notion apparently raised by the latest developments in reproductive technology – seems ripe material both for stand-up routines and for eliciting tabloid postures of horror. Never mind that both would conveniently ignore the fact that some children already have three primary carers in parental roles – it is revealing that this work should be discussed in terms of ‘three parents’ at all.
The more careful reports of the research being considered at Newcastle University – which would create embryos with DNA that is not wholly from the mother and father – will stress that these are three genetic or biological parents. The third ‘parent’ is an egg donor. The donor egg will be stripped of its nucleus, where the chromosomes reside, which will be replaced with the nucleus of a normal IVF embryo, containing maternal and paternal genes. But the egg will retain a few donor genes – 37 to be precise – in energy-generating compartments outside the cell nucleus called mitochondria. The procedure is being considered to eliminate serious diseases caused by faulty mitochondria in the mother’s eggs.
As a result, all the genes that influence the child’s development would be those of the mother and father except for the handful of genes that operate ‘out of sight’ to drive the mitochondria. Strictly speaking this does make the egg donor a kind of genetic parent, but the better analogy is with transplant patients, who, rather than having a tiny bit of ‘foreign’ genetic material in every tissue, have entirely foreign genes in one particular tissue.
So any talk of a ‘third parent’ here plays up the alleged weirdness of the situation, not least by introducing an exciting whiff of sexual irregularity. And in making the status of parent reassuringly one of genetic entitlement rather than responsibility of care, it plays along with the prevailing notion of ‘genes’R’us’. Objections such as that raised by a spokesperson for the charity Life, that the work would raise questions as to who the real mother is, seem to place an extraordinary burden of identity on those 37 genes.
We are here reaping the harvest of modern genetics, especially the projects to read the chemical ‘code’ of human genomes. In their determination to sell this undoubtedly valuable enterprise, genetic scientists have all too often opted for the easy route of presenting the genome as the ‘book of life’, or as one leading scientist put it, ‘the set of instructions to make a human being’.
Such claims have encouraged us to equate our being with our genes to an unsupportable degree. The fact is that genes may be silenced, ignored or modified by environmental factors encountered by the developing organism. And any ‘identical’ twin (a revealing term in itself) will tell you that personal identity is not the same as genetic endowment. But the myth of genetic determinism fails for deeper reasons even than these. As physiologist Denis Noble has elegantly argued, genes are ‘instructions’ in roughly the same way that Bach’s scores, free of dynamics and ornaments, are prescriptions for music that brings tears to the eyes. (Even that is too tight an analogy, unless the performance admits improvisation.) Genes work not because they specify everything but precisely because there is so much that they do not need to specify.
If we had a better understanding of genetics – I don’t mean the public, but scientists too – we would be less likely to indulge in a gene-based materialistic view of parenthood and identity, and to confuse our bodies with our genomes. The ‘yuk factor’ response to embryos with non-maternal mitochondrial genes is a form of genetic narcissism. After all, most of the cells in our bodies are non-human: they are symbiotic bacteria in our gut, busy performing a host of functions on which our well-being depends. In any case, our genomes are patchworks of genes that no one can meaningfully claim as ‘their own’. In genetic terms we are all Frankenstein’s creatures. Whether we have a single parent or three, we just have to hope they do a better job than Victor did.
Behind all of this, however, is the deeper current of attitudes to ‘unnatural’ interventions in procreation. Here we swim in the murky waters of myth. When scientists in 2009 announced in Nature that they had achieved these ‘mitochondrial transplants’ in monkeys, an editorial acknowledged that “[an] argument raised when such research has been attempted in the past is that such a three-parent union is ‘unnatural’.” One obvious rejoinder is to point to the infant mortality rate two centuries ago, when birth involved very little intervention and was therefore more ‘natural’. But the function of the word ‘unnatural’ here is not merely to point out that these things don’t happen in nature, but to enlist moral disapproval. The unnatural act is not just the opposite of the natural, but is one we are invited to deplore.
Even if we substitute instead the word ‘artificial’, the pejorative implication remains. This is an ancient prejudice. The distinctions and relative merits of ‘art’ (meaning artifice) and ‘nature’ were debated by Plato and Aristotle, and it was not until the seventeenth century that there was any serious challenge to the prevailing view that artificial objects cannot be equal to natural ones. Often this prejudice went beyond the assertion that the products of technology are inferior: there was a suggestion that technology is inherently perverting. The biologist J. B. S. Haldane put it this way in 1924: “If every physical and chemical invention is a blasphemy, every biological invention is a perversion. There is hardly one which, on first being brought to the notice of an observer… would not appear to him as indecent and unnatural.” Five decades later, IVF proved his point again. It is still opposed in the Catholic Catechism, which complains of “the domination of technology over the origin and destiny of the human person.” Far from enabling the birth of a longed-for child, for the church this reproductive technology creates an indelible stain on the ‘origin and destiny’ of any person in whose conception it is involved.
None of this is to deny that the work being contemplated at Newcastle needs careful consideration of the ethics as well as the safety. But presenting the issues in terms of a confusion of parenthood illustrates that we are trying to make sense of biomedical developments using moral and social contexts that they have already left behind. In an age of advanced prosthetics and transplantation, tissue engineering, and rapid genomic profiling, we need to escape from the tendency to shoehorn our uniqueness into a molecular structure and look for it instead in how we inhabit the world.
Thursday, March 10, 2011
I have reviewed for Nature the thriller Spiral by nanotech expert Paul McEuen. It is somewhat formulaic but great fun. One could quibble that the villains are East Asians, but Paul is pretty harsh (if, I suspect, scarily accurate) on the US military too.
One of the more humdrum obligations I have had to fulfil in the line of duty was to read Michael Crichton’s Prey, a thriller based on the premise of nanotechnological robot swarms run amok. An ingénue in this genre, I found myself comparing the characters’ psychological implausibility with the non-sequitur quirks of figures from myth and legend. But with guns.
Crichton, of course, made millions with his formula. Whether Spiral (Dial Press, New York, 2011) will do the same for Cornell physicist Paul McEuen remains to be seen (the movie rights are already sold), but it deserves to. It is more enjoyable, more palatable, and (as though this matters) boasts impeccable science rather than the half-digested fare that Crichton occasionally seemed to mistake for the real thing.
There’s nothing here, it should be said, that bucks the thriller formula. Indeed, Spiral made me realise that these books already are movies in literary form: every scene is tailored for the screen, and you can’t help but do the casting as you read. The dialogue is based on how people speak in blockbusters, not in life, and there’s the familiar cast: the vulnerable but plucky mother, the clinically ruthless assassin, the sadistic billionaire, the kid in peril, and so on. The race against time, the apocalyptic threat. And just as these films, if done well, offer a great ride, so does Spiral. It’s more fun than Prey or Angels and Demons, and won’t make your toes curl.
The story begins at the end of the Second World War, when young Irish microbiologist Liam Connor is brought on board a US warship to witness the effects of a devastating biological weapon developed by the Japanese: a fungal infection called the Uzumaki, which induces terrible hallucinations and madness and is ultimately fatal. Connor ends up hiding away a tiny vial of the stuff, wrested from the Japanese engineer Hitoshi Kitano, who was responsible for developing it in northern China.
Sixty years later Connor is an octogenarian with a Nobel prize, and still in active research at Cornell. Unknown to the authorities, he has for decades been secretly searching for the cure that he is sure will one day be needed for the Uzumaki. Aware that the nation that holds the cure also possesses a terrible weapon, he is determined to keep his work from the military. Then he is found dead at the bottom of a gorge, apparently having thrown himself over the edge to escape from a mysterious woman caught on CCTV footage. His coded last message to his colleague Jake Sterling, his granddaughter Maggie and her son Dylan, makes them the only people who can prevent a global outbreak of the killer fungus. But who is behind the fiendish scheme to release it?
You can see a lot of this coming, and as usual the climax depends on who can reach the gun fastest, but that doesn’t detract from the compulsive page-turning quality. And as far as the science goes, McEuen shows that the imagination of an inventive scientist is far more interesting than that of a writer who has merely done his homework – here he trumps not only Crichton but his near-namesake Ian McEwan, who peppers his narratives with cutting-edge science, most notably in his recent novel Solar. It’s a delight to watch how McEuen – a world expert in nanoelectronics – has marshalled his knowledge to kit out the technical plot devices: nanotechnology, microbiology, information technology and synthetic biology are all brought into play in a convincing, unforced manner. Devotees of the latest trends will recognize many elements, from genetically engineered oscillating fluorescence to microfluidic labs-on-chips.
I confess that my interest struggles rather more to find purchase with square-jawed, stolid heroes with names like Jake whose physical prowess and ex-army credentials are carefully established in preparation for the gutsy displays that will inevitably be required of them. But that’s the genre, and Jake is a little less tiresomely bland than the wooden leads in Dan Brown and Crichton. A more appealing hero, however, is Cornell University itself, which enjoys a rather touching love letter here from the author. But the stars of the show are, as ever, the villains: the MicroCrawlers that scrabble ominously across the cover, microelectromechanical devices that acquire a seriously bad attitude.
Next time I hope McEuen dares to push harder at the boundaries of the genre. But I certainly hope there will be a next time, if he can escape both the lab bench and the all-consuming jaws of Hollywood.
Tuesday, March 08, 2011
If you’ve been hearing rumours that alien life forms have been found, here’s the story, as I reported for Nature News. It’s a strange universe out there (I'm talking about NASA).
As shown by its latest claim of 'alien bugs', the Journal of Cosmology has at least been an entertaining diversion. But don’t mistake it for real science.
The discovery of alien life forms might reasonably be expected to create headline news. But the media response to the announcement of such a ‘discovery’ in the Journal of Cosmology has been muted, and mostly dismissive. “Bugs from space? Forget it”, said Science reporter Richard Kerr, while the Los Angeles Times quoted microbiologist Rosie Redfield as saying “Move along folks. There’s nothing to see here.”
These are somewhat more presentable than the comments received by Nature, of which ‘utter nonsense’ is a polite paraphrase. But the real story is stranger than Richard Hoover’s claim to have found fossilized extraterrestrial bacteria. Who is Hoover, what is the Journal of Cosmology, and why has NASA been moved officially to distance itself from the affair?
That Hoover can rightly claim to be a NASA scientist may sound impressive to the media, but most scientists know that the space agency is a morass of odd ideas squirming below its gleaming surface. This of course goes with the territory: folks who dedicate their lives to the exploration of space tend to be bold, even extravagant thinkers, many of them today the children of the science-fiction fantasies of the 1950s and 60s, and the kind of imagination that can put people on Mars is bound to put a lot of other weird stuff out there too.
Hoover is himself an engineer and astrobiologist at NASA’s Marshall Space Flight Center in Huntsville, Alabama, and he has been pushing this claim for years. “Personally, I have a completely open mind”, says meteoriticist Ian Wright of the Open University in Milton Keynes, England. “The problem for Hoover is that no matter how many papers he writes on this subject, people will only begin to accept the findings when they are replicated by others.”
Hoover’s paper reports curious microscopic filamentary structures seen inside a number of carbon-rich meteorites, including the classic Orgueil meteorite that fell in France in 1864 and was examined by Louis Pasteur among others. These filaments have a carbon-rich coat filled with minerals, and Hoover points out that they look remarkably similar to structures formed by living and fossil cyanobacteria.
This may be so, but it doesn’t prove that the bacterial forms – if they are that – are extraterrestrial. Hoover says that because the structures are buried deep inside the meteorites, it is unlikely that they represent contamination by microorganisms on Earth. Experts don’t buy this. “Contaminants can easily get inside carbonaceous meteorites as they are relatively porous”, says Iain Gilmour of the Open University, who points to direct evidence of this for at least one other carbon-rich meteorite.
Meteoriticist Harry McSween of the University of Tennessee agrees. “All of us who have studied meteorites, especially CI chondrites [the class studied by Hoover], are aware that they have been terrestrially contaminated”, he says.
In fact, claims very similar to Hoover’s were made in the 1960s by the chemist Bartholomew Nagy, leading to a high-profile debate which left a consensus that Nagy’s ‘life-like’ structures were the result of contamination by pollen grains. Similar assertions of bacteria-like fossil forms in a Martian meteorite, made by NASA scientists in 1996, have also been judged inconclusive.
If Hoover’s report is so unconvincing, why was it published? The Journal of Cosmology asserts that all its papers are peer-reviewed, but also states that “Given the controversial nature of [Hoover’s] discovery, we have invited 100 experts and have issued a general invitation to over 5000 scientists from the scientific community to review the paper and to offer their critical analysis… No other paper in the history of science has undergone such a thorough analysis.” This is a decidedly unorthodox publication strategy, not least because many of the ‘commentaries’ published so far by the journal seem more like the kind of thing one would find on fringe blogs.
Doubtless this is why NASA has been embarrassed into releasing a disclaimer about the work. “NASA cannot stand behind or support a scientific claim unless it has been peer-reviewed or thoroughly examined by other qualified experts”, it says. “NASA was unaware of the recent submission of the paper to the Journal of Cosmology or of the paper's subsequent publication.”
But the Journal of Cosmology is no ordinary journal. It has been running for just two years under the leadership of astrophysicist Rudolf Schild of the Harvard-Smithsonian Center for Astrophysics, and is a torch-bearer for the hypothesis of panspermia, according to which life on Earth was seeded by organisms brought here from other worlds. This was a favourite theory of the maverick astrophysicist Fred Hoyle and his colleague N. C. Wickramasinghe (an executive editor of the journal), who have argued that alien viruses could explain flu epidemics. Other highlights of the journal include an article titled ‘Sex on Mars’, which asks the burning question: have astronauts ever had sex, and is it safe?
A press release from the journal has now announced that it will cease publication in May, claiming to have been “killed by thieves and crooks”. The journal’s success “posed a direct threat to traditional subscription based science periodicals”, says senior executive managing director Lana Tao. “JOC was targeted by Science magazine and others who engaged in illegal, criminal, anti-competitive acts to prevent JOC from distributing news about its online editions and books.”
If JOC is no more, this is arguably a shame, since there ought to be space for such entertaining and eccentric voices. It’s true that apparently authentic journals like this might muddy the public’s distinction between real science and half-baked speculation; but judging from the latest episode, the world (apart from Fox News) is not as gullible as all that.
1. Hoover, R. B. J. Cosmol. 13 [no pages] (2011).
2. McKay, D. S. et al. Science 273, 924-930 (1996).
Friday, March 04, 2011
Here’s a little piece I wrote for Nature News. To truly appreciate this stuff you need to take a look at the slideshow. There will be a great deal more on early microscopy in my next book, probably called Curiosity and scheduled for next year.
The first microscopes were a lot better than they are given credit for. That’s the claim of microscopist Brian Ford, based at Cambridge University and a specialist in the history and development of these instruments.
Ford says it is often suggested that the microscopes used by the earliest pioneers in the seventeenth century, such as Robert Hooke and Antony van Leeuwenhoek, gave only very blurred images of structures such as cells and micro-organisms. Hooke was the first to record cells, seen in thin slices of cork, while Leeuwenhoek described tiny ‘animalcules’, invisible to the naked eye, in rain water in 1676.
The implication is that these breakthroughs in microscopic biology involved more than a little guesswork and invention. But Ford has looked again at the capabilities of some of Leeuwenhoek’s microscopes, and says ‘the results were breathtaking’. ‘The images were comparable with those you would obtain from a modern light microscope’, he adds in an account of his experiments in Microscopy and Analysis.
“It's a very trustworthy and interesting article”, says Catherine Wilson, a historian of microscopy at the University of Aberdeen in Scotland. “Ford is the world’s leading expert on the topic and what he has to say here makes a good deal of sense”, she adds.
The poor impression of the seventeenth-century instruments, says Ford, is due to bad technique in modern reconstructions. In contrast to the hazy images shown in some museums and television documentaries, careful attention to such factors as lighting can produce micrographs of startling clarity using original microscopes or modern replicas.
Ford was able to make some of these improvements when he was granted access to one of Leeuwenhoek’s original microscopes owned by the Utrecht University Museum in the Netherlands. Leeuwenhoek made his own instruments, which had only a single lens made from a tiny bead of glass mounted in a metal frame. These simple microscopes were harder to make and to use than the more familiar two-lens compound microscope, but offered greater resolution.
Hooke popularized microscopy in his 1665 masterpiece Micrographia, which included stunning engravings of fleas, mites and the compound eyes of flies. The diarist Samuel Pepys judged it ‘the most ingenious book that I ever read in my life’. Ford’s findings show that Hooke was not, as some have imagined, embellishing his drawings from imagination, but should genuinely have been able to see such things as the tiny hairs on the flea’s legs.
Even Hooke was temporarily foxed, however, when he was given the duty of reproducing the results described by Leeuwenhoek, a linen merchant of Delft, in a letter to the Royal Society. It took him over a year before he could see these animalcules, whereupon he wrote that ‘I was very much surprised at this so wonderful a spectacle, having never seen any living creature comparable to these for smallness.’
‘The abilities of those pioneer microscopists were so much greater than has been recognized’, says Ford. He attributes this misconception to the fact that ‘no longer is microscopy properly taught.’
1. Ford, B. J. Microsc. Anal. March 2011 (in press).
Wednesday, March 02, 2011
I have a comment on the Prospect blog about the production of Frankenstein at the National Theatre, which I saw this week. To save you a click, here it is anyway. I am reviewing the play more formally for Nature. It’s flawed but worth seeing – but if you haven’t got a ticket, tough luck, as it’s sold out. However, I believe you could still come to this.
Do not go to see the Monstrous Drama, founded on the improper work called FRANKENSTEIN!!! Do not take your wives, do not take your daughters, do not take your families!!!
Actually, although the latest adaptation of Mary Shelley’s story at the National Theatre, scripted by Nick Dear and directed by Danny Boyle, includes nudity and a rape that would certainly not have featured in the 1823 staging that prompted this warning, there is little here that would shock most wives and daughters. Even the Grand Guignol gore in the draft script has been toned down. One scene even turns into a dance routine like some monstrous hybrid of Oliver! and The Rocky Horror Picture Show.
None of this is a bad thing. Some is very good: the staging is spectacular, the adaptation largely thoughtful and the monster – I can comment only on Jonny Lee Miller’s version in the show’s alternation of lead roles – is the most inventive and heartfelt I have seen, owing something to Caliban, Charles Laughton’s Hunchback of Notre Dame and even the Elephant Man. Some of the secondary performances creak, and some of the dialogue is throwaway, but the main problem is the title character.
Benedict Cumberbatch, who played Victor Frankenstein on the night I saw it, did all a versatile, intelligent actor of his calibre could be expected to do with the lines he was given. But about halfway through the production, the penny dropped as to why he seemed to be struggling. He is the Mad Scientist.
True, he does not cackle like Gene Wilder or shriek Colin Clive’s line from James Whale’s seminal movie – ‘Now I know what it feels like to be God!’ But that’s part of the problem: not even naked madness motivates his egotistical quest, his utter neglect of his doting fiancée, his contempt for the ‘little men with little lives’, his lack of real anguish at his young brother’s murder. From the outset it is clear that he is a stranger to human feeling and has not the slightest real interest in developing his knowledge of reanimation for ‘medical research’. Set against a creature whom we see develop from its ‘birth’ and first baby steps to a state of savage grace and wisdom, all the time spurned and despised for looking no worse than a person flung through a windscreen, there is never any doubt who is the real monster.
I don’t think it makes much sense for scientists to feel indignant at this portrayal. Frankenstein has for so long been the archetype of the mad scientist that another representation as literal as this can’t elaborate on that image. And anyone who could entertain the notion that this cold, amoral individual experimenting in misanthropic solitude for nothing but personal glory bears the slightest resemblance to the modern scientist is already too biased and ignorant to argue with. This Frankenstein is a fairy-tale figure, like the wicked witch or the evil stepmother. The only harm this can do today is in dramatic terms: villains need to be either more complex or more exuberantly depraved to work as central characters. For all its virtues, Nick Dear’s adaptation in the end takes the easier option in making us love the monster. A production that tries to make us feel sympathy for Victor, a useless but confused and struggling father – now that would be an interesting challenge.