Here, in essence, is a brief overview of my new book Curiosity, published this month. The piece appears in the latest issue of New Humanist.
____________________________________________________________
The Abel Prize, the “mathematics Nobel” awarded by the Norwegian Academy of Science and Letters, always goes to some pretty head-scratching stuff. But the arcane number theory of this year’s winner, Endre Szemerédi, has turned out to have important applications in computer science: a validation, according to the Academy’s president Nils Stenseth, of purely “curiosity-driven” research.
It’s a common refrain in science: questions pursued purely from a desire to know about the world have unforeseen practical applications. This argument has been advanced to justify the $6 bn Large Hadron Collider at the European particle-physics centre CERN, which, according to CERN’s former Director General Robert Aymar, is “continuing a tradition of human curiosity that’s as old as mankind itself.” At a time when the UK physical sciences research council is starting to demand absurd “impact assessments” for grant applications, this defence of science motivated by nothing more than inquisitiveness is essential.
But Aymar’s image of a long-standing “tradition of curiosity”, although widely shared by scientists, is too simplistic. There’s evidently an evolutionary benefit in wanting to explore our environment – we’re not the only animals to do that. But curiosity is a much more subtle, many-faceted notion, and our relationship to it has fluctuated over the ages. We are unlikely to do justice to what curiosity in science could and should mean today unless we understand this history.
For one thing, the word itself has had many meanings – too many, in fact, to identify any core concept at all. A “curious” person could indeed be an inquisitive one, but could equally be one who simply took care (Latin cura) in what they did. Not just people but objects too might be described as “curious”, and this might mean that they were rare, exotic, elegant, collectable, valuable, small, hidden, useless, expensive – but conversely, in certain contexts, common, useful or cheap. From the late sixteenth century, European nobles and intellectuals indulged a cult of curiosities, amassing vast collections of weird and wonderful objects which they displayed in room-sized ‘cabinets’. A typical cabinet of curiosities, like that of Charles I’s gardener John Tradescant in Lambeth, might contain all manner of rare beasts, shells, furs, minerals, ethnographic objects and exquisite works of craftsmanship. This spirit of collecting, usually biased towards the strange and wonderful rather than the representative, infused early science – the Royal Society had its own collection – and it gave rise to the first public museums. But it also made some early scientists focus on peculiar rather than ordinary phenomena, which threatened to turn them into bauble collectors rather than investigators of nature.
This enthusiasm for curiosities was something new, and arose outside of the mainstream academic tradition. Until the late Renaissance, curiosity in the sense that is normally implied today – investigation driven purely by the wish to know – was condemned. In ancient Greece it was seen as an unwelcome distraction rather than an aid to knowledge. For Aristotle, curiosity (periergia) had little role to play in philosophy: it was a kind of aimless, witless tendency to pry into things that didn’t concern us. Plutarch considered curiosity the vice of those given to snooping into the affairs of others: the kind of busybody known in Greek as a polypragmon.
In early Christianity it was worse than that. Now curiosity was not merely frowned upon but deemed sinful. “We want no curious disputation after possessing Christ Jesus”, wrote the second-century Christian apologist Tertullian, “no inquisition after enjoying the gospel.” The Bible told us all we needed – and should expect – to know.
Scripture made it clear that there were some things we were not supposed to know. God was said to have created Adam last so that he would not see how the rest of the job was done. Desire for forbidden knowledge led to the Fall. The transgressive aspect of curiosity is an insistent theme in Christian theology, which time and again demanded that one respect the limits of inquiry and be wary of too much learning. “The secret things belong to the Lord our God”, proclaims Deuteronomy, while Ecclesiasticus warns us to “be not curious in unnecessary matters, for more things are shewed unto thee than men understand.”
In the hands of Augustine, curiosity became a “disease”, one of the vices or lusts at the root of all sin. “It is in divine language called the lust of the eyes”, he wrote. “From the same motive, men proceed to investigate the workings of nature, which is beyond our ken – things which it does no good to know and which men only want to know for the sake of knowing.” He claimed that curiosity is apt to pervert, to foster an interest in “mangled corpses, magical effects and marvellous spectacles.”
There was, then, a lot of work to be done before the early modern scientists of the seventeenth century – men like Galileo, Johannes Kepler, Robert Boyle, Robert Hooke and Isaac Newton – could give free rein to their curiosity. Needless to say, despite popular accounts of the so-called Scientific Revolution which imply that these men began to ask questions merely because of their great genius, there were many factors that emancipated curiosity. Not least was the influence of the tradition of natural magic, which insisted that nature was controlled by occult (literally “hidden”) forces, such as magnetism and gravity, that could furnish a rational explanation of even the most marvellous things. This tradition had a strong experimental bias, denied the cosy tautologies of academic Aristotelianism, and was determined to uncover the “secrets” of nature.
The discovery of the New World, and the age of exploration in general, also opened minds with its demonstration that there was far more in the world than was described in the books of revered ancient philosophers. Accounts of investigations with telescopes and microscopes by the likes of Galileo and Hooke make reference to the “new worlds” that these devices reveal at both cosmic and minute scales, often presenting these studies as voyages of discovery – and conquest – comparable to that of Columbus.
But this liberation of curiosity was more complicated than is sometimes implied. For one thing, it forced the issue of how to assess evidence and reports – whose word could be trusted? Scientists like Boyle began to develop what historian Steven Shapin has called a “literary technology” designed to convey authority with rhetorical tricks, such as the dispassionate, disembodied tone that now characterizes, some might say blights, the scientific literature. Curiosity became apt to be laughed at rather than condemned: during the Restoration and the early Enlightenment, writers such as Thomas Shadwell, Samuel Butler and Jonathan Swift wrote satires mocking the Royal Society’s apparent fascination with trivia, such as the details of a fly’s eye.
And the problem with curiosity is that it can be voracious: the questions never cease. Everything Hooke put into his microscope looked new and strange. Boyle lamented that curiosity provoked disquiet and anxiety because it goaded people on without any prospect of comprehending all of nature in one person’s lifetime. Like others, he drew up “to do” lists that are more or less random and incontinent, showing how hard it was to discipline curiosity into a coherent research programme.
Today we continue this slightly uneasy dance with curiosity. Not just curiosity but also its mercurial cousin wonder are enlisted in support of huge projects like the LHC and the Hubble Space Telescope. But, however well motivated they are, one has to ask how much space is left in huge, costly international collaborations like these for the sort of spontaneous curiosity that allowed Hooke and Boyle to follow their noses: can we really have “curiosity by committee”? That’s why we shouldn’t let Big Science blind us to the virtues of Small Science, of the benchtop experiment, often with cheap, improvised equipment, that leaves space for trying out hunches and wild ideas, revelling in little surprises, and indulging in science as a craft. Such experiments may turn out to be fantastically useful, or spectacularly useless. They are each little acts of homage to curiosity, and in consequence, to our humanity.
5 comments:
Could it not be the rise of the middle classes: more folk with disposable time and money? Even the most basic equipment, such as optics or thermometers, would have cost a small ransom. Prior to this, people lived hand to mouth, and any distraction from the duty of living would have been detrimental to life.
Regarding the Reformation and the Enlightenment: both periods spawned extreme, intolerant movements such as Puritanism and Robespierre (whatever movement he represented). The notion of experimental curiosity might be in conflict with the spirit of absolute logic (Descartes et al.); ironically, the Age of Enlightenment may have stymied the early experimentalists, as inductive discovery lacked the mathematical discipline of deductive logic. Something of that resonates today in the relationship between experimental research and mathematical/computational modelling.
I cover some of that, certainly insofar as the Cartesian approach was rather different from the more eclectic and haphazard experimental philosophy. Increasing prosperity surely played a part too. And I do challenge the common notion that the Scientific Revolution was inevitably allied to a rigorously mathematical approach, on which earlier historians of science such as Alistair Crombie have insisted.
Feynman had a simple way of putting it:
http://www.youtube.com/watch?v=lmTmGLzPVyM
This one's pretty good too:
http://www.youtube.com/watch?v=cRmbwczTC6E