This story was in yesterday’s Guardian in slightly edited form. It was accompanied by a review of some of Iamus’s music by Tom Service, who was not terribly impressed. It’s a shame that Tom had only Hello World! to review, since that was an early piece by Iamus and so very much a prototype – things have moved on since then. I think his review was quite fair, but I had a sense that, knowing it was made by computer, he was looking out for the “computer-ness” in it. This bears on the final part of my story below, for which there was no room in the Guardian. I think one can detect a certain amount of ‘anti-computer prejudice’ in the Guardian comments thread too, though that is perhaps no stronger than the general ‘anti-modernist’ bias. I’d be interested to see what Tom Service makes of the CD when it appears later this year. I carry no torch for Iamus as a composer, but I must admit that I’m growing fond of it and certainly feel it is a significant achievement. Anyway, there will be more on this soon – I’m writing a different piece on the work for Nature, to appear in August.
_______________________________________________________
As soon as you see the title of Iamus’s composition “Transits – Into an Abyss”, you know it’s going to be challenging, modernist stuff. The strings pile up discords, now spooky, now ominous. But if your tastes run to Bartók, Ligeti and Penderecki, you might like it. At least you have to admit that this bloke knows what he’s doing.
But this bloke doesn’t know anything at all. Iamus is a computer program. Until the London Symphony Orchestra was handed the score, no human had intervened in preparing this music.
“When we tell people that, they think it’s a trick”, says Francisco Vico, leader of the team at the University of Malaga who devised Iamus. “Some say they simply don’t believe us, others say it’s just creepy.” He anticipates that when Iamus’s debut CD is released in September, performed by top-shelf musicians including the LSO, it is going to disturb a lot of folk.
You can get a taste of Iamus’s oeuvre before then, because on 2 July some of its compositions will be performed and streamed live from Malaga. The event is being staged to mark the 100th anniversary of the birth of Alan Turing, the man credited with more or less inventing the concept of the computer. It was Turing who devised the test to distinguish humans from artificial intelligence made famous by the opening sequence of Ridley Scott’s Blade Runner. And the performance will itself be a kind of Turing test: you can ask yourself whether you could tell, if you didn’t know, that this music was made by machine.
Iamus composes by mutating very simple starting material in a manner analogous to biological evolution. The evolving compositions each have a kind of musical core, a ‘genome’, which gradually becomes more complex. “Iamus generates an initial population of compositions automatically”, Vico explains, “but their genomes are so simple that they barely develop into a handful of notes, lasting just a few seconds. As evolution proceeds, mutations alter the content and size of this primordial genetic material, and we get longer and more elaborated pieces.” All the researchers specify at the outset is the rough length of the piece and the instruments it will use.
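To make that picture concrete, here is a minimal, purely illustrative sketch in Python of an evolutionary composition loop of this general kind. The genome representation, mutation operators and selection criterion below are all invented for the example and do not reflect Iamus’s actual code; the point is only to show how a tiny note ‘genome’ can grow and vary over successive generations until it approaches a requested length.

```python
import random

# Illustrative stand-ins: a "genome" is just a list of (MIDI pitch, duration) pairs.
PITCHES = list(range(60, 73))      # C4..C5
DURATIONS = [0.25, 0.5, 1.0]       # note lengths in beats

def seed_genome():
    """A primordial genome: barely a handful of notes."""
    return [(random.choice(PITCHES), random.choice(DURATIONS))
            for _ in range(random.randint(2, 4))]

def mutate(genome):
    """Mutations alter the content (rewrite a note) or the size (insert a note)."""
    genome = list(genome)
    if genome and random.random() < 0.5:
        i = random.randrange(len(genome))
        genome[i] = (random.choice(PITCHES), random.choice(DURATIONS))
    else:
        i = random.randrange(len(genome) + 1)
        genome.insert(i, (random.choice(PITCHES), random.choice(DURATIONS)))
    return genome

def length_in_beats(genome):
    return sum(duration for _, duration in genome)

def evolve(target_beats=32, population_size=20, generations=300):
    """Grow a population of tiny genomes towards a piece of the requested length.

    Only the rough length is fixed up front; a real system would select on
    musical criteria rather than on duration alone.
    """
    population = [seed_genome() for _ in range(population_size)]
    for _ in range(generations):
        offspring = [mutate(g) for g in population]
        candidates = population + offspring
        candidates.sort(key=lambda g: abs(target_beats - length_in_beats(g)))
        population = candidates[:population_size]
    return population[0]

if __name__ == "__main__":
    piece = evolve()
    print(len(piece), "notes,", length_in_beats(piece), "beats")
```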
“A single genome can encode many melodies”, explains composer Gustavo Díaz-Jerez of the Conservatory of the Basque Country in San Sebastian, who has collaborated with the Malaga team since the outset and is the pianist on the new recordings. “You find this same idea of a genome in the Western musical canon – that’s why the music makes sense.”
The computer doesn’t impose any particular aesthetic. Although most of its serious pieces are in a modern classical style, it can compose in other genres too, and for any set of instruments. The ‘Darwinian’ composition process also lends itself to producing new variations of well-known pieces [PB: I’ve been sent some great variants of the Nokia ringtone] or merging two or more existing compositions to produce offspring – musical sex, you might say.
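Continuing the same toy sketch (the one-point splice below is a generic genetic-algorithm operator chosen for illustration, not anything Iamus is documented as using), ‘merging’ two existing pieces can be pictured as recombining their genomes to produce an offspring, which is then mutated and developed as before.

```python
import random

def crossover(parent_a, parent_b):
    """Splice two note 'genomes' at random points to produce an offspring piece."""
    if not parent_a or not parent_b:
        return list(parent_a or parent_b)
    cut_a = random.randrange(1, len(parent_a) + 1)   # keep at least one note of A
    cut_b = random.randrange(len(parent_b))          # append a tail of B
    return parent_a[:cut_a] + parent_b[cut_b:]

# Used with the evolve() sketch above: child = mutate(crossover(piece_one, piece_two))
```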
Using computers and algorithms – automated systems of rules – to make music has a long history. The Greek composer Iannis Xenakis did it in the 1960s, and in the following decade two Swedish composers devised an algorithm for creating nursery-rhyme melodies in the style of their compatriot Alice Tegnér. In the 1980s computer scientist Kemal Ebcioglu created a program that harmonised chorales in the style of Bach.
As artificial intelligence and machine learning became more sophisticated, so did the possibilities for machine music: now computers could infer rules and guidelines from real musical examples, rather than being fed them to begin with. Computer scientist John ‘Al’ Biles devised an algorithm called GenJam that learns to improvise jazz. A trumpeter himself, Biles performs live alongside GenJam under the name the Al Biles Virtual Quintet, but admits that the algorithm is a rather indifferent player. The same is true of GenBebop, devised by cognitive scientists Lee Spector and Adam Alpern, which improvises solos in the style of Charlie Parker by ‘listening’ to him and iterating its own efforts under the ultimately less-than-discerning eye of an automated internal critic.
One of the most persuasive systems was the Continuator, devised by François Pachet at Sony’s Computer Science Laboratory in Paris. In a Turing test where the Continuator traded licks with an improvising human pianist, expert listeners were mostly unable to guess whether it was the human or the computer playing.
But these efforts still haven’t shown that a computer can make tolerable music from scratch. One of the best-known attempts is ‘Emily Howell’, a program created by music professor David Cope. Yet Howell’s bland, arpeggiated compositions sound like a technically skilled child trying to ape Beethoven or Bach, or like Michael Nyman on a bad day: fine for elevators but not for the concert hall.
This is why Iamus – named after the mythical son of Apollo who could understand the language of birds – is different. This seems to be the first time that music composed purely by computer has been deemed good enough for top-class performers to play it. Díaz-Jerez admits that the LSO were “a little bit sceptical at the beginning, but were very surprised” by the quality of what they were being asked to play. The soprano Celia Alcedo, he says, “couldn’t believe the expressiveness of some of the lines” she was given to sing.
Lennox Mackenzie, the LSO’s chairman, had mixed feelings about the orchestral pieces. “I felt it was like a wall of sound”, he says. “If you put a colour to it, this music was grey. It went nowhere. It was too dense and massive, no instrument stuck out at any point. But at the end of it, I thought it was quite epic.”
“The other thing that struck me”, Mackenzie adds, “was that it was festooned with expression marks, which just seemed arbitrary and meaningless. My normal inclination is to delve into music and find out what it’s all about. But here I don’t think I’d find anything.” But he’s far from discouraging. “I didn’t feel antipathy towards it. It does have something. They should keep trying, I’d say.”
What is slightly disconcerting is that Iamus can produce this stuff endlessly: thousands of pieces, all valid and musically plausible, all fully notated and ready to play, faultless from a technical point of view, and “many of them great”, according to Díaz-Jerez. Such profligacy feels improper: if it’s that easy, can the music really be any good? Yet Díaz-Jerez thinks that the pieces are often better in some respects than those produced by some avant-garde composers, which might revel in their own internal logic but are virtually impossible to play. And crucially, different people have different favourites – it’s not as though the program just occasionally gets lucky and turns out something good.
How does a performer interpret these pieces, given that there’s no “intention” of the composer to look for? “Suppose I found a score in a library without knowing who wrote it”, says Díaz-Jerez. “I approach these pieces as I would that one – by analysing the score to see how it works.” In that respect, he sees no difference from deducing the structure of an intricate Bach fugue.
You can compare it with computer chess, says philosopher of music Stephen Davies of the University of Auckland in New Zealand. “People said computers wouldn't be able to show the same original thinking, as opposed to crunching random calculations. But now it’s hard to see the difference between people and computers with respect to creativity in chess. Music too is rule-governed in a way that should make it easily simulated.”
However, Iamus might face deeply ingrained prejudice. Brain-scanning studies by the neuroscientists Stefan Koelsch and Nikolaus Steinbeis have shown that the same piece of music elicits activity in the brain regions responsible for ascribing intentions to other people when listeners are told it was composed by a human, but not when they are told it was computer-generated. In other words, how we think the music was made shapes how expressive we perceive it to be. Perhaps Iamus really will need to be marketed as a pseudo-human to be taken seriously.