Sunday, December 08, 2013

Quantum computers: when, what, who and why

I have a piece in December’s Prospect on quantum computing – here’s the original draft.

__________________________________________________

When people first hear about quantum computers, a common response is “where and when can I get one?” But that’s the wrong question, and not just because you’ll be disappointed with the answer. Quantum computers are often said to promise faster, bigger, more multi-layered computation – but they are not, and might never be, an upgrade of your laptop. They’re just not that sort of machine. So what are they, and why do we want them?

You could argue that your laptop is already a quantum computer, because the laws of quantum physics govern the ways electrical currents pass through its ultra-small transistors and wires. Partly that’s just saying that ultimately quantum physics governs all the properties of materials. Increasingly, however, strange quantum effects that don’t usually manifest in the everyday world, such as the ability of electrons to leap through walls, are becoming important as the scale of microelectronics shrinks. This ‘quantum tunnelling’, for example, is the basis of flash memory, and also threatens to make transistors ‘leaky’ as they get ever smaller.

Real quantum computers go far beyond any of that, however. In the end, all of today’s computers work using old-fashioned binary logic: by encoding information in strings of 1s and 0s, represented for example by electrical pulses in circuits or by flashes of light in optical fibres. These so-called bits are manipulated in logic gates, built from electronic components such as transistors. Here a particular set of input bits prompts the gate to produce another set of output bits. That’s what computation is; the rest is a question of building software and interfaces that turn these bits into a letter to Mum glowing on the screen.
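
To make the classical picture concrete, here is a minimal sketch in Python – my own illustration, not anything specific to the machines discussed below – of a single logic gate, the NAND, from which any binary computation can in principle be assembled:

    def nand(a, b):
        """A classical NAND gate: the output is 0 only when both inputs are 1."""
        return 0 if (a == 1 and b == 1) else 1

    # The full truth table; all classical computation reduces to
    # compositions of gates like this one.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, '->', nand(a, b))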

Quantum computers will also use 1s and 0s, but with a crucial difference. As well as taking one or the other of these values, a quantum bit (qubit) can hold any mixture of them. Counter-intuitively, it can be simultaneously a 1 and a 0, or 1 with a tiny bit of 0, and so on. These mixtures are called superpositions, and they are a fundamental feature of objects that obey quantum rules. A photon of light, for example, can be polarized either vertically or horizontally, or can be in a superposition of both polarizations.
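
In the standard textbook notation – nothing here is specific to any particular machine – a qubit’s state is written as a weighted combination of the two binary values:

    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
    \qquad |\alpha|^2 + |\beta|^2 = 1,

where reading out the qubit gives 0 with probability |α|² and 1 with probability |β|²; the perfectly equal mixture corresponds to α = β = 1/√2.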

That gives qubits access to a vast range of states, so you can encode much more information in them. [OK, I’m keeping this in for now in the interests of honesty to the moment – but watch this space for an explanation of why this is far too simplistic a way to describe quantum computing, and perhaps even an erroneous one…] In short, it enables quantum computers to perform very many calculations simultaneously where a classical computer can do only one at a time with any given set of bits. It is this that provides the quantum computer with its tremendous speed-up. To factorize a big number classically (to find the prime numbers that multiply together to make it), a computer plods through the possible divisors one by one, while a quantum computer can assess them all, encoded in superpositions of qubits, at basically the same time.
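
For a flavour of the classical plodding, here is a schematic Python sketch of trial division – the most naive factorizing method, shown purely as illustration; the numbers used in real cryptography are hundreds of digits long, far beyond any such search:

    def trial_factor(n):
        """Factorize n classically by testing candidate divisors one at a time."""
        d, factors = 2, []
        while d * d <= n:
            while n % d == 0:      # d divides n: record it and continue
                factors.append(d)
                n //= d
            d += 1
        if n > 1:
            factors.append(n)      # whatever remains is itself prime
        return factors

    print(trial_factor(2013))      # -> [3, 11, 61]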

So where’s the catch? It is that quantum phenomena such as superpositions are generally very delicate. They get easily disrupted or destroyed by disturbances from the surrounding environment, particularly the randomizing effects of heat. So to make such states usually requires very low temperatures. This fragility of quantum effects means that, while the question of what you could do with a quantum computer has been explored extensively already by physicists and mathematicians, actually building a device that can do any of it is taxing electrical engineers and applied physicists to the limit.

Now there are signs of real progress. The community was set buzzing two years ago when a Canadian company called D-Wave (“the world’s first commercial quantum computing company”) announced that it had created the first practical quantum computer: a black box, if you will, that could actually solve stuff. But several researchers questioned whether D-Wave’s device was really a true quantum computer at all, or just a fancy box of tricks that made token nods towards quantum effects. It employs an approach called ‘quantum annealing’, which differs from the standard, gate-based approach to quantum computing and for which any real advantage over classical computing has yet to be demonstrated.

At Raytheon BBN Technologies, based in Cambridge, Massachusetts, researchers are convinced that they are closing in on the real thing. Conveniently close to Harvard and the Massachusetts Institute of Technology, BBN was founded in 1948 and was intimately involved in the development of the earliest military networks that became the Internet. In 2009 the company became a subsidiary of the US defence contractor Raytheon. It has been seeking to develop so-called quantum information technologies since 2001, when the company’s researchers devised an optical telecommunications network, exchanging light signals between the company’s headquarters and nearby Harvard and Boston Universities, that encoded information in superpositions of photons. Such networks, which could be immune to eavesdropping, have now been developed in many places around the world.

But the quantum computer, which actually does number-crunching, is a bigger challenge. To make qubits, Raytheon BBN uses the same fundamental circuit components as D-Wave does. Called superconducting Josephson junctions, these are metal contacts cooled so deeply that they have become superconductors (that is, they have no electrical resistance), electrically connected to each other via a thin barrier of insulating material. Superconductivity is itself a quantum-mechanical effect, which is why it requires low temperatures, and the superconducting current can flow in distinct quantum states. A Josephson junction helps to filter out all but two states, which correspond to the binary 1’s and 0’s. It is possible to manipulate these states, for example creating specific superpositions, using pulses of microwave radiation. That’s the physical basis of BBN’s qubit circuits, which have to be cooled to within a daunting 50 thousandths of a degree of absolute zero.
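
To get a rough feel for what such a microwave pulse does, here is a toy numerical sketch in Python. The rotation matrix is textbook quantum mechanics; that a pulse of the right frequency and duration implements it is the general principle, with all device-specific detail omitted:

    import numpy as np

    # A qubit state is a 2-component complex vector: |0> = (1, 0), |1> = (0, 1).
    ket0 = np.array([1, 0], dtype=complex)

    def x_rotation(theta):
        """Unitary rotation of the qubit state; a resonant microwave pulse
        of suitable duration implements such a rotation."""
        return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                         [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

    # A 'half' pulse (theta = pi/2) turns |0> into an equal superposition:
    state = x_rotation(np.pi / 2) @ ket0
    print(np.abs(state) ** 2)   # -> [0.5 0.5]: equal odds of reading 0 or 1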

Even then, the superpositions don’t last long. Yet to do practical quantum computing, they need only survive for as long as it takes to juggle them in quantum logic gates. In recent years, says Zachary Dutton, lead scientist of Raytheon BBN’s Quantum Information Processing group, these so-called coherence times have increased dramatically, and are now at a level – tens to hundreds of microseconds – where the devices can actually perform logic processing.
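
Those microseconds matter because what counts is how many gate operations fit inside one coherence time. A back-of-envelope illustration (the gate time below is a typical published figure for superconducting qubits of this era, not a number quoted by BBN):

    coherence_time = 100e-6   # 100 microseconds, the upper end quoted above
    gate_time = 50e-9         # ~50 nanoseconds per gate: an assumed typical value
    print(int(coherence_time / gate_time))   # -> 2000 operations, roughly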

Another critical issue for these quantum gates is the so-called error rate: how accurately they can be switched between states by the microwave signal. If you get this a little wrong – say, by making too much of one state in the superposition – the errors accumulate until, even if the same information is stored several times for cross-checking, too many mistakes derail the whole computation. Getting the error rate small enough to avoid this remains one of the key tasks.
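
The cross-checking idea is easiest to see in its classical analogue, the repetition code: store each bit three times and take a majority vote. Here is an illustrative Python sketch – genuine quantum error correction is subtler, not least because unknown qubit states cannot simply be copied:

    import random

    def store_and_vote(bit, error_rate):
        """Keep three copies of a bit, each of which may flip with some
        probability; a majority vote recovers the bit unless two or more
        copies go wrong at once."""
        copies = [bit ^ (random.random() < error_rate) for _ in range(3)]
        return int(sum(copies) >= 2)

    # With a 1% per-copy error rate the vote fails only ~0.03% of the time;
    # at 30% the failures (~22%) start to overwhelm the redundancy.
    for p in (0.01, 0.3):
        trials = 100000
        failures = sum(store_and_vote(0, p) != 0 for _ in range(trials))
        print(p, failures / trials)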

At present the Raytheon BBN team, which is collaborating with the computer giant IBM, doesn’t have anything even vaguely like a quantum computer. Rather, the researchers are focusing on getting very small systems – currently three qubits, but soon to be eight – to work well enough that they can be assembled into large-scale circuits. “If you looked at a circuit diagram of a quantum computer”, says Dutton, “this would be a little piece of it.” The extreme cooling “needn’t be a showstopper”, he adds, because refrigeration technologies have advanced so much in recent years, for example so that they no longer need constant refilling with a coolant such as liquid helium.

Exotic quantum states in ultracold superconducting wires might sound like a complicated basis for making qubits. But the same approach is being taken by several of the leading academic centres of quantum computing, including MIT, Yale and the University of California at Santa Barbara. It’s by no means the only option. Another popular approach, for example, is to encode information in the quantum-mechanical energy states of individual atoms or ions suspended in free space using electromagnetic fields to trap them there. The information can be programmed, manipulated and read out using lasers to probe and alter the states of the trapped ions. Christopher Monroe, who is using this approach at the University of Maryland, feels that “there will be some interesting results in the next several years in both Josephson junction and [ion-trap] atomic machines”. He concurs that, unlike the 512-qubit D-Wave devices, those under development at Raytheon BBN are “legitimately quantum”.

What would you use a quantum computer for? Monroe says that the first demonstrations of quantum computing will probably be solving “some esoteric physics problem”, not providing a general-purpose computer. There are, however, some important possible uses that anyone can appreciate. Fast factorizing of huge numbers is one, since the most widely used methods of data encryption rely on the difficulty of doing this with classical computers. Quantum computers would change the whole game in data security.

For basic science, one of the most appealing applications would be to perform computer simulations of molecules and materials. These are governed by quantum rules, and classical computers are forced to solve the corresponding equations by laborious and merely approximate mathematical methods. Quantum computers, in contrast, could map such quantum behaviour directly and exactly onto their own operations, so that simulations that currently take days might be possible in seconds, helping to make better predictions of the properties of new drugs and materials.
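
The root of the classical difficulty is sheer bookkeeping: a full description of the quantum state of n interacting two-level particles needs one complex number per configuration, and the number of configurations doubles with every particle added. A quick illustrative calculation, assuming 16 bytes per complex number:

    # Memory needed to store the full quantum state of n two-level systems:
    # 2**n complex amplitudes at 16 bytes each.
    for n in (10, 30, 50):
        print(n, 'particles:', 16 * 2**n / 1e9, 'GB')
    # -> 10 particles fit easily; 30 need ~17 GB; 50 need ~18 million GB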

Currently, the most taxing computational problems are tackled by massive, expensive supercomputers housed in a few specialized centres and leased to users. That’s what the initial market for quantum computers will look like too, says Dutton – not really a market at all, but a highly centralized oligopoly. But of course all computers used to be like this: huge mainframes dedicated to recondite problems. IBM chief Thomas Watson is said (perhaps apocryphally) to have predicted in 1943 that this is what computers would always be, forecasting a world market for perhaps five of them in total. Mindful of how that turned out, it would be an unwise prophet who forecasts where quantum computers might be decades down the line.
