Cracking the qubit conundrum: Advances in quantum computing


Most digital electronics relies on storing a bit of information as a clump of charge – usually a bundle containing tens of thousands of individual electrons per bit.

Although it has been pointed out by Likharev that the first manipulation of individual electrons was by Millikan in 1909, it wasn't until the late 1980s that the ultimate scaling of electronics – the embodiment of one bit on one electron – began to look possible. The first results came from nanofabricated devices operating at millikelvin temperatures, but the field moved remarkably quickly: the first research single electron memories were announced in 1993, logic in 1995 (Nakazato, Hitachi-Cambridge) and 128Mbit random access memory circuits in 1998 (Yano, Hitachi). Since then, the single electron, or few electron, memory has appeared in the semiconductor roadmap at the appropriate lithographic dimension, although single electron logic remains impractical.

Quantum Information Processing (QIP), a new information science based on the principles of quantum mechanics, includes quantum computing and quantum cryptography. It represents the next generation in our understanding of information processing, which began when Alan Turing proposed the universal computing machine in 1936. A few years later, when early computers were being built and used, it became clear that to process information, the information must be embodied in a physical system, with the associated strengths and weaknesses of that system. The most obvious example in current computers is a consequence of the second law of thermodynamics: the inevitable problem of power dissipation.

Richard Feynman is generally credited with first postulating, in 1981, that as the physical entities on which we embody information become smaller, they ultimately become quantum mechanical, with all the strange behaviour that implies – although the first publication on the subject was probably by Roman Ingarden in 1976. By 1985, David Deutsch had described the universal quantum computing machine, the first quantum analogue of Turing's invention.
Since then, the race has been on to develop a real system capable of performing quantum computation, with many research groups in companies and universities around the world working to turn this exciting new science into a realistic new information technology.

In a classical computer, the basic unit of information is the 'bit', which can exist in one of two possible states – for example, yes/no (0 or 1 in binary). Quantum computers use quantum bits (qubits), which can exist in a superposition of both states – a mixture of 0 and 1 simultaneously. Qubits are also subject to quantum entanglement: when two or more qubits are entangled, they behave as one system, so the state of one qubit depends directly on the state of the others. A consequence of entanglement is that the potential processing power of a quantum information system grows exponentially with the number of qubits, rather than linearly as in a classical system.

The interest is not that quantum computing simply offers faster computers; rather, it brings a whole new way of processing information, in particular the possibility of solving certain types of problem that conventional computers cannot handle. The usual example is factorisation: much conventional cryptography relies on the fact that it is relatively easy to multiply two large prime numbers together, but practically impossible to perform the reverse operation knowing only the product. A suitably large quantum computer would be able to do this very quickly. One view of the power of a quantum computer is that 'suitably large' here means about 100 perfect qubits – enough to outstrip any existing classical computer. In practice, we know we can never build such a perfect system, but there are quantum error correction approaches equivalent to those used in classical computing.
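Superposition and entanglement can be illustrated with a minimal sketch of how physicists describe two qubits mathematically – this is an illustrative simulation on a classical machine, not a quantum computation; the gate names (Hadamard, CNOT) are the standard textbook ones:

```python
import numpy as np

# Single-qubit basis states, 0 and 1.
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Hadamard gate: puts one qubit into an equal superposition of 0 and 1.
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# CNOT gate on two qubits: flips the second qubit when the first is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Start with both qubits set to 0 (4 amplitudes: 2**n for n qubits).
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state  # superpose the first qubit
state = CNOT @ state                   # entangle the pair

# The result is a Bell state: equal amplitudes on 00 and 11 only, so
# measuring one qubit instantly fixes the other.
print(np.round(state, 3))  # [0.707 0.    0.    0.707]
```

Note the state vector has 2**n entries: every extra qubit doubles the amount of information the system carries, which is the exponential growth described above.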
When this overhead is included, the breakpoint moves to about 10,000 qubits – still a tiny fraction of the devices in a present-day microprocessor. One of the key properties of a qubit is its coherence time: a measure of how long it can be preserved in a particular state without losing information through randomising interactions. This needs to be considerably longer than the time required to switch the qubit between states; again, the breakpoint ratio is approximately 10,000.

Although the principles behind quantum computing have been established and small model systems constructed, it remains a considerable task to scale these up to practical, working computers. It is certainly worth doing, however, as it would enable certain types of computation that are currently, if not impossible using classical computers, then certainly impractical within a sensible timescale. Potential applications include bioinformatics, molecular modelling, codebreaking and encryption. Quantum computers could also be used as simulators to solve quantum mechanics problems.

There are several challenges to be met before a practical quantum computer can be built. These are often summarised in the DiVincenzo criteria:

• We need a scalable physical system with well defined qubits
• It must be initialisable to a simple state, such as all qubits set to 0
• It must have decoherence times much longer than the switching time (approximately 10^4 times)
• There must be a universal set of quantum gates, and
• It must permit high quantum efficiency, qubit-specific measurements.

Many approaches are being taken to meeting these criteria, summarised in the US Government Roadmap for Quantum Computing, the latest version of which was produced for ARDA in 2004 (qist.lanl.gov). They roughly divide into: nuclear magnetic resonance (NMR); trapped atom/ion; optical; and solid state.
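The appeal of quantum computers as quantum-mechanics simulators follows from the same exponential scaling: holding the full state of n entangled qubits on a classical machine takes 2**n complex amplitudes. A rough back-of-envelope sketch (assuming 16 bytes per double-precision complex amplitude):

```python
# Memory needed to store the full state vector of n qubits classically.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (10, 30, 50):
    amplitudes = 2 ** n
    memory = amplitudes * BYTES_PER_AMPLITUDE
    print(f"{n} qubits: 2**{n} = {amplitudes:.2e} amplitudes, {memory:.2e} bytes")
```

At 30 qubits the state vector already needs tens of gigabytes, and at 50 qubits around 18 petabytes – far beyond any single classical machine, while a quantum system of 50 qubits holds that state natively.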
The current state of the art illustrates how embryonic the field is: while seven qubits can be entangled in NMR, there are fundamental reasons why this cannot be extended. Only three entangled qubits have been demonstrated in the solid state, most recently in superconducting systems in 2010 [Nature 467, 513 (2010)]. The field therefore seems wide open for the best physical system to be identified.

Though it is relatively simple to define a system of qubits satisfying the first two criteria, it is not trivial to maintain quantum entanglement; to satisfy the third, the link between qubits must be maintained for long enough to carry out calculations. For the last two criteria, the hardware architecture needs to produce uniform qubits and to let the user manipulate them without disturbing the conditions needed for quantum information processing.

Cryogenics may provide a means of achieving a working quantum computer, with ultra-low temperatures preserving quantum coherence and removing thermal processes that could interfere with computations. Currently, most solid state approaches have to be tested at millikelvin temperatures, with 4.2K or lower needed for the first operational computers. Cooling classical computers to improve their performance has been proposed many times in the history of computing, but has never proved economically viable. However, we might speculate that the first quantum systems will be used in large 'supercomputer' facilities, where their immense power will justify the initial expense.

In addition, there are currently only three suitable algorithms available to make use of this potentially phenomenal power. Each was developed individually, but they offer the tantalising prospect that a more systematic approach could produce many more. Many people have speculated that this field will only really develop once suitable quantum computers are fully available.
As an example, Hitachi, together with Cambridge University, has been working on a silicon device for quantum computing: a quantum dot charge qubit. This structure, which builds on years of work on single electronics, is the first step in the development of a quantum computer based on conventional silicon technology. One approach to building a solid state quantum computer is to exploit the quantum states of artificial atoms and molecules built in semiconductor quantum dot systems; the team has demonstrated this with an isolated double quantum dot as a qubit.

The key challenges in producing efficient quantum circuits are: to achieve a sufficiently large number of operations within the characteristic coherence time of the qubits; to control the coupling between qubits to form architectures; and to integrate the qubits with manipulation and measurement circuitry. All operations (initialisation, manipulation and measurement) have been achieved, with electrical gates used for initialisation and manipulation, and a single electron transistor used for measurement. The scheme gives a very long coherence time (100 times longer than shown in other solid state implementations) and provides design flexibility, since the qubits may be combined in a variety of two-dimensional circuits, as in conventional microprocessors. It thus offers the possibility of scaling up from one device to a large quantum circuit – a necessary criterion for a useful quantum computer.

This is not the only approach, however, and many groups around the world are working on ways of implementing this exciting new technology. The field remains wide open and the next few years promise substantial developments. Although quantum computers are still at an early stage of development, small model systems have been demonstrated and the potential power and range of applications remain huge.
Indeed, many more applications may become apparent once larger working models have been built and more algorithms have been tested. Although the origins of this discipline are in highly specialised theoretical physics, the potential applications may have a dramatic impact on everyone from physicists to pharmaceutical chemists. David Williams is chief research scientist and laboratory manager with Hitachi Cambridge Laboratory.