Sunday, February 7, 2010

Quantum Computing
31 Jul 2007, by Dr. Tom Trevethan

The past 60 years have seen phenomenal growth in the power of information technology, with almost every aspect of our lives now reliant on some form of microprocessor.
Our appetite for computational speed seems to have become insatiable as we find new ways to use information processing to improve the quality of our lives. However, there are problems on the road ahead: semiconductor-based integrated circuits, the technology on which all computers are built, will soon reach the physical limits of the speed at which they can perform operations. It seems the laws of nature are going to get in the way of the relentless increases in computational power that we now demand.


All modern computers are built around microprocessors consisting of many thousands of transistors (essentially electronic switches) built into a small integrated circuit on a silicon chip. Broadly speaking, the more transistors you can fit onto a chip, and the smaller they are, the faster and more powerful the processor. In 1965 Gordon Moore, one of the founders of Intel, predicted that the complexity of integrated circuits, i.e. the number of transistors built onto a single chip, would approximately double every two years, leading to an exponential growth in computational power. Even though this prediction was made over 40 years ago, when the technology was in its infancy, Moore's law has proven surprisingly accurate. However, it cannot continue indefinitely. The latest generation of PC processors (including the one you are probably using right now) has a smallest feature size of 65 nm (nanometers, or billionths of a meter) and packs hundreds of millions of transistors onto a single chip. Over the next few years the smallest feature size is expected to drop further, with performance increasing proportionally. However, both feature size and transistor density are now approaching the atomic scale, which imposes a fundamental limit on how small we can go. When semiconductor devices reach the scale of a few nanometers they will no longer function as they do at larger scales, because quantum mechanical effects will start to influence their properties. In particular, transistors will no longer be capable of functioning as switches: the insulating layers will be so thin that the flow of electrons through them cannot be prevented.
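
To see what 'doubling every two years' amounts to, a few lines of Python make the arithmetic explicit (the starting figure below is illustrative, not a quoted specification):

# Moore's law as simple arithmetic: the transistor count doubles every
# two years, i.e. grows by a factor of 2 ** (years / 2).
def projected_transistors(count_now, years, doubling_period=2.0):
    return count_now * 2 ** (years / doubling_period)

# Illustrative only: a chip with 3e8 transistors today, projected ahead.
for years in (2, 10, 20):
    print(years, f"{projected_transistors(3e8, years):.2e}")

After just 20 years of such doubling, the count has grown a thousandfold, which is why the trend must eventually collide with physical limits.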

So what are the alternatives to the silicon chip? Whatever eventually replaces semiconductor processors, the architecture will have to change significantly. One possibility now being seriously considered is to use individual molecules as the basic components of a processor. A single small molecule can act as a transistor, or even as an entire logic gate, and could replace silicon transistors as the building block of a computational device. A single molecule could perform an operation much faster than a semiconductor device, and would also be far smaller than the minimum possible size of an equivalent semiconductor device. This new field of 'molecular electronics' is growing rapidly and relies on developments in chemistry as well as in physics and materials science. The outlook is promising; however, there are many technical obstacles to overcome before an electronic processor composed of molecular components can be fully realised, such as how to connect the molecules together and how to exercise complete control over the structure of the device.
Another, quite different, approach also employs molecules to perform a computational function. Reactant molecules are encoded with the input information, a computation is performed through a chemical reaction, and the result is read out from the product molecules. This has been implemented successfully in a 'DNA computer' to solve several complex problems, with the input information encoded in DNA sequences and a reaction producing the output sequence. The reaction happens very quickly and can process a lot of information; however, encoding the sequences and harvesting the result is an extremely laborious task that requires many stages of chemical analysis.


A much more exciting and ambitious idea is to use the quantum mechanical properties of a physical system itself to perform computation: to build a 'quantum computer'. Using quantum mechanical phenomena in this way leads to a fundamentally different logic from that employed by 'classical' computers, and promises to change the entire paradigm of information processing.

The basic principle of a quantum computer is that its fundamental entity (the counterpart of the bit in a classical computer) can exist not only in two definite states (1 or 0, on or off) but also in a quantum superposition of the two, i.e. in both states at the same time. This is the so-called quantum bit, or 'qubit'. When the system is observed (i.e. when it interacts with the external world), it 'collapses' into one of the two distinct states.
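
To make the probability rule concrete, here is a minimal classical simulation of a single qubit in Python; this is only bookkeeping of the amplitudes, not a model of any real quantum device:

import numpy as np

# A qubit state is a pair of complex amplitudes, one for |0> and one for |1>.
# An equal superposition: both measurement outcomes are equally likely.
qubit = np.array([1, 1], dtype=complex) / np.sqrt(2)

rng = np.random.default_rng()

def measure(state):
    # 'Collapse': return 0 or 1 with probability |amplitude|**2.
    probabilities = np.abs(state) ** 2
    return rng.choice(len(state), p=probabilities)

# Measuring many identically prepared qubits gives roughly 50/50 outcomes.
outcomes = [measure(qubit) for _ in range(1000)]
print(sum(outcomes) / len(outcomes))   # close to 0.5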

A collection of qubits forming a register in a quantum computer can exist in a superposition of all the register's possible states. For example, a register of eight binary qubits (a qubyte) has 2^8 = 256 distinct states. Performing a logical operation on the superposition effectively performs the operation on all of these states simultaneously, allowing a quantum computer to carry out many computations in parallel.
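
The same bookkeeping extends to the register example. The sketch below (again a classical simulation of the amplitudes) builds the 256-component state of eight qubits and puts it into an equal superposition using one Hadamard gate per qubit:

import numpy as np

n = 8                      # a register of eight qubits (a 'qubyte')
dim = 2 ** n               # 256 distinct basis states

# The Hadamard gate puts a single qubit into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# The gate acting on the whole register is the Kronecker product of
# one Hadamard per qubit: a 256 x 256 matrix.
H_register = H
for _ in range(n - 1):
    H_register = np.kron(H_register, H)

state = np.zeros(dim)
state[0] = 1.0                 # start in the definite state |00000000>
state = H_register @ state     # now a superposition of all 256 states

print(np.allclose(state, 1 / np.sqrt(dim)))   # every amplitude equals 1/16

# Any further gate is a single matrix applied to this vector: it transforms
# all 256 amplitudes at once, which is the sense in which a quantum
# operation acts on every state of the register simultaneously.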

The properties of qubits, and of the quantum computers that could be built from them, seem very strange, and have led to the development of entirely new algorithms and effectively a new form of mathematics. We have still not fully appreciated the capabilities of a working quantum computer and the problems it could be used to solve, but one example of its power could be the modelling of large quantum mechanical systems, such as biological molecules and nanostructures, which at present require huge computational power to simulate. Simulations such as these are crucial to many areas of science, including chemistry, biology and medicine.
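
The classical difficulty is easy to quantify: storing the state of n interacting two-level systems requires 2^n complex amplitudes. A quick illustration, assuming 16 bytes per double-precision complex number:

# Memory needed just to *store* the state of n two-level quantum systems.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30    # 16 bytes per complex128 amplitude
    print(f"n = {n}: {amplitudes:.2e} amplitudes, about {gib:.2e} GiB")

At n = 30 the state already needs 16 GiB; at n = 50 it needs millions, which is why even modest quantum systems overwhelm classical machines.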

Although research into the development of a quantum computer is very active, and many different potential technological solutions are being investigated, there are significant practical difficulties in realising the goal. The greatest problem facing researchers is that the qubits in a register must be able to interact with each other yet remain totally isolated from everything else; otherwise the superposition collapses, through a process called decoherence. Another problem is that each qubit must be addressed individually in order to exchange information with the computer, and since qubits are typically single atoms, electrons or photons, this exchange requires applying a small force to a very precise location. In the working quantum computers developed so far, demonstrations of quantum algorithms have only been possible with a very small number of qubits. According to David Deutsch, the pioneer of the concept of quantum computing, many hundreds of interacting qubits would be required for a 'useful' quantum computer that could tackle currently unsolvable problems.
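
Decoherence can be pictured with a standard textbook toy model. The sketch below assumes pure dephasing with a hypothetical coherence time T2 (not tied to any particular hardware), under which the off-diagonal elements of the qubit's density matrix, the parts that encode the superposition, decay away while the populations survive:

import numpy as np

# Density matrix of an equal superposition. The off-diagonal elements
# ('coherences') are what make it a superposition rather than a coin flip.
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]])

T2 = 1.0   # hypothetical coherence time, arbitrary units

def dephase(rho, t):
    # Pure dephasing: coherences decay as exp(-t / T2); the diagonal
    # populations are untouched.
    out = rho.copy()
    out[0, 1] *= np.exp(-t / T2)
    out[1, 0] *= np.exp(-t / T2)
    return out

for t in (0.0, 1.0, 5.0):
    print(t, dephase(rho, t)[0, 1])   # the coherence shrinks toward zero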

It is very difficult to predict what will eventually replace the silicon microchip, and how our computers will work in 20 or 30 years' time. But one thing is for sure: we will never be satisfied with the computational power on offer, and we will always be craving more.