How do quantum computers work and what are they used for?

Author: Richard Flint, Research assistant, RAND Europe

How does a quantum computer work?

The basic building blocks of a quantum computer are fundamentally different to their classical counterparts. Whereas a classical computer relies on the ‘on’ or ‘off’ of an electrical current, quantum computers rely on particular properties of microscopic particles that are observed to be in only one of two possible states. Common examples include the direction of oscillation of a light wave (‘light polarisation’) or the orientation of an electron in a magnetic field (‘electron spin’).

These properties exist at the quantum level, but they are similar – at least at a basic level – to properties of real-world objects that are also binary in nature, such as a cat that can be either dead or alive, or a light bulb that can be either on or off. Both quantum and macroscopic properties can take one of two values, and can therefore be used to represent the values ‘1’ or ‘0’ in an information system.

There are, however, important differences between the properties of microscopic particles and these macroscopic equivalents. In the macroscopic world, the properties of objects exist in one state or another, but not both at the same time. A cat, for example, can be either alive or dead, but it cannot be both alive and dead at the same time (and a nearly dead cat still counts as alive!). Similarly, a light bulb can be either on or off, but it cannot be both on and off at the same time.

However, in the microscopic world of individual particles, this exclusivity of states is no longer maintained, and two binary properties can exist in a so-called ‘quantum superposition of states’. Translated to their macroscopic equivalents, in the quantum world a cat could be – at least mathematically speaking – both alive and dead at the same time, and similarly, a light bulb could be both on and off simultaneously. This is not the same as a cat that is nearly dead or a light bulb that is warming up; rather, it is two different states existing at precisely the same time.

Quantum superposition is a strange phenomenon, but it is this characteristic of the quantum world – together with the even stranger phenomenon of quantum entanglement – that fundamentally changes the nature of information stored and used by a quantum computer. Whereas a classical ‘bit’ can only take one of two possible values at any one time, a quantum bit (also known as a ‘qubit’) exists as both a 1 and a 0 simultaneously until the qubit is measured. In a simplistic way, a qubit can be considered to store twice as much information as a classical bit, as it can represent two values at once.
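For readers who prefer to see the idea written down, the snippet below is a minimal sketch – using Python and NumPy purely for illustration, not as a description of real quantum hardware – of how a qubit in superposition is represented mathematically: two amplitudes, one for each possible value.

```python
import numpy as np

# A classical bit is definitely 0 or definitely 1.
classical_bit = 0

# A qubit is described by two complex 'amplitudes', one for the value 0 and
# one for the value 1. An equal superposition gives both an amplitude of
# 1/sqrt(2), so neither value is 'the' value until a measurement is made.
qubit = np.array([1 / np.sqrt(2), 1 / np.sqrt(2)], dtype=complex)

# The squared magnitudes of the amplitudes give the probability of observing
# each value when the qubit is measured; here 0 and 1 are equally likely.
print(np.abs(qubit) ** 2)  # [0.5 0.5]
```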

Moreover, when two or more qubits are brought together into a single system, these binary states can all exist simultaneously in what becomes an increasingly large number of simultaneous possibilities. If two qubits are brought together, for example, then they can simultaneously exist in all four possible states at the same time (0 and 0, 0 and 1, 1 and 0, and 1 and 1), whereas a classical pair of ‘bits’ can only be in one of these states at a time (0 and 0, 0 and 1, 1 and 0, or 1 and 1).
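Extending the same sketch, the combined state of two qubits is built from the individual qubits by a tensor product, which is where the four simultaneous possibilities come from (again, an illustrative calculation rather than a picture of real hardware):

```python
import numpy as np

# One qubit in an equal superposition of 0 and 1.
plus = np.array([1 / np.sqrt(2), 1 / np.sqrt(2)], dtype=complex)

# The joint state of two such qubits is the tensor (Kronecker) product of the
# individual states: four amplitudes, one for each of 00, 01, 10 and 11.
two_qubits = np.kron(plus, plus)
print(two_qubits.shape)         # (4,)
print(np.abs(two_qubits) ** 2)  # [0.25 0.25 0.25 0.25] - all four possibilities at once
```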

And why is this useful?

A quantum computer based on qubits – as opposed to classical bits – is able to store and manipulate significantly more information than its classical counterpart. Moreover, as the number of qubits increases, the number of simultaneous states grows exponentially according to 2^N (‘2 to the power N’). For a classical computer to store the same amount of information as a quantum computer, each of these states must be recorded as a separate piece of memory, which becomes prohibitively large after only a relatively small number of qubits.

A quantum computer based on just 10 particles, for example, can store 2^10 (just over 1,000) different states at once, whereas a classical computer requires approximately 65,000 classical bits (0.008 megabytes) of memory to store the same amount of information. This is possible even for a basic desktop computer, but by 27 particles around 1 gigabyte of classical memory is required; by 40 particles several terabytes of memory are required; and at 70 particles several billion terabytes of classical memory are required to store the same amount of information as the quantum computer.
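These figures follow directly from the 2^N scaling if each of the 2^N amplitudes is assumed to take 64 bits (8 bytes) of classical memory – an assumption of mine, but one that reproduces the numbers quoted above:

```python
# Classical memory needed to hold the full state of an N-qubit system,
# assuming each of the 2^N amplitudes is stored as 8 bytes (64 bits).
for n in (10, 27, 40, 70):
    states = 2 ** n
    megabytes = states * 8 / 1e6
    print(f"{n} qubits: 2^{n} = {states:,} states, about {megabytes:,.3f} MB")

# 10 qubits: ~0.008 MB; 27 qubits: ~1 GB; 40 qubits: ~8.8 TB;
# 70 qubits: ~9.4 billion TB.
```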

There is, however, a slight catch. The information in a quantum computer is not accessible in the same way as information stored in a classical computer. Even though a small number of qubits can represent an enormous number of different states, only one of these states can be ‘observed’ or ‘accessed’ at any one time, and it is impossible to predict which state will be observed until the measurement is actually made.
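The sketch below illustrates this catch under the same simplifying assumptions as before: the two-qubit state holds four amplitudes, but reading it out yields just one of the four outcomes, chosen at random with probabilities given by the squared amplitudes.

```python
import numpy as np

rng = np.random.default_rng()

# Two qubits in an equal superposition of 00, 01, 10 and 11.
state = np.full(4, 1 / 2, dtype=complex)

# Measurement 'collapses' the superposition to a single outcome; which one is
# observed is random, weighted by the squared magnitudes of the amplitudes.
outcome = rng.choice(["00", "01", "10", "11"], p=np.abs(state) ** 2)
print(outcome)  # one of '00', '01', '10' or '11', unpredictable in advance
```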

This probabilistic nature of quantum computing means that quantum computers are unlikely – at least in the near future – to match classical computers in the majority of calculations that we currently perform. They are, however, particularly good at performing certain types of tasks where a large number of possibilities must be modelled simultaneously.

Perhaps the most obvious example is the simulation of quantum systems themselves, which at present requires huge amounts of classical computing power to model even the most basic systems. A group of electrons, for example, becomes impossible to model exactly once it contains more than around 50 particles, but a quantum computer could – at least in theory – model such a system on a one-to-one basis.

This has applications not only in the abstract realms of physics, but also in tangible areas such as modelling the dynamics of molecules used in advanced drug discovery, and understanding the mechanisms behind photosynthesis in plants, which may be used to capture and store energy more efficiently from both natural and artificial light.

At a policy level, more immediate concerns stem from the ability of quantum computers to run so-called quantum algorithms, which are similar in design to their classical counterparts except that they manipulate large matrices of information as opposed to single numerical inputs. Whereas a classical computer requires a huge amount of computing power to use these matrices, a quantum system is able to store and manipulate this information using the physical properties of the particles themselves.
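To make the ‘matrices of information’ point concrete (again as an illustrative sketch rather than a description of real hardware): each quantum operation corresponds mathematically to a matrix applied to the vector of amplitudes, and on N qubits that matrix has 2^N rows and columns – something a classical simulation must hold explicitly, but which quantum hardware realises directly in the physics of its particles.

```python
import numpy as np

# The Hadamard operation: a 2x2 matrix that turns a definite 0 (or 1) into an
# equal superposition of both values.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

zero = np.array([1, 0], dtype=complex)  # a qubit definitely holding the value 0
print(H @ zero)                         # amplitudes of ~0.707 for both 0 and 1

# Applied to 3 qubits at once, the corresponding matrix is already 8 x 8; on N
# qubits it is 2^N x 2^N, which is why classical simulation becomes so costly.
H_on_three_qubits = np.kron(np.kron(H, H), H)
print(H_on_three_qubits.shape)          # (8, 8)
```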

Quantum algorithms have a range of positive applications, such as quickly searching arbitrarily large and unstructured databases, finding the shortest route between cities, and developing more efficient machine learning and artificial intelligence systems. There are, however, more negative applications of quantum algorithms. In particular, Shor’s algorithm is able to quickly factorise large numbers and break the encryption that currently protects the majority of the world’s data, including anything from credit card and financial systems to state secrets.
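Shor’s algorithm itself only needs the quantum computer for one step – finding the ‘period’ of a mathematical pattern – after which the factors fall out of ordinary classical arithmetic. The sketch below shows that classical part; the period is found here by slow brute force, which is precisely the step a quantum computer is expected to do exponentially faster (the function name and example numbers are mine, chosen for illustration).

```python
from math import gcd

def factor_from_period(N, a):
    """Classical post-processing used in Shor's algorithm: given N and a base a
    (with gcd(a, N) == 1), find the period r of a^x mod N and derive factors."""
    # Brute-force period finding - the only step that needs a quantum computer.
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2 != 0:
        return None                 # need an even period; try a different a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None                 # trivial case; try a different a
    return gcd(x - 1, N), gcd(x + 1, N)

print(factor_from_period(15, 7))    # (3, 5)
```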

Although it is expected to be at least 10 years until a sufficiently powerful quantum computer is developed that poses a real risk to information security, government agencies and private companies alike are racing to develop new forms of quantum-safe cryptography that protect against quantum algorithms such as Shor’s algorithm.

Yet, with teams at Google, IBM, Microsoft, Intel and others all attempting to build the first quantum computer to achieve so-called quantum supremacy, it is important for policymakers to think ahead now and prepare for the significant changes that may arise from the development of this new technology.