How Will Quantum Computing Change The World?
Firstly, I would like to explain the concept of ‘quantum computing’.
What is quantum computing?
Quantum computing takes advantage of the ability of subatomic particles to exist in more than one state at any time. Because of the way these tiniest of particles behave, certain operations can be carried out much more quickly, and with less energy, than on classical computers.
In classical computing, a bit is a single piece of information that can exist in one of two states: 1 or 0. Quantum computing uses quantum bits, or qubits. These are quantum systems with two basic states. However, unlike an ordinary bit, a qubit can hold much more information than just a 1 or a 0, because it can exist in any superposition of these values.
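The difference between a bit and a qubit can be sketched in a few lines. This is only an illustration of the mathematics, not a simulation of real hardware: a qubit's state is a pair of amplitudes whose squared magnitudes give the measurement probabilities.

```python
import numpy as np

# A classical bit is exactly one of two values.
classical_bit = 0  # or 1

# A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measuring it gives 0 with probability |a|^2 and 1 with probability |b|^2.
qubit = np.array([1, 1]) / np.sqrt(2)  # an equal superposition of 0 and 1

probabilities = np.abs(qubit) ** 2
print(probabilities)  # [0.5 0.5] -- either outcome is equally likely
```

Until the qubit is measured, both amplitudes are present at once; measurement forces one of the two outcomes.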
Quantum computing is computing using quantum-mechanical phenomena, such as superposition and entanglement.
Now let’s break these two concepts down:
Now what exactly is superposition?
Superposition is the ability of a quantum system to be in multiple states at the same time until it is measured.
Because the concept is difficult to understand, this essential principle of quantum mechanics is often illustrated by an experiment carried out in 1801 by the English physicist, Thomas Young. Young’s double-slit experiment was intended to prove that light consists of waves. Today, the experiment is used to help people understand the way that electrons can act like waves and create interference patterns.
For this experiment, a beam of light is aimed at a barrier with two vertical slits. The light passes through the slits and the resulting pattern is recorded on a photographic plate. When one slit is covered, the pattern is what would be expected: a single line of light, aligned with whichever slit is open.
Intuitively, one would expect that if both slits are open, the pattern of light will show two lines of light aligned with the slits. In fact, what happens is that the pattern on the photographic plate separates into multiple lines of lightness and darkness in varying degrees.
This result shows that interference is taking place between the waves going through the slits, along what should, seemingly, be two non-crossing trajectories. Each photon not only goes through both slits; it simultaneously takes every possible trajectory en route to the photographic plate.
And what exactly is entanglement? Quantum entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated, or interact, in such a way that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance; instead, a quantum state must be described for the system as a whole.
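The phrase "cannot be described independently" has a precise mathematical form that can be checked numerically. In this sketch (an illustration, not hardware), two independent qubits combine via the tensor product, so an unentangled two-qubit state always factors into two single-qubit states; the Bell state, a standard textbook example of entanglement, does not.

```python
import numpy as np

# Two independent qubits combine via the tensor (Kronecker) product, so any
# unentangled two-qubit state factors as kron(q1, q2).
q0 = np.array([1, 0])                   # the state |0>
plus = np.array([1, 1]) / np.sqrt(2)    # equal superposition of |0> and |1>
product_state = np.kron(plus, q0)       # factorises by construction

# The Bell state (|00> + |11>)/sqrt(2) cannot be written as kron(q1, q2):
# a product state has amplitudes [ac, ad, bc, bd], and no choice of a, b, c, d
# gives zeros for |01> and |10> while keeping |00> and |11> non-zero.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Equivalent test: reshape the 4 amplitudes into a 2x2 matrix.
# Rank 1 means the state factorises; rank 2 means it is entangled.
print(np.linalg.matrix_rank(product_state.reshape(2, 2)))  # 1 -> not entangled
print(np.linalg.matrix_rank(bell.reshape(2, 2)))           # 2 -> entangled
```

The rank test is a compact way of asking "can this joint state be split into two separate descriptions?", which is exactly the question entanglement answers in the negative.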
Now that I have explained these two phenomena, I can show how quantum computing will change the world.
Firstly, we need to clear up a misconception about quantum computing: we have become so accustomed to advances in computing being reflected in slimmer, faster laptops and bigger memories that quantum computing is often envisaged in the same terms. It shouldn’t be.
Digital computers manipulate information encoded in binary form as sequences of ones and zeros; the rest is software, whether that involves converting keystrokes or mouse movements into images, or taking numbers and feeding them into an equation to work out the answer.
Quantum computers are no different, except in one crucial respect. In a conventional computer, one bit of binary data can have one of just two values: one or zero. But in a quantum computer, these switches, called quantum bits or qubits, have more options, because they are governed by the laws of quantum theory.
Thanks to superposition, qubits can, in effect, encode one and zero at the same time. As a result, quantum computers can represent many more possible states of binary ones and zeros. A classical bit can represent two states: zero and one. Add a bit to your computer’s processor and you can encode one more piece of binary information. Yet if a group of qubits is placed in a joint superposition, called an entangled state, each additional qubit doubles the encoding capacity. By the time you get to 300 qubits – as opposed to the billions of classical bits in the dense ranks of transistors in your laptop’s microprocessors – you have 2³⁰⁰ possible states. That’s more than the number of atoms in the known universe.
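The doubling is easy to verify with a little arithmetic: n entangled qubits span 2 to the power n basis states.

```python
# Each extra entangled qubit doubles the encoding capacity:
# n qubits span 2**n basis states in superposition.
for n in (1, 2, 10, 300):
    print(f"{n} qubits -> {2 ** n} basis states")

# 2**300 is a 91-digit number (roughly 2 x 10**90), comfortably more than
# the commonly quoted estimate of ~10**80 atoms in the observable universe.
print(len(str(2 ** 300)))  # 91
```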
Quantum computers have largely been advertised on the promise that they will be vastly faster at crunching through calculations than even the most powerful of today’s supercomputers. This speed-up – immensely attractive to scientists and analysts solving complex equations or handling massive data sets – was made explicit in 1994 when the American mathematician Peter Shor showed in theory that a computer juggling coherent qubits would be able to factor large numbers much more efficiently than classical computers. Reducing numbers to their simplest factors – decomposing 12 to “two times two times three”, for example – is an exercise in elementary arithmetic, yet it becomes extremely hard for large numbers because there’s no shortcut to trying out all the possible factors in turn. Factorising a 300-digit number would take current supercomputers hundreds of thousands of years, working flat out.
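The "no shortcut" character of classical factorisation can be seen in a direct implementation of trial division, the elementary method described above (a sketch for illustration, not a state-of-the-art factoring routine):

```python
def trial_factorise(n):
    """Factor n by trying every candidate divisor in turn."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime factor as it is found
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains is itself prime
        factors.append(n)
    return factors

print(trial_factorise(12))  # [2, 2, 3] -- "two times two times three"
```

The loop runs on the order of the square root of n steps: trivial for 12, but for a 300-digit number that square root still has around 150 digits, which is why trial division (and every known classical method) becomes hopeless at that scale.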
For this reason, a lot of data encryption – such as when your credit card details are sent to verify an online purchase – uses codes based on factors of large numbers, which no known computer can crack. Yet Shor showed that a quantum factorisation algorithm could find factors much more efficiently than a classical one can. As well as factorisation, quantum computation should be able to speed up database searches – and there’s no question how useful that would be, for example in combing through the masses of data generated in biomedical research on genomes.
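The database-search speed-up mentioned here is usually attributed to Grover's quantum search algorithm; the scaling below is the standard textbook figure, shown purely to illustrate the size of the gap, not a simulation of the algorithm itself.

```python
import math

# Unstructured search: a classical computer must check on the order of N
# entries in the worst case, while Grover's algorithm needs on the order
# of sqrt(N) quantum queries.
def classical_queries(N):
    return N

def grover_queries(N):
    return round(math.sqrt(N))

for N in (10 ** 6, 10 ** 12):
    print(f"{N} entries: ~{classical_queries(N)} classical vs "
          f"~{grover_queries(N)} quantum queries")
```

For a trillion-entry database the gap is a factor of a million, which is why genome-scale biomedical datasets are often cited as a natural target.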
One of the likely first big applications of quantum computing isn’t going to set the world of personal computing alight, but it could transform an important area of basic science. Computers operating with quantum rules were first proposed in 1982 by the American physicist Richard Feynman. He wasn’t concerned with speeding up computers, but with improving scientists’ ability to predict how atoms, molecules and materials behave using computer simulations. Atoms obey quantum rules, but classical computers can only approximate these in cumbersome ways: predicting the properties of a large drug molecule accurately, for example, requires a state-of-the-art supercomputer.
Quantum computers could hugely reduce the time and cost of these calculations. In September, researchers at IBM used the company’s prototype quantum computer to simulate a small molecule called beryllium dihydride. A classical computer could, it’s true, do that job without much trouble – but the quantum computer doing it had just six qubits. With 50 or so qubits, these devices would already be able to do things beyond the means of classical computers.
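The reason 50 or so qubits is a threshold can be made concrete. Simulating n qubits on a classical machine means storing 2 to the power n complex amplitudes, which is the blow-up Feynman pointed to; the back-of-envelope figures below assume the common convention of 16 bytes per complex number.

```python
# Memory needed to store the full state of n qubits on a classical computer,
# assuming 16 bytes (two 64-bit floats) per complex amplitude.
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

print(statevector_bytes(6))              # 1024 bytes: the 6-qubit case is trivial
print(statevector_bytes(50) // 2 ** 40)  # 16384 TiB: far beyond any real machine
```

A six-qubit molecule simulation fits in a kilobyte, which is why a classical computer handles beryllium dihydride "without much trouble"; at 50 qubits the same bookkeeping needs thousands of terabytes.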
To conclude, in this essay I have outlined what a quantum computer is and how it can change the world. The last brief point I would like to address is that, though a quantum computer is on the way, there are many questions about how long it could take.
One big difficulty is dealing with errors. Given the difficulty of keeping qubits coherent and stable, errors seem inevitable: qubits are sure to flip accidentally now and again, such as a one changing to a zero, or a state getting randomised. Dealing with errors in classical computers is straightforward: you just keep several copies of the same data, so that faulty bits show up as the odd one out. But this approach won’t work for quantum computing, because it’s a fundamental and deep property of quantum mechanics that making copies of unknown quantum states (such as the states of qubits over the course of a computation) is impossible. Developing methods for handling quantum errors has kept an army of researchers busy over the past two decades. It can be done, but a single error-resistant qubit will need to be made from many individual physical qubits, placing even more demands on the engineering.
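The classical "several copies" strategy is just a repetition code with majority voting, which can be sketched in a few lines. (The point of the passage is that quantum states cannot be copied like this, by the no-cloning theorem, so quantum error correction has to protect information indirectly instead.)

```python
from collections import Counter

# Classical repetition code: store several copies of a bit and let the
# majority outvote any single flipped copy.
def majority(copies):
    return Counter(copies).most_common(1)[0][0]

stored = [1, 1, 1]       # three copies of the bit 1
stored[0] = 0            # one copy flips accidentally
print(majority(stored))  # 1 -- the faulty copy is the odd one out
```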
You can only access the opportunities that a quantum computer holds if all the qubits are mutually dependent: in a collective or “coherent” state, which, crudely speaking, means that if we do something to one of them (say, flip a one to a zero), all the others “feel” it. Generally, this requires all the qubits to be placed and maintained in an entangled state.
The difficulty of making a quantum computer mostly involves creating and sustaining these coherent states of many qubits. Quantum effects such as superposition and entanglement are delicate and easily disrupted. The jangling atomic motions caused by heat can wash them away. So, to stay coherently entangled, qubits must be cooled to extremely low temperatures – we’re typically talking less than a degree above absolute zero (about -273°C) – and kept well isolated from the laboratory environment: that is, from the very equipment used to manipulate and measure them. That’s partly why the IBM quantum computer I saw is so bulky: much of it consists of cooling equipment and insulation from the lab environment.