
The Rush to Develop Quantum Computers

The tech industry loves buzzwords like “big data” and “the cloud.” Unfortunately, such terms are often unclear to people outside the engineering community. “Quantum computing” is one of them.

Digital electronic computers are built from transistors; in simple terms, they are very fast calculators that process a sequence of bits (values of 0 and 1 representing the on and off states of semiconductor switches) by following a prearranged set of instructions. Fundamentally, such a computer does one thing at a time, even when it appears to juggle many tasks at once.

Then there’s “quantum,” which refers to quantum mechanics: the study of how subatomic particles move and interact, a realm where the familiar laws of classical physics no longer strictly apply. There, particles can exist in more than one state at a time, a phenomenon known as superposition.

Instead of the bits that conventional computers use, a quantum computer employs quantum bits, or qubits. While a normal bit can only be a 0 or a 1, a qubit can be placed in a superposition, representing 0 and 1 at the same time. This means a computer using qubits can encode an enormous amount of information while using far less energy. Better still, the power of the system grows exponentially as you add qubits: each additional qubit doubles the number of states the machine can represent.
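To make that doubling concrete, here is a tiny Python sketch. It is ordinary classical arithmetic, not quantum hardware; it only illustrates the bookkeeping, namely that an n-qubit register spans 2 to the power n basis states:

```python
def state_size(n_qubits: int) -> int:
    """Number of basis states an n-qubit register spans.

    Each added qubit doubles this count, which is why the
    capacity of a quantum register grows exponentially.
    """
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(n, "qubits ->", state_size(n), "basis states")
# 1 qubit  -> 2 basis states
# 10 qubits -> 1024 basis states
```

At 50 qubits the count already exceeds a quadrillion, which hints at why even modest qubit counts interest researchers.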

Unlike standard computers, quantum computers don’t examine all possible solutions to a problem. Instead, they use algorithms that steer the computation away from paths leading to wrong answers. But these algorithms only work for certain problems. One such task is factoring very large numbers. “Factors” are the numbers you multiply to get another number. For instance, the factors of 15 are 3 and 5. Twelve can be factored as 1×12, 2×6, or 3×4. Easy, yes? Well, not so easy when the number has hundreds of digits.
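To see why size matters, here is a minimal Python sketch of the most naive classical approach, trial division. (This is an illustration of the brute-force baseline, not the algorithm any quantum computer runs.)

```python
import math

def smallest_factor(n: int) -> int:
    """Find the smallest factor (other than 1) of n by trial division.

    The loop may run up to sqrt(n) times, so the work grows
    exponentially with the number of digits in n: every two
    extra digits multiply the worst-case effort by ten.
    """
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    return n  # no smaller factor found: n is prime

print(smallest_factor(15))     # -> 3
print(smallest_factor(91))     # -> 7
print(smallest_factor(65537))  # prime, so the loop runs to sqrt(n)
```

Classical algorithms far cleverer than this exist, but all known ones still slow down dramatically as the digits pile up, and that gap is exactly what quantum factoring threatens to close.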

If you are familiar with cryptography, these promising algorithms may look less like good news and more like a red flag. Important cryptographic schemes, RSA chief among them, depend critically on the fact that factoring a large number into its prime components takes an ordinary computer an impractically long time.
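The link between factoring and RSA can be shown with a deliberately tiny, insecure Python sketch. The primes 61 and 53 are common textbook toy values (real RSA uses secret primes hundreds of digits long); anyone who can factor the public modulus back into its two primes can rebuild the private key:

```python
p, q = 61, 53            # the secret primes (toy-sized for illustration)
n = p * q                # public modulus: 3233; factoring n reveals p and q
e = 17                   # public exponent
phi = (p - 1) * (q - 1)  # computable only if you know p and q
d = pow(e, -1, phi)      # private exponent: modular inverse of e (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)          # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)        # decrypt with the private key d
assert recovered == message
```

A machine that factors n quickly therefore breaks the cipher outright, which is why fast quantum factoring alarms cryptographers.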

To cope, the National Security Agency (NSA) is planning a transition to quantum-resistant algorithms and expects U.S. national security vendors to begin overhauling their encryption to guard against the threat posed by quantum computers.

But don’t reach for the antacids just yet. Building a large-scale, accurate quantum computer is about as complex as an engineering feat can get. Because particles lose superposition with the slightest interference, quantum computers must be totally isolated from the outside world; a very slight temperature change, noise or vibration will knock them out of the superposition state. Experts say it will be 2030 before quantum computers make today’s encrypted data vulnerable.

Positive Applications

In pharmaceutical research, where the interactions between proteins and chemical molecules determine whether medicines will cure disease, quantum computers could analyze the huge number of combinations needed to make progress. Since quantum computers can evaluate multiple candidates simultaneously, researchers could identify viable drug options more quickly.

Artificial intelligence (AI) is also well suited to quantum computing, since machine-learning models generally become more accurate the more data they are trained on. Quantum computers can rapidly analyze enormous quantities of data, so they could significantly shorten the AI learning curve.

Other potential applications in the quantum computing sweet spot include weather forecasting and climate-change tracking. In traffic control for self-driving cars, qubit-based computers could scan massive databases concurrently, quickly calculating optimal routes to enable more efficient scheduling and reduce congestion.

Murray Slovick