State of Quantum Computing in 2021 EOY for Business Leaders

Spare yourself the trouble and delay learning anything about quantum computing until the end of 2021 unless you are working on:

  • a problem that is not solvable in reasonable time with current computers (e.g. building deep artificial neural networks with millions of layers or simulating molecular interactions). Such problems are common and almost all Fortune 500 companies could benefit from quantum computers
  • cryptography, working at an intelligence agency, or transmitting nation- or mega-corporation-level secrets
  • quantum computing (sorry, had to be MECE)

If you are in one of these fields, quantum computing has the potential to transform your field within a few years. If not, check back in 2022; the technology may have progressed to the point where others also need to learn about quantum computing.

As a non-technical corporate leader, what should I do about quantum computing?

If you are working on cryptography, working at an intelligence agency, or transmitting nation- or mega-corporation-level secrets, start moving away from cryptographic methods that depend on the difficulty of factoring large integers. Quantum-ready alternatives already exist, as we discuss in the use cases section.

If you are working on a problem that is not solvable in reasonable time with current computers, start exploring quantum computing. This article explains quantum computing and the ecosystem around it so you can find partner companies that can help you explore how quantum computing could solve your computing challenges. Do not expect immediate results, though. Even though quantum computing companies cite many large companies among their customers, these engagements tend to be proofs of concept (PoCs) with limited scope. Quantum supremacy (quantum computing outperforming classical computing) has not yet been demonstrated for any practical application.

What is Quantum Computing?

Quantum computing is a form of non-classical computation that has been proven, in theory, to be superior to classical computation for certain important problems such as factoring large integers.

Classical computation is the foundation of almost all computing done by machines today, including cloud computing and the computing on your desktop, mobile or wearable device. We say “almost all” because researchers also run quantum computing experiments, which should be classified as computing too. Classical computation relies on deterministic bits which have observable states of 0 or 1. As with classical (or Newtonian) physics, this is an intuitive and practical approach. However, it is not efficient at modelling quantum phenomena or dealing with probabilities.

As you may remember from your high school years, Newton’s formulas are quite accurate for macro particles moving at speeds significantly slower than the speed of light. However, such a classical (or Newtonian) view of the world is not accurate at the molecular level or at speeds close to the speed of light. This is because all matter displays wave-like properties and behaves non-deterministically. Modelling such systems with bits that themselves display wave-like properties is more efficient.

The capability to model molecules or particles moving close to the speed of light may seem interesting but of little practical use. In fact, it is extremely useful:

  • Modelling molecule-level interactions can unlock new insights in chemistry, biology and healthcare.
  • Quantum computing is effective at modelling probabilities and permutations because quantum mechanics is non-deterministic: the certainty of classical physics is replaced with probabilities. This could allow quantum computers to break RSA, possibly the most widely used cryptographic method. You rely on RSA, for example, as you transmit your credit card information to an online merchant (see the sketch below).
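
As a loose illustration of why factoring matters here, the toy Python sketch below (with deliberately tiny, insecure numbers chosen only for demonstration) builds an RSA-style key pair and shows that anyone who can factor the public modulus can recover the private key. Shor’s algorithm is the quantum algorithm expected to make such factoring feasible at real key sizes.

```python
# Toy illustration only (not real cryptography): RSA security rests on the
# difficulty of factoring the public modulus n = p * q. With tiny primes we can
# factor n by trial division and recover the private key; Shor's algorithm would
# let a sufficiently large quantum computer do the same for realistic key sizes.

def toy_rsa_keys(p: int, q: int, e: int = 17):
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)                  # private exponent: inverse of e mod phi
    return (n, e), d

def factor_by_trial_division(n: int):
    """Classically factor a small modulus; hopeless for 2048-bit RSA moduli."""
    for candidate in range(2, int(n ** 0.5) + 1):
        if n % candidate == 0:
            return candidate, n // candidate
    raise ValueError("no factors found")

public, private = toy_rsa_keys(p=61, q=53)
n, e = public
ciphertext = pow(42, e, n)               # "encrypt" the message 42

# An attacker who can factor n recovers the private key and decrypts:
p_found, q_found = factor_by_trial_division(n)
d_recovered = pow(e, -1, (p_found - 1) * (q_found - 1))
print(pow(ciphertext, d_recovered, n))   # -> 42
```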

There are various approaches to building quantum computers, and photonics is one of them. For a more visual representation of how a photonic quantum computer works, you can check out this video from one of the vendors:

 

Why is quantum computing relevant now?

Quantum computers are hypothesized to be superior to classical computers for certain important problems. Recent developments suggest that this could become a reality, even though timelines for scientific progress are hard to estimate.

  • There have been significant scientific advances in the field:
    • In 2019, Google claimed to have demonstrated quantum supremacy. The benchmark was sampling the outcome of a random sequence of quantum logic gates, which is the task where quantum computers arguably have the largest advantage over classical computing. While this is not a commercially useful computation, the fact that a quantum computer surpassed state-of-the-art classical computers is still an important milestone.
    • In 2015, Google claimed that quantum annealers had performed orders of magnitude more efficiently on some optimization problems than simulated annealing running on classical computers
  • The state space a quantum computer can work with grows exponentially with each added qubit, rather than linearly as with classical bits, because qubits can hold multiple states during computation. For example, 1 qubit corresponds to 2 classical states, 2 qubits to 4, 3 qubits to 8, and so on (see the sketch after this list). This makes exponential growth in quantum computing power feasible.
  • There is significant investment in this space:
    • Most mega tech companies such as Fujitsu, Google and IBM have been investing in quantum computing. In addition, startups such as D-Wave Systems raised >$200m to tackle the problem.
    • The number of qubits in quantum computers has been increasing dramatically, from 2 qubits in 1998 to 128 qubits in 2018
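
As a rough sketch of the exponential growth mentioned above (our own illustration, not vendor code), the Python snippet below counts how many complex amplitudes a classical machine would need to store to describe an n-qubit state, compared with the linear growth of classical bits.

```python
# Describing an n-qubit quantum state classically requires 2**n complex
# amplitudes, so the memory needed doubles with every added qubit, while
# classical memory only grows linearly with the number of bits.
for n_qubits in (1, 2, 3, 10, 20, 30):
    amplitudes = 2 ** n_qubits              # size of the quantum state space
    memory_bytes = amplitudes * 16          # one complex128 amplitude = 16 bytes
    print(f"{n_qubits:>2} qubits: {amplitudes:>13,} amplitudes "
          f"(~{memory_bytes:,} bytes to simulate classically) vs {n_qubits} classical bits")
```

Around 30 qubits the state vector already needs on the order of 17 GB of memory, which gives some intuition for why simulating larger quantum computers on classical hardware becomes impractical.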

How does it work?

Quantum computing allows developers to leverage laws of quantum mechanics such as quantum superposition and quantum entanglement to compute solutions to certain important problems faster than classical computers. As usual, we kept this as simple as possible; even so, this was the hardest “how it works” section we have ever written!

Qubits, the bits of quantum computers, have two advantages over classical bits: they can hold more than one state during computation, and two or more qubits can be entangled (i.e. set to the same state regardless of their location).

Classical computing relies on bits for memory. A bit can be in either the 0 or the 1 state, typically represented physically as a voltage level.

Qubits are the memory units of quantum computers. Just like bits in classical computing, they can be observed in two states: 0 and 1. However, they are subject to quantum mechanics, and when they are not observed they hold probabilities of being 0 and of being 1. These probabilities (probability amplitudes, to be precise) can be negative, positive or complex numbers and can be added together, or “superposed”. This is like adding waves in classical physics and allows a single qubit to carry more information during a computation than a classical bit.
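
A minimal sketch of this, assuming the standard state-vector picture (our own illustration using plain NumPy rather than any particular quantum SDK): a single qubit is represented by two complex amplitudes, and applying a Hadamard gate to the 0 state produces an equal superposition.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)        # qubit starts in the |0> state
hadamard = np.array([[1, 1],
                     [1, -1]], dtype=complex) / np.sqrt(2)

superposed = hadamard @ ket0                  # amplitudes: [0.707..., 0.707...]
probabilities = np.abs(superposed) ** 2       # squared magnitudes -> [0.5, 0.5]
print(superposed, probabilities)
```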

The other advantage of qubits is quantum entanglement, which links a group of qubits so that they share the same state; the qubits retain this equivalence until they are disentangled.
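
Continuing the same state-vector sketch (again a NumPy illustration, not production code), the snippet below prepares the textbook Bell state with a Hadamard followed by a CNOT gate; the two qubits end up perfectly correlated, which is the “same state” behaviour described above.

```python
import numpy as np

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0                                 # two qubits starting in |00>

hadamard = np.array([[1, 1],
                     [1, -1]], dtype=complex) / np.sqrt(2)
identity = np.eye(2, dtype=complex)
cnot = np.array([[1, 0, 0, 0],                 # flips the second qubit when the
                 [0, 1, 0, 0],                 # first qubit is 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

bell = cnot @ np.kron(hadamard, identity) @ ket00
print(bell)   # ~0.707 amplitude on |00> and |11>, zero on |01> and |10>
```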

Though qubits are probabilistic by nature, they return a single, classical state when measured. Therefore, in most quantum computers a series of quantum operations is performed before measurement. Since measurement reduces a probabilistic result to a deterministic one, the computation usually has to be repeated many times to recover the quantum computer’s underlying probabilistic result.
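
To illustrate why repetition is needed (a simplified NumPy sketch of sampling, not how any particular machine is programmed), each simulated measurement below returns a single 0 or 1, and only the aggregate over many runs, or “shots”, reveals the underlying 50/50 distribution of an equal superposition.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

amplitudes = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition
probabilities = np.abs(amplitudes) ** 2                     # true distribution: [0.5, 0.5]

shots = 1000
outcomes = rng.choice([0, 1], size=shots, p=probabilities)  # one classical bit per run
print("estimated distribution:", np.bincount(outcomes) / shots)   # ~[0.5, 0.5]
```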

Qubits can be implemented using various quantum phenomena. This is an area of active research, no mature solution exists yet, and different quantum computers use different qubit implementations.

What are its potential use cases/applications?

Primary applications include optimization & research in various industries, cryptography and espionage. Feel free to read our article on quantum computing applications for more.

How will quantum computing change AI?

Quantum computing and AI are two of the most hyped technologies of today. Combining them naturally raises sceptical eyebrows, as both fields have numerous sceptics doubting their potential. The sceptics are correct in that quantum computing is still a field of research and is a long way from being applied to neural networks. However, within a decade AI could run into another plateau due to insufficient computing power, and quantum computing could rise to help AI advance.

-----------------------------------------------------------------------------
Source: https://research.aimultiple.com/quantum-computing/