When asked which technologies I believe will have the biggest impact on the future, I generally respond with the following three:

  • Artificial Intelligence (AI)
  • Blockchain
  • Quantum Computing

Thanks to science fiction movies, Artificial Intelligence (AI) is probably the best known of the three, although it is still largely misunderstood. Blockchain and Quantum Computing are more obscure and more difficult to understand, as they are less visible to the consumer.

I have previously written about Blockchain (and Cryptocurrency), therefore this article will focus on Quantum Computing.

What is Quantum Computing?

To understand Quantum Computing, you must first understand the basics of Classical Computing, which is the name given to our current computing architecture.

Classical Computers leverage transistors, which are semiconductor devices used to amplify or switch electronic signals.

Processors have a transistor count, which is the number of transistors on an integrated circuit. For example, an Intel Core i7 Skylake has 1,750,000,000 transistors.

The rate at which transistor counts have increased generally follows Moore’s law, which observes that transistor count doubles approximately every two years (although this is arguably a self-fulfilling prophecy made real by companies such as Intel, AMD, and ARM).

To achieve this growth, transistors must continue to get smaller. For example, Intel Core i7 Skylake is built using a 14nm process, which is 500 times smaller than a red blood cell.

The International Technology Roadmap for Semiconductors (ITRS) expects to see 10nm nodes by 2017. However, many experts believe the roadmap will end after 5nm (expected in 2021), where transistors will become so small that they cease to operate as expected, due to a phenomenon known as quantum tunnelling (where electrons pass through barriers that are meant to block them).

At this point, we would be operating at the quantum level, where physics works differently. Scientists are therefore trying to turn these unusual quantum properties to their advantage by building Quantum Computers.

How does Quantum Computing work?

In Classical Computing, a basic unit of information is called a bit, which is a binary digit that can have only one of two states, commonly represented as either a 0 or 1.

Quantum Computers use Qubits, which can also be set to one of two states. However, unlike a bit, a Qubit can exist in both states simultaneously, a property known as superposition.

As a result, with 3 Qubits, a Quantum Computer can represent all eight possible combinations of 0 and 1 simultaneously (000, 001, 010, 011, 100, 101, 110, 111). A 3-bit Classical Computer, by contrast, can only hold one of those combinations at a time, which is why certain calculations scale dramatically better on quantum hardware.
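To make the state counting concrete, the state of n Qubits can be modelled classically as a vector of 2^n amplitudes, one per combination of 0s and 1s. The sketch below is plain Python with no quantum libraries; the function name `uniform_superposition` is illustrative, not from any particular framework. It builds the equal superposition over 3 Qubits, the state in which all eight combinations are present at once:

```python
import math

def uniform_superposition(n_qubits):
    """Return the 2**n amplitude vector for an equal superposition,
    i.e. every basis state (000, 001, ..., 111) carries the same weight."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)  # equal amplitude on every basis state
    return [amp] * dim

state = uniform_superposition(3)
print(len(state))  # 8 basis states for 3 qubits

# A probability is the squared amplitude, and probabilities must sum to 1
total = sum(a * a for a in state)
print(round(total, 10))  # 1.0
```

Note that simulating n Qubits classically requires tracking 2^n amplitudes, which is exactly why large quantum systems quickly become impossible to simulate on Classical Computers.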

Another key property is entanglement, which describes a close connection between multiple Qubits that allows one Qubit to react instantly to a change in another, regardless of the distance between them. As a result, by reading the state of one Qubit, the state of its entangled partners can be deduced without any additional checks.

These special properties help Quantum Computers solve complex problems much faster than Classical Computers.


Quantum Computing is incredibly exciting, but also very complex. The description outlined in this article has been dramatically simplified, but hopefully provides enough context to highlight the key differences between Classical Computing and Quantum Computing.

In short, Quantum Computing has the potential to solve certain problems that are today considered impossible, with Machine Learning, Healthcare (Quantum Simulations) and IT Security among the most promising targets.