Primer - Quantum Computing
/ Chris Collins / 12.10.2018 /
Continuing the Primer Series, I wanted to cover an emerging wave in computer processing capability that could have significant ramifications across industries. In short, quantum computing will allow machines to perform advanced calculations in a fraction of the time that current computing technology requires.

In traditional computing, data is composed of zeros (0) and ones (1), which are called bits. This data is processed to produce an output. Computers today use these zeros and ones to handle math computations, encryption, graphics, etc. While they do these tasks very well, they are limited in their ability to conduct complex computations. Computers depend on conventional transistors and can only improve computational ability to the extent that we can fit more transistors onto a chip by shrinking each one. Chipmakers are beginning to hit the limit of how far they can do this, likely marking the end of Moore’s Law.
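To make the bit picture concrete, here is a minimal Python sketch (purely illustrative, with names of my own choosing) of a classical register: at any instant, n bits hold exactly one of their 2^n possible values, which is why classical speedups have come from cramming in more transistors rather than from a richer notion of state.

```python
# A classical n-bit register holds exactly one of its 2**n possible
# bit patterns at any instant.
n = 8
register = 0b01100101            # one concrete value out of 2**8 = 256
possible_values = 2 ** n

print(f"{n}-bit register currently holds: {register:0{n}b}")
print(f"Distinct values it could hold (one at a time): {possible_values}")
```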
Quantum Mechanics
To build a better foundation for understanding how quantum computing works, we first need a sense of what atomic particles are and how they behave. Quantum mechanics describes how particles behave at the atomic and subatomic level in relation to energy. At the atomic scale, the classical laws of physics don’t apply – to whet your appetite, below are a couple of revelations by physicists over the years that defy conventional knowledge:
Quantum entanglement – When particles physically interact and remain connected so that actions performed on one affect the other, even when separated by great distances (see the sketch after this list). This potentially has great implications for the future of encryption.
Quantum leaps – The abrupt movement from one energy level to another at the subatomic level. The energy absorbed or emitted at this level comes in indivisible units called quanta.
Wave-particle duality – All matter exhibits both wave and particle properties. The idea that something can be two things at once is central to quantum computing.
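As a rough illustration of entanglement, the sketch below uses plain NumPy (not a quantum SDK; the variable names are my own) to build the two-qubit Bell state and show that only the correlated outcomes 00 and 11 can ever be observed, so measuring one qubit fixes the outcome of the other.

```python
import numpy as np

# Basis states for one qubit.
zero = np.array([1.0, 0.0])
one  = np.array([0.0, 1.0])

# Bell state (|00> + |11>) / sqrt(2): the two qubits are entangled.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Probabilities of the four two-qubit outcomes 00, 01, 10, 11.
probs = np.abs(bell) ** 2
for label, p in zip(["00", "01", "10", "11"], probs):
    print(f"P({label}) = {p:.2f}")

# Only 00 and 11 ever occur: once the first qubit is measured,
# the second qubit's outcome is determined, however far apart they are.
```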
These findings and thought experiments have led the scientific community to challenge, and in some cases overturn, long-held assumptions of classical physics. Now imagine if computers could leverage quantum physics to do things conventional computers can’t…
Quantum Computing
Wikipedia’s formal definition is:
“Quantum computing is computing using quantum-mechanical phenomena, such as superposition and entanglement.”
The breakthrough that quantum computing achieves by leveraging quantum-mechanical phenomena is the quantum bit (or qubit), the quantum version of a bit: it can represent a zero, a one, or a superposition of both at once – a continuum of weighted combinations of the two.
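One way to picture this: a qubit’s state is a pair of complex weights on 0 and 1 whose squared magnitudes sum to one. The short NumPy sketch below is my own illustration, not tied to any particular quantum SDK, of an equal superposition and the measurement probabilities it implies.

```python
import numpy as np

# A qubit state is a pair of complex amplitudes (alpha, beta)
# with |alpha|**2 + |beta|**2 = 1.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)    # an equal superposition
qubit = np.array([alpha, beta])

# Measuring the qubit yields 0 with probability |alpha|**2
# and 1 with probability |beta|**2.
p0, p1 = np.abs(qubit) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")       # 0.50 each
```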

Instead of working through computations one at a time in sequence, quantum computers can do multiple things at once, and it is estimated that they could be millions of times faster than computers today. Since there are no fully functional quantum computers today (and there likely won’t be for many years or decades), researchers are only confident in one use case so far: factoring numbers into their prime factors at much greater speed. This could break the widely used public-key encryption schemes in place today, as demonstrated by Shor’s algorithm. (For those wondering what happens to cryptonetworks in this case, researchers predict that the technology will also lead to stronger forms of encryption that could be deployed in response.)
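For intuition, the quantum speedup in Shor’s algorithm comes from finding the period of a^x mod N quickly; the rest is classical number theory. The toy Python sketch below is illustrative only: it finds the period by brute force (the step a quantum computer would accelerate) and then shows how that period yields the factors of 15.

```python
from math import gcd

N = 15          # number to factor
a = 7           # a chosen base with gcd(a, N) == 1

# Shor's algorithm uses a quantum circuit to find the period r of
# a**x mod N. Here we find it by brute force to show the classical
# post-processing step.
r = 1
while pow(a, r, N) != 1:
    r += 1
print(f"period r = {r}")                     # r = 4 for a = 7, N = 15

# With an even period, gcd(a**(r//2) +/- 1, N) yields the factors.
if r % 2 == 0:
    f1 = gcd(pow(a, r // 2) - 1, N)
    f2 = gcd(pow(a, r // 2) + 1, N)
    print(f"factors of {N}: {f1} and {f2}")  # 3 and 5
```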
While we are likely far off from seeing quantum computers in action at a large scale, there has been significant progress in the field recently. In the past year, Microsoft unveiled its Quantum Development Kit for building quantum applications, D-Wave Systems released a development kit for hybrid quantum-classical applications, and researchers developed an optical microchip regarded as a necessary component for practical quantum computers.