When I started this blog, my intention was to document my own attempts to learn and understand some of the topics at the intersection of computer science, physics and mathematics that have the potential to change the world in which we work and live. After spending some time with one of these topics – machine learning and artificial intelligence – I have now started to look into one of the most exciting combinations of these subjects – quantum computing.
Roughly speaking, quantum computing refers to leveraging the laws of quantum mechanics – like superposition and entanglement – to build computing devices that operate in a way fundamentally different from traditional computers and allow us to solve traditionally hard computational problems in a reasonable amount of time.
How exactly this works is difficult to explain without referring to nontrivial quantum mechanics and mathematics. It is often said that quantum computing exploits the superposition principle – the fact that the state of a quantum system can be described as a superposition of many different fundamental states at the same time – to perform, in parallel in one step, many calculations that a classical computer can only perform serially. However, we will see that this is a bit of an oversimplification, as it ignores the fact that a measurement destroys this parallelism.
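To make this point a bit more concrete, here is a minimal sketch in plain Python (no quantum library, just the standard library) of a single qubit in an equal superposition of its two basis states. The state is held as two complex amplitudes, and the `measure` function illustrates the caveat above: a measurement yields only one classical bit, sampled according to the squared amplitudes, so the "parallelism" of the superposition is not directly observable.

```python
import random
from collections import Counter

# A single-qubit state is a pair of complex amplitudes (a, b) for the
# basis states |0> and |1>, normalized so that |a|^2 + |b|^2 = 1.
# The equal superposition has a = b = 1/sqrt(2).
amp0 = amp1 = 1 / 2 ** 0.5

def measure(a, b):
    """Measuring collapses the superposition: the result is a single
    classical bit, 0 with probability |a|^2 and 1 with probability |b|^2."""
    return 0 if random.random() < abs(a) ** 2 else 1

random.seed(42)  # fixed seed, for reproducibility of this toy example
counts = Counter(measure(amp0, amp1) for _ in range(10000))
print(counts)  # roughly an even split between 0 and 1
```

Running this repeatedly gives close to a 50/50 split between the outcomes 0 and 1, but each individual run of `measure` returns just one bit – which is why designing quantum algorithms that extract useful answers from a superposition requires more ingenuity than "try everything at once".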
In addition, even though a quantum computer is theoretically able to perform any calculation that a classical computer can do, it is not clear that we will see a drastic speedup for arbitrarily chosen classically hard problems. In fact, as of today, we know of a few problems – like the factorization of large composite numbers into primes – that can be solved efficiently on a quantum computer while being believed to be intractable on traditional hardware, and we know of some areas of application, like simulation, optimization and machine learning, where we have good reasons to expect that quantum computing will provide a massive gain in speed and capacity. But there is strong evidence that not all classically hard problems can be solved efficiently on a quantum computer (see for instance [1]). And sometimes we find a fast quantum algorithm for something that so far could not be done efficiently on a classical computer, only for that algorithm to inspire a classical solution providing the same speedup (the recommendation problem seems to be an example of that fruitful interaction between quantum computing and classical computer science).
So even a fully operational universal quantum computer would not suddenly make classical computing obsolete and perform better in all known instances. Rather, we should expect that, at least in the near future, the main applications of quantum computing will be restricted to a few dedicated fields, with classical computing evolving in parallel (see for instance [2] for a discussion).
This does not make quantum computing less exciting. After all, quantum computing substantially changes the way we think about things like information and algorithms. However, implementing a working quantum computer remains hard. In later posts, we will look at some of the technologies – like quantum error correction, superconducting devices and ion traps – that are currently used by researchers around the globe to build quantum computing devices at scale. We have seen some very exciting progress recently, with e.g. Google announcing its new Bristlecone device with 72 qubits, or IBM making a 5-qubit quantum computer publicly accessible in the cloud as part of their Q Experience program. There are even simulators like Quirk that allow you to put together quantum circuits online and play with the results, and frameworks like Microsoft's quantum development kit or the API and framework developed by 1QBit that support the development and debugging of quantum algorithms before trying them on a real quantum device. So it is definitely a good point in time to get to know some of the basics of quantum computing, covering theory, applications and implementation – and this is the goal of this new series.
Currently, my plans are to cover the following topics in the upcoming posts.

 Fundamentals: qubits and Hilbert spaces
 Quantum gates and unitary transformations
 Grover’s search algorithm
 Quantum Fourier transform
 Shor's algorithm
 The quantum variational eigensolver
 Simulation
 Error correction – basics, fault tolerant quantum computing, stabilizer codes, toric codes and surface codes
 Superconducting quantum devices
 NMR based quantum computers
 Playing with IBM's Q Experience – first steps, the Python API, the Deutsch-Jozsa algorithm and the quantum Fourier transform
 Factoring integers using Shor's algorithm and IBM's Q Experience
So in the next post, I will start by explaining what a qubit is and how a qubit is described in the language of quantum mechanics and mathematics. Until then, I recommend the links below, which provide some selected reading on the applications and limitations of quantum computing as of today.
1. C.H. Bennett, E. Bernstein, G. Brassard, U. Vazirani, Strengths and Weaknesses of Quantum Computing, arXiv:quant-ph/9701001
2. J. Preskill, Quantum Computing in the NISQ era and beyond, arXiv:1801.00862
3. Finding Optimal Arbitrage Opportunities Using a Quantum Annealer, http://www.quantumforquants.org/quantumcomputing/qaarbitrage/
4. The excellent Wikipedia article on quantum machine learning
5. Job One for Quantum Computers: Boost Artificial Intelligence
6. The Era of Quantum Computing Is Here. Outlook: Cloudy