Running the Deutsch-Jozsa algorithm on IBM's Q Experience

In one of the previous posts, we have looked at the basics of the Qiskit package that allows us to create and run quantum algorithms in Python. In this post, we will apply this to model and execute a real quantum algorithm - the Deutsch-Jozsa algorithm. Recall that the Deutsch-Jozsa algorithm is designed to solve …
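
To give an idea of what such a circuit looks like in Qiskit, here is a minimal sketch of the Deutsch-Jozsa circuit for two input qubits; the choice of the balanced oracle f(x) = x_0 and the use of a plain circuit drawing instead of a backend run are illustrative assumptions, not taken from the post.

    # Deutsch-Jozsa sketch: two input qubits plus one ancilla
    from qiskit import QuantumCircuit

    n = 2                           # number of input qubits
    qc = QuantumCircuit(n + 1, n)   # n inputs + 1 ancilla, n classical bits

    qc.x(n)                         # put the ancilla into |1>
    qc.h(range(n + 1))              # Hadamard on all qubits

    qc.cx(0, n)                     # assumed balanced oracle: f(x) = x_0

    qc.h(range(n))                  # undo the Hadamards on the input register
    qc.measure(range(n), range(n))

    print(qc.draw())

Measuring the input register yields the all-zero string exactly when the oracle computes a constant function and a non-zero string for a balanced one; how the circuit is then submitted to a simulator or to the IBM backends depends on the installed Qiskit version.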

Using Python to access IBM's quantum computers

In a previous post, we have looked at IBM's Q Experience and the graphical composer that you can use to build simple circuits and run them on the IBM hardware. Alternatively, the quantum hardware can be addressed using an API and a Python library called Qiskit, which we investigate in this post. Installation and setup …
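
To make the setup concrete, the following is a minimal sketch assuming an installation via pip and a purely local state-vector check that needs no IBM account; the Bell-state example is my own illustration, not from the post.

    # Install Qiskit first, for example:  pip install qiskit
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    # A Bell-state circuit as a quick smoke test of the installation
    qc = QuantumCircuit(2)
    qc.h(0)
    qc.cx(0, 1)

    # Inspect the resulting state locally, without any hardware access
    state = Statevector.from_instruction(qc)
    print(state.probabilities_dict())   # roughly {'00': 0.5, '11': 0.5}

Submitting the same circuit to the real IBM devices additionally requires an API token and the provider package that matches the installed Qiskit release.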

Shor’s quantum factoring algorithm

Until the 1990s, quantum computing seemed to be an interesting theoretical possibility, but it was far from clear whether it could be useful for tackling computationally hard problems with high relevance for practical applications. This changed dramatically in 1994, when the mathematician P. Shor announced a quantum algorithm that could efficiently …

Grover’s algorithm – unstructured search with a quantum computer

In the last post, we have looked at the Deutsch-Jozsa algorithm, which is considered to be the first example of a quantum algorithm that is provably more efficient than any classical algorithm can be. However, the problem solved by the algorithm is rather special. This does, of course, raise the question whether a similar …
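
As a toy illustration of the search, here is a minimal sketch of a single Grover iteration on two qubits with |11> as the marked state; the oracle and the local state-vector check are assumptions made for this example only.

    # One Grover iteration on two qubits, with |11> as the (assumed) marked item
    from qiskit import QuantumCircuit
    from qiskit.quantum_info import Statevector

    qc = QuantumCircuit(2)
    qc.h([0, 1])      # uniform superposition over the four basis states

    qc.cz(0, 1)       # oracle: flip the phase of |11>

    qc.h([0, 1])      # diffusion operator: reflection about the mean
    qc.x([0, 1])
    qc.cz(0, 1)
    qc.x([0, 1])
    qc.h([0, 1])

    # With four items, a single iteration already concentrates essentially all
    # probability on the marked state
    print(Statevector.from_instruction(qc).probabilities_dict())

For a search space of N items with a single marked element, roughly (π/4)·√N such iterations are needed in general.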

The EM algorithm and Gaussian mixture models – part II

In this post, I will discuss the general form of the EM algorithm to obtain a maximum likelihood estimator for a model with latent variables. First, let us describe our model. We suppose that we are given some joint distribution of a random variable X (the observed variables) and a random variable Z (the latent …
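
For orientation, the two steps of the general EM iteration can be summarised in standard notation as follows; the symbols θ for the parameters and θ^(t) for the current estimate are a common convention, not necessarily the notation used in the post.

    \text{E-step:}\quad Q(\theta \mid \theta^{(t)}) = \mathbb{E}_{Z \sim p(z \mid x, \theta^{(t)})}\big[\ln p(x, Z \mid \theta)\big]
    \text{M-step:}\quad \theta^{(t+1)} = \operatorname{arg\,max}_{\theta}\, Q(\theta \mid \theta^{(t)})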

The EM algorithm and Gaussian mixture models – part I

In the last few posts on machine learning, we have looked in detail at restricted Boltzmann machines. RBMs are a prime example of unsupervised learning - they learn a given distribution and are able to extract features from a data set, without the need to label the data upfront. However, there are of course many …