In the previous blog post we looked at what a Mixture Density Network is, with an implementation in TensorFlow. We then used this to learn the distance to galaxies on a simulated data set. In this blog post we'll show an easier way to code up an MDN by combining the power of three Python libraries: Edward, Keras, and TensorFlow.

You are likely familiar with Keras and TensorFlow, so let me tell you a bit about the first one. Edward is a Python library for probabilistic modelling, inference, and criticism. Its goal is to fuse the related areas of Bayesian statistics, machine learning, deep learning, and probabilistic programming. Edward is developed by David Blei's group at Columbia University, with Dustin Tran as its main developer. The example we discuss here is based on an example in the Edward repo that was written by Dustin and myself.

Edward implements many probability distribution functions that are TensorFlow compatible; this makes it attractive to use for MDNs. In the previous blog post we had to roll our own $Beta$ distribution; with Edward this is no longer necessary. Keep in mind that if you want to use Keras and TensorFlow together, as we will in this post, you need to set the backend of Keras to TensorFlow; the Keras documentation explains how to do that.
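As a quick aside, here is a minimal sketch of one way to do this, assuming the multi-backend Keras where the backend is chosen via the `KERAS_BACKEND` environment variable (the alternative is editing the `~/.keras/keras.json` config file); the environment variable must be set before Keras is first imported:

```python
import os

# Select the TensorFlow backend before the first `import keras`.
# The alternative is to set {"backend": "tensorflow"} in ~/.keras/keras.json.
os.environ["KERAS_BACKEND"] = "tensorflow"
```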

Here are all the distributions that are currently implemented in Edward, with more to come:

All of these can be used to build a Mixture Density Network. Let's start by doing the imports.
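To recall why having these distributions built in matters: an MDN is trained by minimising the negative log-likelihood of the data under a mixture, and without a library you end up hand-coding the density yourself. Below is an illustrative NumPy sketch of that loss for a mixture of Gaussians (the function names are my own, not from Edward); with Edward you would instead use its ready-made distribution objects.

```python
import numpy as np

def gaussian_pdf(y, mu, sigma):
    """Density of a normal distribution, evaluated element-wise."""
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def mixture_nll(y, weights, mus, sigmas):
    """Negative log-likelihood of y under a mixture of Gaussians.

    weights, mus, sigmas: arrays of shape (n_components,), weights sum to 1.
    """
    # Weighted sum of component densities, then -log, averaged over samples.
    densities = np.array([w * gaussian_pdf(y, m, s)
                          for w, m, s in zip(weights, mus, sigmas)])
    return -np.mean(np.log(densities.sum(axis=0)))

# Example: a two-component mixture evaluated on a few points.
y = np.array([0.0, 1.0, -1.0])
loss = mixture_nll(y,
                   weights=np.array([0.5, 0.5]),
                   mus=np.array([0.0, 1.0]),
                   sigmas=np.array([1.0, 1.0]))
```

In an MDN the weights, means, and standard deviations are not fixed as above, but are themselves outputs of a neural network conditioned on the input.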