tl;dr: I hacked emcee -- The MCMC Hammer ensemble sampler -- to work on PyMC models.

PyMC is an awesome Python module for performing Bayesian inference. It allows for flexible model creation and has basic MCMC samplers like Metropolis-Hastings. The upcoming PyMC3 will feature much fancier samplers like Hamiltonian Monte Carlo (HMC), which are all the rage these days. While those HMC samplers are very powerful for complex models with many dimensions, they also require gradient information about the posterior. Automatic differentiation (as part of Theano) makes computing these gradients very easy for exponential-family models.
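For illustration, here's a minimal sketch of what a PyMC (2.x) model looks like; the toy model and all parameter values are made up for this example:

```python
import numpy as np
import pymc

# Hypothetical toy model: infer the mean of noisy observations.
mu = pymc.Normal('mu', mu=0.0, tau=1.0)  # prior on the mean
obs = pymc.Normal('obs', mu=mu, tau=1.0,
                  value=np.random.randn(100) + 0.5,  # fake data
                  observed=True)

# Sample with PyMC's default step method (Metropolis-Hastings).
mcmc = pymc.MCMC([mu, obs])
mcmc.sample(iter=10000, burn=1000)
print(mcmc.trace('mu')[:].mean())
```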

However, in the real world of Big Science, we often do not have likelihoods from the exponential family. For example, the likelihood function of HDDM -- our toolbox for hierarchical Bayesian estimation of a decision-making model often used in mathematical psychology -- contains an infinite sum and requires numerical integration, which makes it non-differentiable. In many other areas of science, likelihoods result from complex non-linear models that take a while to compute, as described in this recent post on Andrew Gelman's blog.

Largely separate from the hype around HMC that machine learners and statisticians love, astrophysicists have been solving this problem in other ways. Affine-invariant ensemble samplers do not require gradient information; instead, they run many samplers (called "walkers" -- Walking Dead, anyone?) in parallel to cover the space. With expensive likelihoods, parallel sampling (something that's not possible with HMC yet) is a game changer.
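To give a flavor of how the two pieces can be glued together, here's a rough sketch (not the exact code from this hack) that feeds a PyMC model's joint log-probability to emcee; the toy model and tuning numbers are assumptions for the example:

```python
import numpy as np
import pymc
import emcee

# Hypothetical toy model, same shape as above.
mu = pymc.Normal('mu', mu=0.0, tau=1.0)
obs = pymc.Normal('obs', mu=mu, tau=1.0,
                  value=np.random.randn(100) + 0.5, observed=True)
model = pymc.Model([mu, obs])

def lnprob(theta):
    """Set the model's free parameters and return the joint log-probability."""
    try:
        mu.value = theta[0]
        return model.logp
    except pymc.ZeroProbability:
        return -np.inf  # emcee's convention for zero-probability regions

nwalkers, ndim = 50, 1
p0 = np.random.randn(nwalkers, ndim)  # start walkers near the prior mean
sampler = emcee.EnsembleSampler(nwalkers, ndim, lnprob)
sampler.run_mcmc(p0, 1000)
```

Because each walker only needs a scalar log-probability back, the expensive likelihood evaluations can be farmed out across processes (emcee's `EnsembleSampler` accepts a `pool` argument for exactly this).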

For a fun visual depiction of HMC vs. ensemble samplers, see these two videos (make sure to crank your speakers to the max and dance as if the NSA weren't watching):