Affine Invariant Markov chain Monte Carlo (MCMC) Ensemble sampler

I want to highlight something @cluhmann said:

You just build a function that takes parameter values and returns a log posterior probability

As in, you don’t additionally provide the gradient of the log posterior to emcee, as you would have to for HMC or NUTS. That’s the big advantage of the Goodman & Weare sampler that emcee implements. Sometimes computing your likelihood means running some big computer simulation, which isn’t possible to differentiate through. In that case you can’t really use HMC, so emcee is a great choice. It’s popular in astronomy and physics for exactly this reason: lots of big computer simulations.
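To make the "no gradients" point concrete, here's a minimal numpy-only sketch of the Goodman & Weare (2010) stretch move that emcee's ensemble sampler is built on. This is illustrative, not emcee itself: in practice you'd hand the same `log_posterior` function to `emcee.EnsembleSampler`. Note the sampler only ever *calls* `log_posterior`; it never differentiates it, so the target could just as well be a black-box simulation.

```python
import numpy as np

def log_posterior(theta):
    # Toy target: standard 2D Gaussian. In practice this could wrap
    # any black-box simulation -- no gradients are ever required.
    return -0.5 * np.sum(theta ** 2)

def stretch_move_sampler(log_prob, p0, nsteps, a=2.0, seed=0):
    """Bare-bones affine-invariant stretch-move ensemble sampler."""
    rng = np.random.default_rng(seed)
    walkers = np.array(p0, dtype=float)
    nwalkers, ndim = walkers.shape
    logp = np.array([log_prob(w) for w in walkers])
    chain = []
    for _ in range(nsteps):
        for k in range(nwalkers):
            # Pick a complementary walker and propose a stretch toward/past it.
            j = rng.choice([i for i in range(nwalkers) if i != k])
            # Draw z from g(z) ~ 1/sqrt(z) on [1/a, a] via inverse CDF.
            z = (rng.uniform() * (np.sqrt(a) - 1 / np.sqrt(a)) + 1 / np.sqrt(a)) ** 2
            proposal = walkers[j] + z * (walkers[k] - walkers[j])
            new_logp = log_prob(proposal)
            # Affine-invariant acceptance probability: z^(ndim-1) * ratio.
            if np.log(rng.uniform()) < (ndim - 1) * np.log(z) + new_logp - logp[k]:
                walkers[k], logp[k] = proposal, new_logp
        chain.append(walkers.copy())
    return np.array(chain)  # shape (nsteps, nwalkers, ndim)

p0 = np.random.default_rng(1).normal(size=(10, 2))  # 10 walkers in 2D
chain = stretch_move_sampler(log_posterior, p0, nsteps=1000)
flat = chain[200:].reshape(-1, 2)  # discard burn-in, flatten walkers
print(flat.mean(axis=0), flat.std(axis=0))
```

With emcee the equivalent would be roughly `emcee.EnsembleSampler(nwalkers, ndim, log_posterior)` followed by `run_mcmc` — same idea, same gradient-free interface.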

If your model is an AR(1), or some variant of one, it’s definitely differentiable, so I’d bet you’d get good results with HMC/NUTS. I’m very biased of course, but I do think the easiest way would be to implement it in PyMC.
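To see why an AR(1) is differentiable: its conditional log likelihood is a smooth closed-form function of the parameters, so autodiff (and hence HMC/NUTS) applies directly. Here's a hypothetical numpy sketch of that likelihood; in PyMC you wouldn't write this by hand, since it ships an `AR` distribution that NUTS can sample out of the box.

```python
import numpy as np

def ar1_loglik(params, y):
    """Conditional log likelihood of y[t] = c + rho * y[t-1] + eps,
    eps ~ Normal(0, sigma). Every term below is smooth in
    (c, rho, log_sigma), which is what HMC/NUTS needs."""
    c, rho, log_sigma = params
    sigma = np.exp(log_sigma)  # log parameterization keeps sigma > 0
    resid = y[1:] - c - rho * y[:-1]
    n = resid.size
    return (-0.5 * n * np.log(2 * np.pi)
            - n * log_sigma
            - 0.5 * np.sum((resid / sigma) ** 2))

# Simulate an AR(1) with hypothetical true parameters and evaluate.
rng = np.random.default_rng(0)
true_c, true_rho, true_sigma = 0.5, 0.8, 1.0
y = np.zeros(1000)
for t in range(1, y.size):
    y[t] = true_c + true_rho * y[t - 1] + rng.normal(0.0, true_sigma)

print(ar1_loglik([true_c, true_rho, np.log(true_sigma)], y))
```

The likelihood should be much higher at the true parameters than at arbitrary wrong ones, and because it's built from sums, products, and `exp`, a gradient-based sampler can exploit its derivatives.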
