Is reversible jump (RJMCMC) feasible?

I would like to construct a reversible jump MCMC sampler to account for variation in dimensions. An example use case would be a clustering algorithm where the number of clusters is unknown. Is this feasible in PyMC3? So far I have only found references to this in a closed GitHub issue and a very old question on a Google group.

It depends on your model setup, but in general I think this is not easy to do with a static graph, as you cannot add or remove nodes once the graph is compiled.

You can have a look at the Dirichlet process mixtures for density estimation doc, where the number of components is unknown (but truncated).
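To make the truncation idea concrete, here is a minimal plain-NumPy sketch of the truncated stick-breaking construction that underlies such a model (the truncation level `K` and concentration `alpha` are arbitrary illustrative choices, not values from the linked doc):

```python
import numpy as np

def stick_breaking_weights(alpha, K, rng):
    """Truncated stick-breaking construction of DP mixture weights.

    Draw K Beta(1, alpha) "stick fractions" and convert them to
    mixture weights. The last fraction is set to 1 so the final
    component absorbs the leftover stick and the K weights sum to 1.
    """
    betas = rng.beta(1.0, alpha, size=K)
    betas[-1] = 1.0  # absorb the remaining mass into the last component
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(0)
w = stick_breaking_weights(alpha=2.0, K=20, rng=rng)
```

With a large enough `K`, components beyond the "effective" number of clusters simply get negligible weight, which is how the truncated model sidesteps the trans-dimensional jump.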

You can assume some minimum and maximum dimension and thus have an underlying fine discretization; these would also be the values stored in the chains. Then you drag along an additional discrete index variable that points to the currently active dimension. I am currently working on this as well and will let you know once it is working.

Another problem, however, is indeed the sampling algorithm. I think at the moment there is no sampler implemented that can efficiently sample these kinds of problems. The gradient-based samplers do not work because no gradient is available, and the proposal distribution for standard Metropolis or SMC is somewhat problematic; standard Metropolis will fail and get stuck anyway in a complex objective function. You may want to give the new DEMetropolis sampler a try. An RJMCMC implementation would indeed be a nice contribution!
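The padding-plus-index-variable trick described above can be sketched in plain NumPy, outside of PyMC3. This toy example (polynomial regression with unknown degree, a naive random-walk Metropolis sampler, and a pseudo-prior on the inactive coefficients) is only an illustration of the parameterization, not true RJMCMC with dimension-matching Jacobians; all names and settings here are made up for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: a straight line with noise, so small dimensions should fit well.
x = np.linspace(-1.0, 1.0, 50)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, size=x.size)

K_MAX = 5  # maximum dimension; theta is always stored at this fixed length

def log_post(k, theta):
    """Log-posterior using only the first k entries of the padded theta.

    A standard-normal "pseudo-prior" is kept on ALL K_MAX entries so the
    inactive coefficients stay proper instead of drifting freely.
    """
    pred = np.polynomial.polynomial.polyval(x, theta[:k])
    loglik = -0.5 * np.sum((y - pred) ** 2) / 0.1**2
    logprior = -0.5 * np.sum(theta**2)
    return loglik + logprior

# Naive Metropolis over (k, theta): jitter theta, move k by -1, 0, or +1.
k, theta = 1, np.zeros(K_MAX)
lp = log_post(k, theta)
samples_k = []
for _ in range(5000):
    k_prop = int(k + rng.choice([-1, 0, 1]))
    theta_prop = theta + rng.normal(0.0, 0.05, size=K_MAX)
    if 1 <= k_prop <= K_MAX:
        lp_prop = log_post(k_prop, theta_prop)
    else:
        lp_prop = -np.inf  # out-of-range dimension: always rejected
    if np.log(rng.uniform()) < lp_prop - lp:
        k, theta, lp = k_prop, theta_prop, lp_prop
    samples_k.append(k)
```

The chain stores the full `K_MAX`-length `theta` at every step, exactly as described above, and `samples_k` is the extra discrete index variable; a real implementation would need a much better proposal than this random walk.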


I am keen to find an implementation of reversible jump in Python. Can anyone tell me whether any progress on this has been made since this year-old thread?
