Help navigating implementation of new inference algorithm

Again, I can’t thank you enough; seeing that notebook was invaluable and rapidly got me up to speed on the Aesara stuff I needed to know.

Based on that, I’ve got the skeleton of a package going. I fixed one algorithmic issue with what you had: the $\frac{d}{d\theta} \log \mathcal{P}(x,z\,|\,\theta)$ gradient was picking out the wrong term in the model (it was using only the $\theta$ “node”, but it’s actually the $z$ one that matters). That’s likely why it wasn’t working for you. You can see a demo of the early API here:

https://cosmicmar.com/muse_inference/demo.html#With-PyMC

with the code here: [muse_inference/pymc.py at main · marius311/muse_inference · GitHub](https://github.com/marius311/muse_inference/blob/main/muse_inference/pymc.py)
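
For concreteness, here is a rough sketch of that fixed gradient computation, assuming a PyMC v4-style API. The model is a hypothetical stand-in (not the actual package code), and the logp accessor name has shifted between versions (`model.logp()` in recent ones, `model.logpt` in some earlier ones):

```python
import aesara
import aesara.tensor as at
import numpy as np
import pymc as pm

x_obs = np.random.randn(100)

# Hypothetical stand-in model: hyperparameter theta, latents z whose
# prior scale depends on theta, and observed x.
with pm.Model() as model:
    theta = pm.Normal("theta", 0, 3)
    z = pm.Normal("z", 0, at.exp(theta / 2), size=100)
    x = pm.Normal("x", z, 1, observed=x_obs)

# MUSE needs d/dtheta log P(x, z | theta) with x and z both held fixed,
# so the gradient has to flow through the z prior term (whose density
# depends on theta), not just through the theta "node" itself.
logp = model.logp()                       # joint log-density graph
theta_value = model.rvs_to_values[theta]  # value variable for theta
z_value = model.rvs_to_values[z]          # value variable for z
dlogp_dtheta = at.grad(logp, theta_value)
grad_fn = aesara.function([theta_value, z_value], dlogp_dtheta)
```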

I have a couple of detailed follow-up questions now that I’ve gotten more familiar with things:

  1. Is there a way to get the observed value out of an RV, so as to avoid the user needing the `prob.x = x_obs` line? (The first sketch after this list shows the kind of thing I mean.)
  2. Is there a way to pass a RandomState-type object to the sampling functions to make things reproducible? I figured out how to set `Model.rng_seq` as needed to make `sample_prior_predictive` reproducible (that’s the only sampler I needed); see the second sketch below.
  3. As you can see on that page, the PyMC version is currently 30x slower than NumPy and 5x slower than JAX. I’ve done absolutely no profiling yet, so it may be too early to ask, but does anything glaringly obvious stand out? (My first profiling attempt is sketched below.)
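
On question 1, here is the kind of thing I was hoping for: `model.rvs_to_values` appears to map observed RVs to constants wrapping their data, though I’m not sure this is an intended public API (minimal sketch):

```python
import numpy as np
import pymc as pm

x_obs = np.random.randn(100)

with pm.Model() as model:
    theta = pm.Normal("theta", 0, 3)
    x = pm.Normal("x", theta, 1, observed=x_obs)

# For observed RVs the mapped value variable appears to be a constant
# holding the data, so this recovers x_obs without the user passing it
# in a second time (assumption: this may not be stable public API):
for rv in model.observed_RVs:
    print(rv.name, np.asarray(model.rvs_to_values[rv].data))
```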
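
On question 2, for reference, passing `random_seed` directly to `sample_prior_predictive` also seems to give reproducible draws, which sidesteps touching `Model.rng_seq` by hand (sketch with a toy model):

```python
import numpy as np
import pymc as pm

with pm.Model() as model:
    theta = pm.Normal("theta", 0, 3)
    x = pm.Normal("x", theta, 1, observed=np.random.randn(100))

# Two calls with the same random_seed should return identical draws:
with model:
    draws1 = pm.sample_prior_predictive(samples=100, random_seed=42)
    draws2 = pm.sample_prior_predictive(samples=100, random_seed=42)
```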
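
And on question 3, the first thing I plan to try is Aesara’s built-in profiler; if I understand right, passing `profile=True` to `aesara.function` is the standard (Theano-inherited) mechanism (toy sketch):

```python
import aesara
import aesara.tensor as at
import numpy as np

# Compile a suspect graph with profiling enabled, run it a few times,
# then print the per-Op timing summary:
v = at.dvector("v")
f = aesara.function([v], at.exp(v).sum(), profile=True)
for _ in range(100):
    f(np.random.randn(1000))
f.profile.summary()
```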

There’s still plenty of work left, including the vector-concatenation stuff from your notebook, which I still need to work through.
