Manually update part of the model without writing a custom sampler

I want to use NUTS to sample part A of the model, and update the remaining part B of the model manually and deterministically. Part B contains categorical distributions (https://github.com/pymc-devs/pymc3/issues/1902).

The log-likelihood of A and the log-likelihood of B share some intermediate calculations C. I plan to define custom Theano operations for A, B, and C to avoid evaluating C twice (see: Custom operation for likelihood and cache value of likelihood).
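Roughly, this is the kind of Op I have in mind for the cached part (just a sketch; `compute_C` and `loglike_A_given_C` are placeholders for my actual calculations, and a `grad()` method for NUTS is omitted):

```python
import numpy as np
import theano.tensor as tt


class LogLikeA(tt.Op):
    """Log-likelihood of part A, caching the shared intermediate C.

    compute_C and loglike_A_given_C stand in for my real calculations;
    the same cache could be reused by the Op for part B's log-likelihood.
    A grad() method would also be needed for NUTS; it is omitted here.
    """

    itypes = [tt.dvector]  # parameter vector of part A
    otypes = [tt.dscalar]  # scalar log-likelihood

    def __init__(self, compute_C, loglike_A_given_C):
        self.compute_C = compute_C
        self.loglike_A_given_C = loglike_A_given_C
        self._cached_key = None
        self._cached_C = None

    def perform(self, node, inputs, output_storage):
        (theta,) = inputs
        key = theta.tobytes()
        if key != self._cached_key:
            # only recompute C when the evaluation point actually changes
            self._cached_C = self.compute_C(theta)
            self._cached_key = key
        output_storage[0][0] = np.array(self.loglike_A_given_C(theta, self._cached_C))
```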

I will refer to the Theano operation for the log-likelihood of A as α.

I am thinking about the following:

  1. store the parameters of B in α, and
  2. update the parameters of B within the perform method of α (see the sketch below).
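
Putting those two steps into code, the stateful version of α I am describing would look roughly like this (again only a sketch; `loglike_A` and `update_B_params` are placeholders for my real calculations):

```python
import numpy as np
import theano.tensor as tt


class LogLikeAStateful(tt.Op):
    """α with B's parameters stored inside the Op and updated in perform()."""

    itypes = [tt.dvector]
    otypes = [tt.dscalar]

    def __init__(self, loglike_A, update_B_params, init_B_params):
        self.loglike_A = loglike_A
        self.update_B_params = update_B_params
        self.B_params = init_B_params  # mutable state kept on the Op instance

    def perform(self, node, inputs, output_storage):
        (theta,) = inputs
        # side effect: every evaluation of α rewrites B's parameters,
        # which is why α stops being a pure function
        self.B_params = self.update_B_params(theta, self.B_params)
        output_storage[0][0] = np.array(self.loglike_A(theta, self.B_params))
```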

But that means α is not a pure function. The following two cases give different results:

  1. Evaluating α at point x first and then at point y
  2. Evaluating α at point y first and then at point x

Would that cause any trouble if I use NUTS?

Within a single step of NUTS, how many times would the perform method of α be called for the same point? Can I still avoid evaluating C twice at the same point?
Does NUTS use the same instance of α across all chains?

Thanks again.

It shouldn't be a problem if you write the update as a step method, since in the compound step the updates are sequential and always applied in the same order. You can have a look at this notebook where I implemented the Laplace approximation so you can do approximation + sampling:

Another example is doing EM: https://github.com/junpenglao/Planet_Sakaar_Data_Science/blob/master/Ports/Inferencing%20Linear%20Mixed%20Model%20with%20EM.ipynb
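
In outline, a deterministic update written as its own step method could look something like this (an untested sketch; `DeterministicBStep`, `update_rule`, and `my_update_rule` are made-up names, and the `BlockedStep` constructor handling can differ slightly between pymc3 versions):

```python
import numpy as np
import pymc3 as pm
from pymc3.step_methods.arraystep import BlockedStep


class DeterministicBStep(BlockedStep):
    """Updates the part-B variables deterministically, once per iteration,
    as one component of the compound step (after the NUTS update of part A)."""

    def __init__(self, vars, update_rule=None, **kwargs):
        self.vars = list(vars)        # the part-B free variables this step owns
        self.update_rule = update_rule

    def step(self, point):
        new_point = dict(point)
        for var in self.vars:
            # update_rule maps the current point to the new value of this variable
            new_point[var.name] = np.asarray(self.update_rule(new_point, var.name))
        return new_point


# usage sketch, assuming a model with continuous part `a` and discrete part `b`:
# with model:
#     step_a = pm.NUTS(vars=[a])
#     step_b = DeterministicBStep([b], update_rule=my_update_rule)
#     trace = pm.sample(step=[step_a, step_b])
```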


Thank you very much!

I can see the notebook for EM.

I found the notebook for the Laplace approximation: https://github.com/junpenglao/Planet_Sakaar_Data_Science/blob/master/Ports/Laplace%20approximation%20in%20pymc3.ipynb