How to evaluate log-likelihood density at any theta in parameter space?

Hi all,

I feel stupid because I can’t seem to find how to do such a basic thing, but I’d like to evaluate the log density log(Pr(D, theta)) at a given theta for the model at hand (in which D is the observed data).

From what I understand, this computation is at the heart of what PyMC does when running HMC, which is why I'm surprised I can't find how to compute it.

Motivation: the general context is implementing the Laplace method for approximating the posterior, in particular as a way of estimating the model evidence for model comparison in "complex-observations" situations where information criteria are not directly applicable. I'm already able to compute theta_MAP and the corresponding Hessian; all I'm lacking is a robust way of evaluating log(Pr(D, theta_MAP)). I guess I could use the optres.fun value returned by scipy.optimize.minimize, but that feels like relying on a fragile implementation detail of PyMC.
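For concreteness, the standard Laplace form I have in mind (notation mine) is

$$
\log \Pr(D) \;\approx\; \log \Pr(D, \theta_{\mathrm{MAP}}) + \frac{d}{2}\log(2\pi) - \frac{1}{2}\log\det H,
$$

where d is the dimension of theta and H is the Hessian of -log(Pr(D, theta)) evaluated at theta_MAP.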


If you are using V4: model.compile_logp()(theta) gives you that, where theta is a dict mapping the model's (transformed) variable names to values. The data is baked in, so you don't need to pass it explicitly.
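A minimal sketch with a toy model (hypothetical variable names; adapt to your own model):

```python
import numpy as np
import pymc as pm

# Toy model for illustration only.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=100)

with pm.Model() as model:
    mu = pm.Normal("mu", 0.0, 10.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)

# compile_logp() returns a compiled function of a single point dict;
# the observed data is already part of the graph.
logp_fn = model.compile_logp()

# The point is keyed by the *transformed* free-variable names
# (e.g. the HalfNormal gets a "sigma_log__" value variable).
theta = {"mu": 0.5, "sigma_log__": np.log(2.0)}

# Log joint density at that point, on the transformed scale
# (Jacobian terms included by default; see the jacobian argument of compile_logp).
print(logp_fn(theta))
```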

If you need the data to change across evaluations, you can use pm.MutableData during model construction and call pm.set_data to swap in new data between evaluations, as in the sketch below.
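A sketch of that pattern, again with hypothetical names, assuming pm.MutableData creates a shared variable that pm.set_data can update in place:

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=100)
new_data = rng.normal(loc=0.0, scale=1.0, size=100)

with pm.Model() as model:
    y = pm.MutableData("y", data)          # mutable container for the observations
    mu = pm.Normal("mu", 0.0, 10.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y)

logp_fn = model.compile_logp()
theta = {"mu": 0.5, "sigma_log__": np.log(2.0)}

print(logp_fn(theta))                      # log density with the original data

with model:
    pm.set_data({"y": new_data})           # swap the observations in place

print(logp_fn(theta))                      # same compiled function, new data
```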

It is also possible to create a function where the data is an explicit input, but it's not as straightforward. Let me know if you need a snippet or if the suggestions above suffice.
