How to save logp values during sampling?

I asked a similar question here, but this time I’d like to know if there’s a way to add the logp values of accepted points to the trace as we sample, so that we don’t have to evaluate them twice.

You are asking some great questions :wink: These should go into our docs, I think…
There are two ways to do it. The easiest is to create a Deterministic RV to save the logp; this is what we currently do in SMC:

A more complicated way is to put the logp into the sampler statistics of the trace; there is a discussion here:


Interesting thread! Is there any way to save the logp for each data point as an array? So that each element becomes a vector instead of a scalar?

Depending on how you define the logp for each data point (i.e., element-wise logp), you can compute it using the logprob function of the observed conditioned on the posterior using:

Thanks for your help. Additionally, is it possible to use this capability with a new set of observed values (y)?
I want to make a dataframe with the following columns: [parameter 1, parameter 2, Y, (log)likelihood]
This will allow me to plot multiple paths of the PDF of the distribution in a Bayesian way. I could not find this functionality in the package. Thanks!

I don’t think there is a built-in function for that. I would probably extract the parameters from the trace by hand and compute the logp conditioned on the new observation by hand.
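Following that by-hand approach, a small self-contained sketch (the posterior draws below are simulated stand-ins for what you would extract from a trace, and the Normal likelihood is an assumption):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Stand-ins for posterior draws extracted from a trace,
# e.g. mu_draws = trace["mu"]; sigma_draws = trace["sigma"].
mu_draws = rng.normal(0.0, 0.1, size=500)
sigma_draws = np.abs(rng.normal(1.0, 0.05, size=500))

y_new = np.array([0.2, -0.1, 0.4])  # new observations

# Log-likelihood of the new data under each posterior draw:
# sum the element-wise Normal logpdf over the data points.
logp = stats.norm.logpdf(y_new[None, :],
                         loc=mu_draws[:, None],
                         scale=sigma_draws[:, None]).sum(axis=1)

print(logp.shape)  # (500,): one log-likelihood per draw
```

Each logp entry then pairs with one row of parameter draws, which is exactly the [parameter 1, parameter 2, Y, (log)likelihood] dataframe described above.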


I know this is an old issue, but I thought I’d share a solution to compute the logp of a set of new observations. You can use the same trick as above and define a Deterministic RV for the logp. Then use sample_posterior_predictive. Assuming you’ve already produced a posterior sample trace, and y_future is your new observed variable, use the following code:

with model:
    # Total logp of the new observed variable, saved as a Deterministic.
    llk = pm.Deterministic('llk', y_future.logpt)
    # Note: newer PyMC3 releases take var_names=['llk'] instead of vars=[llk].
    logp = pm.sample_posterior_predictive(trace, vars=[llk], keep_size=True)

Setting keep_size=True computes the logp for each sample of the trace and retains the (n_chains, n_samples) shape.
