Let us consider the following hierarchical Bayesian model:
- w ~ Beta(20, 20)
- K = 6
- a = w * (K - 2) + 1
- b = (1 - w) * (K - 2) + 1
- theta ~ Beta(a, b)
- y ~ Bern(theta)
This is the example from Figure 9.3 of the book Doing Bayesian Data Analysis by John Kruschke (w plays the role of the mode of theta's prior and K its concentration).
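To make the reparameterization concrete (these numbers are my own illustration, using the value of w that I condition on in my question below), the shape parameters implied by a fixed w are:

K = 6
w_example = 0.25
a_example = w_example * (K - 2) + 1        # 0.25 * 4 + 1 = 2.0
b_example = (1 - w_example) * (K - 2) + 1  # 0.75 * 4 + 1 = 4.0
# so, given w = 0.25, theta has a Beta(2, 4) prior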
I use PyMC3 to construct the model as follows:
import numpy as np
import pymc3 as pm
import arviz as az

K = 6
D = np.array([1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0])  # 9 heads, 3 tails

with pm.Model() as mdl:
    w = pm.Beta('w', alpha=20., beta=20.)  # hyperprior on the mode of theta
    a = w * (K - 2) + 1
    b = (1 - w) * (K - 2) + 1
    theta = pm.Beta('theta', alpha=a, beta=b)
    y = pm.Bernoulli('y', p=theta, observed=D)
    trace = pm.sample(10000, tune=3000)
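Sampling runs fine, and I can inspect the marginal posteriors of w and theta from this joint fit with ArviZ (assuming the default MultiTrace return value, so I call it inside the model context):

with mdl:
    print(az.summary(trace, var_names=['w', 'theta']))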
My question is the following:
- How can I compute the conditional posterior distribution p(theta | w = 0.25, D)? In other words, how can I obtain samples from theta | w = 0.25, D? One idea I had is sketched below, but I am not sure it is right.
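My tentative attempt (variable names mdl_cond, w_fixed, trace_cond are my own; this reuses K and D from above) is to rebuild the model with w held fixed at 0.25, so that only theta is sampled:

# Sketch: condition on w = 0.25 by fixing it as a constant and sampling theta only
with pm.Model() as mdl_cond:
    w_fixed = 0.25
    a_c = w_fixed * (K - 2) + 1        # = 2.0
    b_c = (1 - w_fixed) * (K - 2) + 1  # = 4.0
    theta_c = pm.Beta('theta', alpha=a_c, beta=b_c)
    y_c = pm.Bernoulli('y', p=theta_c, observed=D)
    trace_cond = pm.sample(10000, tune=3000)
# Sanity check: by Beta-Bernoulli conjugacy this posterior should be Beta(2 + 9, 4 + 3) = Beta(11, 7).

Is this a correct way to obtain p(theta | w = 0.25, D), or is there a way to get these samples from the trace of the original hierarchical model (for example by keeping only draws where w is close to 0.25)?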