pm.sample_ppc breaks with AttributeError

If you are going to reuse it later on, wrapping the logp and the random method into a new distribution might be worth the effort. Otherwise I would just write the logp function, and do PPC by selecting points from the trace and generating random samples.
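
For the first route, here is a minimal, untested sketch of such a wrapper (the class name PoissonMixture is mine, and I assume w and mu of shape (K,); in your model mu also varies per observation, so you would need to adapt the broadcasting). Once random is defined, pm.sample_ppc has something to call:

import numpy as np
import pymc3 as pm
import theano.tensor as tt
from pymc3.math import logsumexp
from pymc3.distributions.distribution import draw_values

class PoissonMixture(pm.Discrete):
    def __init__(self, w, mu, *args, **kwargs):
        super(PoissonMixture, self).__init__(*args, **kwargs)
        self.w = tt.as_tensor_variable(w)
        self.mu = tt.as_tensor_variable(mu)
        self.mode = tt.as_tensor_variable(0)

    def logp(self, value):
        # marginal mixture logp: log sum_k w_k * Poisson(mu_k).pmf(value)
        value = tt.shape_padright(value)  # broadcast against the K components
        return logsumexp(tt.log(self.w) + pm.Poisson.dist(self.mu).logp(value),
                         axis=-1)

    def random(self, point=None, size=None):
        # pick a component per draw, then sample from its Poisson
        w, mu = draw_values([self.w, self.mu], point=point)
        comp = np.random.choice(len(w), size=size, p=w)
        return np.random.poisson(mu[comp])

# usage: PoissonMixture('x2', w, mu, observed=x2obs) inside the model,
# after which pm.sample_ppc(x2trace, model=x2model) can call random()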

Also some tips:

  • Mixture models are difficult to sample; check out the related posts here on Discourse, and also my gist, for more information on how to model them properly
  • try NUTS when possible; looking at the trace from the notebook, I don't think Metropolis converged
  • n_init is probably not doing what you intend it to do; you should control the number of tuning samples using the tune parameter (see the sketch right after this list)
  • in your example, the boundPoisson is not really doing anything, as the observed data is already bounded between 1 and 70.
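
For the second and third points, a minimal sketch of the sampling call (the numbers are placeholders, not a recommendation):

with x2model:
    # NUTS is assigned by default to the continuous variables;
    # tune (not n_init) controls how many tuning samples are
    # drawn and discarded before the kept draws
    x2trace = pm.sample(draws=2000, tune=2000)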

So what I would do is something like:

import numpy as np
import pymc3 as pm
import theano.tensor as tt
from pymc3.math import logsumexp

# K, stick_breaking, x1_shift, and x2obs are as defined in your notebook
with pm.Model() as x2model:
    alpha = pm.Gamma('alpha', 1, 1)
    beta = pm.Beta('beta', 1, alpha, shape=K)
    w = pm.Deterministic('w', stick_breaking(beta))
    eta = pm.Uniform('eta', 0, 20, shape=K)
    delta = pm.Uniform('delta', 0, 10., shape=K)
    mu = pm.Deterministic('mu', eta + delta * x1_shift)
    # use a potential to add the marginal mixture logp to the model,
    # with logsumexp summing over the K components
    pm.Potential('logp', logsumexp(tt.log(w) +
                                   pm.Poisson.dist(mu).logp(x2obs),
                                   axis=-1).sum())
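
The Potential here is just the mixture likelihood computed in log space; a quick numpy check with made-up weights and rates shows the two forms agree:

import numpy as np
from scipy.special import logsumexp as np_logsumexp
from scipy.stats import poisson

w_, mu_ = np.array([.3, .7]), np.array([2., 10.])
x = 4
np.log(np.sum(w_ * poisson.pmf(x, mu_)))           # direct mixture pmf
np_logsumexp(np.log(w_) + poisson.logpmf(x, mu_))  # logsumexp form, same value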

After you do the inference, the posterior predictive check:

import numpy as np
from scipy.stats import poisson

wpost = x2trace['w']    # shape (ndraws, K)
mupost = x2trace['mu']  # shape (ndraws, len(x2obs), K)
ndraws = wpost.shape[0]
nppc = 100
ppc = np.zeros((nppc, len(x2obs)))
for n in range(nppc):
    # pick a random posterior draw
    idx = np.random.randint(ndraws)
    # assign each observation to a mixture component
    pchoice = np.random.choice(K, len(x2obs), p=wpost[idx, :])
    # sample from the Poisson of the chosen component
    ppc[n, :] = poisson.rvs(mupost[idx, :, :])[range(len(x2obs)), pchoice]
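
To eyeball the result, you can overlay the PPC draws on the observed data (matplotlib is my choice here, not from your notebook):

import matplotlib.pyplot as plt

plt.hist(x2obs, bins=30, density=True, alpha=.5, label='observed')
plt.hist(ppc.ravel(), bins=30, density=True, alpha=.5, label='posterior predictive')
plt.legend()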