Hi all,
I’m trying to draw posterior predictive samples for y, which has a Poisson likelihood. I understand how to sample from the posterior Gaussian process, but not the posterior y_pred, as shown below.
Any thoughts on how to do this the PyMC3 way?
import numpy as np
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as model:
    ℓ = pm.Gamma("ℓ", alpha=2, beta=2)
    η = pm.HalfCauchy("η", beta=3)
    cov = η**2 * pm.gp.cov.ExpQuad(1, ℓ)
    gp = pm.gp.Latent(cov_func=cov)
    f = gp.prior("f", X=train_x)
    # squaring f keeps the Poisson rate non-negative;
    # adding a small constant seems to help with numerical stability here
    y_ = pm.Poisson("y", mu=tt.square(f) + 1e-6, observed=train_y)
    tr = pm.sample(20, tune=10, cores=4)
The following fails:

test_x = np.arange(60)[:, None]
with pm.Model() as model:
    f_pred = gp.conditional("f_pred", test_x)
    y_pred = pm.Poisson("y_pred", mu=tt.square(f_pred) + 1e-6)
    pred_samples = pm.sample_posterior_predictive(tr,
                                                  vars=[f_pred, y_pred],
                                                  samples=100)
The hacky way (using NumPy’s Poisson sampler):

test_x = np.arange(60)[:, None]
with pm.Model() as model:
    f_pred = gp.conditional("f_pred", test_x)
    pred_samples = pm.sample_posterior_predictive(tr,
                                                  vars=[f_pred],
                                                  samples=100)
lam = pred_samples["f_pred"]**2 + 1e-6
y_pred = np.random.poisson(lam)
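For what it’s worth, the NumPy fallback does draw one Poisson variate per posterior sample and per test point, since np.random.poisson broadcasts elementwise over the array of rates. A minimal standalone sketch of that step, with a synthetic Gaussian array standing in for pred_samples["f_pred"] (in the real code those draws come from the GP conditional above):

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for pred_samples["f_pred"]: 100 posterior draws at 60 test points.
f_pred = rng.normal(size=(100, 60))
lam = f_pred**2 + 1e-6           # elementwise Poisson rate, shape (100, 60)
y_pred = np.random.poisson(lam)  # one Poisson draw per rate, same shape
print(y_pred.shape)              # (100, 60)
```

The result is a (samples, test points) array of non-negative integer counts, one Poisson draw per rate.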
I’ve included a colab notebook with a toy dataset.