Hello,
I am using PyMC3 to perform Poisson regression and I am getting results that don’t make sense to me. I am trying to model counts of the number of a certain type of “adverse health events” (exacerbations of a disease) based on a number of other covariates. Some of these are factors, while others are continuous. There is also an offset/exposure for the length of observation time for each subject.
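To make the setup concrete, here is a minimal sketch of the kind of data I have (the column names and values are made up for illustration): factors are expanded into dummy columns, continuous covariates are kept as-is, and the exposure is the observation time per subject.

```python
import pandas as pd

# Hypothetical data in the same shape as described: one categorical
# factor, one continuous covariate, and an exposure (observation time)
df = pd.DataFrame({
    'severity': ['mild', 'severe', 'mild', 'moderate'],
    'age': [34.0, 61.0, 45.0, 52.0],
    'years_observed': [1.5, 2.0, 0.5, 3.0],
})

# Factors become dummy columns; continuous covariates stay numeric
dummies = pd.get_dummies(df['severity'], drop_first=True)
covariates = df[['age']]
exposure = df['years_observed'].values

print(list(dummies.columns))  # ['moderate', 'severe']
```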
The issue is that after fitting the model (using NUTS), which shows no major diagnostic issues such as divergences or large R-hat values, my posterior predictive samples produce counts that are roughly half of the observed counts. That said, I did have to raise target_accept to 0.99 to get the model to fit without numerical errors.
Here is the model:
with pm.Model() as model:
    # Weakly informative priors on the intercept and coefficients
    model_intercept = pm.Normal('model_intercept', mu=0, sigma=1000)
    factor_coefs = pm.Normal('factor_coefs', mu=0, sigma=1000, shape=len(dummies.columns))
    covariate_coefs = pm.Normal('covariate_coefs', mu=0, sigma=1000, shape=len(covariates.columns))

    # Linear predictor on the log scale, with log(exposure) as an offset
    mu = pm.Deterministic(
        'mu',
        model_intercept +
        pm.math.dot(dummies, factor_coefs) +
        pm.math.dot(covariates, covariate_coefs) +
        np.log(exposure)
    )

    # Observation-level random effect on the log rate (overdispersion)
    err = pm.HalfCauchy('err', beta=10)
    log_rate = pm.Normal('log_rate', mu=mu, sigma=err, shape=n_obs)

    numex = pm.Poisson('numex', mu=pm.math.exp(log_rate), observed=numex_obs)
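For context on the err/log_rate layer above: since log_rate is Normal, the implied rate exp(log_rate) is lognormal, whose mean is exp(mu + sigma^2/2) rather than exp(mu). A quick Monte Carlo check of that identity (values here are arbitrary, just for illustration):

```python
import numpy as np

# If log_rate ~ Normal(mu, sigma), then exp(log_rate) is lognormal
# with mean exp(mu + sigma**2 / 2), not exp(mu)
rng = np.random.default_rng(1)
mu, sigma = 1.0, 0.8
log_rate = rng.normal(mu, sigma, size=1_000_000)

empirical_mean = np.exp(log_rate).mean()
analytic_mean = np.exp(mu + sigma**2 / 2)
print(empirical_mean, analytic_mean)  # both well above exp(mu) ~ 2.72
```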
I then fit the model with:
with model:
    trace = pm.sample(2000, tune=2000, init='advi', target_accept=0.99)
    ppc = pm.sample_posterior_predictive(trace)
Here is the plot showing the actual observed counts (x-axis) vs the posterior predictive counts (y-axis; averaged over the posterior samples):
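(For reference, by "averaged over the posterior samples" I mean taking the mean over the posterior predictive draws for each observation; ppc['numex'] has shape (n_samples, n_obs). A minimal sketch with synthetic draws standing in for the real PPC array:)

```python
import numpy as np

# Stand-in for ppc['numex']: 4000 posterior predictive draws for
# 3 observations with true rates 2, 5, and 10 (shape: n_samples x n_obs)
rng = np.random.default_rng(0)
ppc_numex = rng.poisson(lam=[2.0, 5.0, 10.0], size=(4000, 3))

# Average over the draws (axis 0) to get one predicted count per
# observation -- these are the values plotted on the y-axis
predicted_counts = ppc_numex.mean(axis=0)
print(predicted_counts.shape)
```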
As you can see, the trend appears roughly correct (and the model coefficients largely make sense), but the predicted counts are biased low to a large extent. I have also tried using a negative binomial distribution for the counts, but the result is essentially the same.
Any suggestions on the model, or reasons why this is happening would be greatly appreciated. Thanks!