Strangely, sometimes adding a totally disconnected node to a hierarchical model causes the sampling to go completely random. For example, the following formulation converges:
with pm.Model() as model:
    x1_value = pm.Bound(pm.Normal, lower=0.0)('bed_value', mu=100.0, sd=30.0, testval=100)
    intercept = pm.Bound(pm.Normal, lower=0.0)('intercept', mu=100.0, sd=20.0, testval=100)
    mu = intercept + x1_value * self.shared_data['x_1']
    variance = pm.HalfNormal('variance', 0.05)
    nu = pm.Uniform('nu', 1, 15)
    # Likelihood (sampling distribution) of observations
    obs = pm.StudentT('obs', mu=mu, sd=variance * mu, nu=nu, observed=self.shared_data['shared_y'])
return model
But simply adding this x2_value parameter breaks the sampling, and the trace simply shows the prior distribution:
with pm.Model() as model:
    x1_value = pm.Bound(pm.Normal, lower=0.0)('bed_value', mu=100.0, sd=30.0, testval=100)
    # New node -- deliberately not used in mu below
    x2_value = pm.Bound(pm.Normal, lower=0.0)('bath_value', mu=50.0, sd=10.0, testval=50)
    intercept = pm.Bound(pm.Normal, lower=0.0)('intercept', mu=100.0, sd=20.0, testval=100)
    mu = intercept + x1_value * self.shared_data['x_1']
    variance = pm.HalfNormal('variance', 0.05)
    nu = pm.Uniform('nu', 1, 15)
    # Likelihood (sampling distribution) of observations
    obs = pm.StudentT('obs', mu=mu, sd=variance * mu, nu=nu, observed=self.shared_data['shared_y'])
return model
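What confuses me is that the new node is completely disconnected: it only contributes its own prior term to the joint log-density, so in principle the conditional posterior over the other parameters shouldn't change at all. Here's a quick scipy sketch of that reasoning (synthetic x and y stand in for my shared data, and I'm ignoring the bound's truncation constant, which doesn't depend on the other parameters):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(1, 5, size=50)                 # stand-in for shared_data['x_1']
y = 100 + 100 * x + rng.normal(0, 5, size=50)  # stand-in for shared_data['shared_y']

def log_likelihood(intercept, x1_value, variance, nu):
    # Same likelihood as the model: StudentT with scale proportional to mu
    mu = intercept + x1_value * x
    return stats.t.logpdf(y, df=nu, loc=mu, scale=variance * mu).sum()

def joint_logp(intercept, x1_value, variance, nu, x2_value):
    lp = log_likelihood(intercept, x1_value, variance, nu)
    lp += stats.norm.logpdf(x1_value, 100.0, 30.0)  # x1 prior
    lp += stats.norm.logpdf(x2_value, 50.0, 10.0)   # x2 prior -- the only term touching x2
    return lp

# Moving x2_value shifts the joint density by exactly its own prior term,
# so the conditional posterior of the connected parameters is untouched.
delta_joint = joint_logp(100, 100, 0.05, 5, 60.0) - joint_logp(100, 100, 0.05, 5, 50.0)
delta_prior = stats.norm.logpdf(60.0, 50.0, 10.0) - stats.norm.logpdf(50.0, 50.0, 10.0)
assert np.isclose(delta_joint, delta_prior)
```

So mathematically the extra node should be inert; the only difference I can see is in the sampler's state, not in the posterior itself.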
The worst part is that removing the x2_value parameter afterwards doesn't fix the issue: running the same code as in the first example no longer works. This makes it really hard to iterate on the model specification. Any ideas as to what's going on?