Hi, thanks for the reply.
I am trying to rewrite the model using pm.NormalMixture, following other discussions. However, I find it difficult to handle the fact that observations are "nested" within subjects. I tried several models (only two attached, Model 1 and Model 2), but neither works. Could you be a bit more specific about how to rewrite the model?
Thanks.
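For reference, here is the data layout that both models below assume (a minimal synthetic stand-in; the array names match the code, but the sizes and values are made up):

import numpy as np

n_sbj = 20                                    # number of subjects
n_per = 10                                    # observations per subject
subjects = np.repeat(np.arange(n_sbj), n_per) # maps each observation to its subject
time = np.tile(np.arange(n_per, dtype='float64'), n_sbj)  # one time value per observation
observed = np.random.randn(n_sbj * n_per)     # placeholder responses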
Model 1
import numpy as np
import pymc3 as pm
from theano import shared

k = 2  # number of mixture components

# Time as a column vector so it broadcasts against the (n_obs, k) terms below
x_t = time[:, np.newaxis]
x_time = shared(x_t, broadcastable=(False, True))

with pm.Model() as varying_intercept:
    # Hyperpriors
    mu_a = pm.Normal('mu_a', mu=0., sd=0.1, shape=k)
    sigma_a = pm.HalfCauchy('sigma_a', 5.)
    mu_b = pm.Normal('mu_b', mu=0., sd=0.1, shape=k)
    sigma_b = pm.HalfCauchy('sigma_b', 5.)
    # Random intercepts and slopes, one per subject and per component
    a = pm.Normal('a', mu=mu_a, sd=sigma_a, shape=(n_sbj, k))
    b = pm.Normal('b', mu=mu_b, sd=sigma_b, shape=(n_sbj, k))
    # Model error
    sd_y = pm.HalfCauchy('sd_y', 5.)
    # Expected value under each component, shape (n_obs, k)
    y_hat = a[subjects, :] + b[subjects, :] * x_time
    # Mixture weights and data likelihood
    p = pm.Dirichlet('p', np.ones(k))
    likelihood = pm.NormalMixture('likelihood', p, y_hat, sd=sd_y, observed=observed)
Auto-assigning NUTS sampler…
Initializing NUTS using jitter+adapt_diag…
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [p_stickbreaking__, sd_y_log__, b, a, sigma_b_log__, mu_b, sigma_a_log__, mu_a]
100%|██████████| 6000/6000 [01:26<00:00, 69.28it/s]
The gelman-rubin statistic is larger than 1.4 for some parameters. The sampler did not converge.
The estimated number of effective samples is smaller than 200 for some parameters.
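(Side note: since the k components are exchangeable, the chains can label-switch, which by itself inflates the Gelman-Rubin statistic. Here is a minimal sketch of breaking that symmetry with an ordered transform on the component means, assuming a PyMC3 version that ships pm.distributions.transforms.ordered; the testval only starts the chain at distinct values:)

import numpy as np
import pymc3 as pm

k = 2
with pm.Model() as ordered_means:
    # The ordered transform forces mu_a[0] < mu_a[1], so the components
    # cannot swap labels between (or within) chains.
    mu_a = pm.Normal('mu_a', mu=0., sd=0.1, shape=k,
                     transform=pm.distributions.transforms.ordered,
                     testval=np.array([-0.05, 0.05]))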
Model 2
k = 2  # number of mixture components

x_t = time
x_time = shared(x_t)

with pm.Model() as varying_intercept:
    # Cluster weights
    p = pm.Dirichlet('p', np.ones(k))
    # Component means and spreads for intercept and slope
    mu_a = pm.Normal('mu_a', mu=0., sd=0.1, shape=k)  # intercept
    mu_b = pm.Normal('mu_b', mu=0., sd=0.1, shape=k)  # slope
    sd_a = pm.HalfCauchy('sd_a', 5., shape=k)
    sd_b = pm.HalfCauchy('sd_b', 5., shape=k)
    # One intercept and one slope per subject, each drawn from the mixture
    a = pm.NormalMixture('mix_mu_a', p, mu_a, sd=sd_a, shape=n_sbj)
    b = pm.NormalMixture('mix_mu_b', p, mu_b, sd=sd_b, shape=n_sbj)
    # Expected value
    y_hat = a[subjects] + b[subjects] * x_time
    # Model error
    sd_y = pm.HalfCauchy('sd_y', 5.)
    # Data likelihood
    y_like = pm.Normal('y_like', mu=y_hat, sd=sd_y, observed=observed)
Auto-assigning NUTS sampler…
Initializing NUTS using jitter+adapt_diag…
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [sd_y_log__, mix_mu_b, mix_mu_a, sd_b_log__, sd_a_log__, mu_b, mu_a, p_stickbreaking__]
100%|██████████| 6000/6000 [00:37<00:00, 160.68it/s]
There were 1 divergences after tuning. Increase `target_accept` or reparameterize.
The gelman-rubin statistic is larger than 1.4 for some parameters. The sampler did not converge.
The estimated number of effective samples is smaller than 200 for some parameters.
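What I want to express is that each subject belongs to a single cluster for both its intercept and its slope; as far as I can tell, Model 1 picks a component per observation, and Model 2 mixes a and b independently, so neither encodes this. Below is a sketch of the per-subject marginalization I have in mind, reusing the array names from above (assumptions: Theano's inc_subtensor accumulates over the repeated indices in subjects, and the per-subject random effects are omitted to keep the sketch short):

import numpy as np
import theano.tensor as tt
import pymc3 as pm

k = 2
with pm.Model() as subject_mixture:
    p = pm.Dirichlet('p', np.ones(k))
    mu_a = pm.Normal('mu_a', mu=0., sd=0.1, shape=k)
    mu_b = pm.Normal('mu_b', mu=0., sd=0.1, shape=k)
    sd_y = pm.HalfCauchy('sd_y', 5.)
    # Mean under each cluster, shape (n_obs, k)
    y_hat = mu_a + mu_b * time[:, np.newaxis]
    # Per-observation, per-cluster log density, shape (n_obs, k)
    logp_obs = pm.Normal.dist(mu=y_hat, sd=sd_y).logp(observed[:, np.newaxis])
    # Sum the log densities within each subject, shape (n_sbj, k);
    # inc_subtensor accumulates over the repeated subject indices
    logp_sbj = tt.inc_subtensor(tt.zeros((n_sbj, k))[subjects], logp_obs)
    # Marginalize the cluster assignment once per subject, not per observation
    pm.Potential('y_like', pm.math.logsumexp(tt.log(p) + logp_sbj, axis=1).sum())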