Hi Zweli,
thanks for your support!
The convergence is indeed very good.
And I am surprised that the b_ij_binary version still converges well. I only need to re-calculate the sample weights via b_ij_binary and c (rough sketch below).
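To spell out what I mean by re-calculating the sample weights, here is a rough sketch. The names `c_samples` and the indicator-matrix product are my assumptions about how b_ij_binary and c combine, so adjust to your actual setup:

```python
import numpy as np

# Rough sketch (assumption): b_ij_binary is an (n_obs, n_groups) 0/1 indicator
# matrix and c_samples holds posterior draws of c with shape (n_draws, n_groups).
# The per-observation weight is then the group effect picked out by the indicator.
weights = np.einsum("og,dg->do", b_ij_binary, c_samples)  # (n_draws, n_obs)
```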
Another approach I identified is via the ZeroSumNormal, and that works decently well.
Benefit: I can directly see the means in the sampled parameters. However, that does not work for the standard deviations, which are defined for the ZSN as
$$
\mathcal{N}\!\Big(0,\ \sigma^2\big(I - \tfrac{1}{n}J\big)\Big), \quad \text{where } J_{ij} = 1 \ \text{and} \ n = \text{number of zero-sum axes}
$$
```python
import pymc as pm

# constrained model with reparametrization and zero-sum normal distribution
with pm.Model() as constrained_model_rp_zs:
    # input data (xdata, bdata and the number of groups n are defined earlier;
    # bdata must hold integer group indices)
    x = pm.Data("x", xdata, dims="obs_id")
    b = pm.Data("b", bdata, dims="obs_id")

    # baseline
    mu_a = pm.Normal("mu_a", mu=0.0, sigma=10)
    sigma_a = pm.Exponential("sigma_a", 1)

    # level 1
    sigma_c = pm.Exponential("sigma_c", 1, shape=n)

    # reparametrization is needed since the ZSN has the same sigma for all
    # dimensions: sample a unit-scale ZSN and rescale each group separately
    c_dist = pm.ZeroSumNormal("c_dist", sigma=1, shape=n)
    c = pm.Deterministic("c", sigma_c * c_dist)

    y_hat = mu_a + c[b]

    # likelihood
    value = pm.Normal("value", mu=y_hat, sigma=sigma_a, observed=x)
```
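For completeness, this is roughly how I then read off the group means directly from the sampled parameters (just the usual PyMC/ArviZ workflow, nothing specific to this model):

```python
import arviz as az

with constrained_model_rp_zs:
    idata = pm.sample()

# the group-level means are visible directly in the draws of c
az.summary(idata, var_names=["mu_a", "c", "sigma_c"])
```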
