I am trying to model a mixture distribution that has a pretty strict cut-off at -100. Here’s a brief overview of the data (intuitively, there are 3 centers/clusters).
I first add 100 to all my data points so that the cutoff is at 0. The main options I have tried are listed below, with results:
- NormalMixture as a base model: this means the model still allows points below -100 to exist, which is fine for now. Using the same code as Austin Rochford's model, I get extremely slow sampling with NUTS (~1.5 it/s). Parameterizing with sd instead of tau is much faster, but the mu components end up with nearly 0 sd (a possible fix is sketched after the plots).
import numpy as np
import pymc3 as pm

obs = np.load('obs.npy')  # the attached dataset, already shifted by +100

with pm.Model() as model:
    w = pm.Dirichlet('w', np.array([1.] * 3))                   # mixture weights
    mu = pm.Normal('mu', mu=[0., 100., 200.], sd=10., shape=3)  # component means
    sd = pm.Exponential('sd', 0.05, shape=3)                    # component sds
    x_obs = pm.NormalMixture('x_obs', w, mu, sd, observed=obs)
See plots below (blue = sample_ppc, green = data).
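If the problem here is label switching (the three Normal components are exchangeable, so nothing pins a component to a cluster), one standard fix is an ordering constraint on mu. A minimal sketch of the same NormalMixture model, assuming the installed PyMC3 is recent enough to have the ordered transform (pm.distributions.transforms.ordered):

import numpy as np
import pymc3 as pm
import pymc3.distributions.transforms as tr

obs = np.load('obs.npy')  # the attached, shifted dataset

with pm.Model() as model:
    w = pm.Dirichlet('w', np.array([1.] * 3))
    # ordered transform keeps mu[0] < mu[1] < mu[2]; testval must respect that order
    mu = pm.Normal('mu', mu=[0., 100., 200.], sd=10., shape=3,
                   transform=tr.ordered, testval=[0., 100., 200.])
    sd = pm.Exponential('sd', 0.05, shape=3)
    x_obs = pm.NormalMixture('x_obs', w, mu, sd, observed=obs)
    trace = pm.sample(1000, tune=1000)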
- Mixture class: two Normal distributions plus one bounded distribution (e.g. HalfCauchy, Exponential, etc.). The chains don't converge; no image attached, but the trace wanders like a random walk, with beta failing to sample (a possible fix is sketched after the code below).
import numpy as np
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as model:
    w = pm.Dirichlet('w', np.array([1.] * 3))

    # two free Normal components ("duel" and "kill")
    mu_duel = pm.Normal('mu_duel', 100., 30.)
    sigma_duel = pm.Exponential('sd_duel', 0.01)
    mu_kill = pm.Normal('mu_kill', 200., 30.)
    sigma_kill = pm.Exponential('sd_kill', 0.01)

    # scale of the bounded component sitting at the cutoff
    beta = pm.Exponential('beta', 2.)

    duel = pm.Normal.dist(mu_duel, sigma_duel)
    kill = pm.Normal.dist(mu_kill, sigma_kill)
    death = pm.HalfCauchy.dist(beta)

    # hard ordering constraint: log-prob is -inf whenever mu_kill < mu_duel
    order_means_potential = pm.Potential('order_means_potential',
                                         tt.switch(mu_kill - mu_duel < 0, -np.inf, 0))

    x_obs = pm.Mixture('x_obs', w, [death, duel, kill], observed=obs)
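One possible culprit in this version is the -inf potential: it is a hard constraint with no gradient information, which can make NUTS struggle. As a sketch, the same ordering could instead be enforced by stacking the two Normal means into a single vector with the ordered transform (again assuming pm.distributions.transforms.ordered is available); the priors below just mirror the ones above:

import numpy as np
import pymc3 as pm
import pymc3.distributions.transforms as tr

obs = np.load('obs.npy')

with pm.Model() as model:
    w = pm.Dirichlet('w', np.array([1.] * 3))

    # one ordered vector replaces mu_duel/mu_kill and the order_means_potential
    mu = pm.Normal('mu', mu=[100., 200.], sd=30., shape=2,
                   transform=tr.ordered, testval=[100., 200.])
    sigma = pm.Exponential('sigma', 0.01, shape=2)
    beta = pm.Exponential('beta', 2.)

    death = pm.HalfCauchy.dist(beta)        # bounded component at the cutoff
    duel = pm.Normal.dist(mu[0], sigma[0])
    kill = pm.Normal.dist(mu[1], sigma[1])

    x_obs = pm.Mixture('x_obs', w, [death, duel, kill], observed=obs)
    trace = pm.sample(1000, tune=1000)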
Attached is my dataset: obs.npy (16.5 KB)
Is there anything I can do? Should I try a non-marginalized version of the Gaussian mixture?
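For reference, here is roughly what I mean by the non-marginalized version: an explicit latent cluster assignment per observation. The discrete category variable cannot be sampled by NUTS, so pm.sample would assign a Metropolis-family step to it, which I understand often mixes worse than the marginalized NormalMixture:

import numpy as np
import pymc3 as pm

obs = np.load('obs.npy')

with pm.Model() as model:
    w = pm.Dirichlet('w', np.array([1.] * 3))
    mu = pm.Normal('mu', mu=[0., 100., 200.], sd=10., shape=3)
    sd = pm.Exponential('sd', 0.05, shape=3)

    # explicit (non-marginalized) cluster assignment for each observation
    category = pm.Categorical('category', p=w, shape=len(obs))
    x_obs = pm.Normal('x_obs', mu=mu[category], sd=sd[category], observed=obs)

    trace = pm.sample(1000, tune=1000)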