Hello, I’m a newbie when it comes to Bayesian modeling, and I’m trying to get a Poisson mixture model to work. It samples fine, but the trace looks off and the centers aren’t where they appear in the data.
Something must be wrong with my model; can anyone help me diagnose it?
I’ve tried adapting an existing mixture example, but I’m unsure how to port it to a Poisson distribution, so I modified my code to represent a Poisson mixture.
Here is the data generating process:
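Roughly, it’s along these lines (the rates, weights, and sample size here are placeholders rather than the exact values I used):

import numpy as np

rng = np.random.default_rng(42)

true_centers = np.array([10, 50, 120])  # placeholder Poisson rates, one per component
true_p = np.array([0.3, 0.4, 0.3])      # placeholder mixture weights
n = 1000

component = rng.choice(3, size=n, p=true_p)  # which component each observation comes from
o = rng.poisson(true_centers[component])     # observed counts, used as `o` in the models below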
Here is the model:
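In outline it was something like the sketch below (not my exact code); the important detail, as pointed out in the reply, is that I gave the weights p three independent Uniform priors instead of a Dirichlet, so they were not constrained to sum to 1:

import numpy as np
import pymc as pm

with pm.Model() as model_0:
    p = pm.Uniform('p', 0, 1, shape=3)                            # independent weights, not a simplex
    c = pm.Categorical('c', p, shape=o.shape[0])                  # component index per observation
    centers = pm.Uniform('centers', lower=1, upper=140, shape=3)  # Poisson rate of each component
    obs = pm.Poisson('obs', centers[c], observed=o)               # `o` is the data generated above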
Here is the trace:
And here is where it shows the centers to be:
Thanks for the help.
This is what finally solved it:
import numpy as np
import pymc as pm

with pm.Model() as model_1:
    p = pm.Dirichlet('p', np.array([1, 1, 1]), shape=3)           # mixture weights, constrained to sum to 1
    c = pm.Categorical('c', p, shape=o.shape[0])                  # component assignment per observation
    centers = pm.Uniform('centers', lower=1, upper=140, shape=3)  # Poisson rate of each component
    c_i = pm.Deterministic('c_i', centers[c])                     # rate for each observation
    obs = pm.Poisson('obs', c_i, observed=o)
Hi Jacob,
Thanks for your question. Your solution is correct: you have a three-component mixture model, so you need a Dirichlet prior for p rather than three independent uniforms. For others who may encounter this thread, the Dirichlet prior ensures that p.sum() == 1. This is also why you were getting two centers rather than three in your results.
You could also leverage pm.Mixture for mixture problems like this. You wouldn’t necessarily need to define indices (the c variable in the code above), and I believe that sampling would be faster. This is particularly appealing in cases where you’re interested in inference on the parameters, such as p and centers here, rather than the indices. Here is a link to a Gist with the Poisson mixture example I just described: poisson-mixture.ipynb · GitHub
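In sketch form, that marginalized model looks something like this (a rough outline rather than the exact contents of the Gist; o is the observed count data from your post):

import numpy as np
import pymc as pm

with pm.Model() as marginalized_model:
    p = pm.Dirichlet('p', a=np.ones(3))                           # mixture weights
    centers = pm.Uniform('centers', lower=1, upper=140, shape=3)  # Poisson rate of each component
    # pm.Mixture sums over the components internally, so no per-observation index is needed
    obs = pm.Mixture('obs', w=p, comp_dists=pm.Poisson.dist(mu=centers), observed=o)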
Hope this helps!
Larry
Thanks a lot for the speed-up, Larry! I appreciate it. A follow-up question: why aren’t my chains aligned on the converged values? They all converge, and to the correct values, but each chain isn’t “aligned” in the centers trace, which leads to very inaccurate values when taking the mean over the chains (even though the trace shows everything converged correctly).
In the picture below, look at the columns: they seem misaligned. Is this due to the Dirichlet distribution being instantiated every sample?
Is there any way to correct for this?
Cheers!
@larryshamalama can comment, but it looks like you are running into the problem of label switching, which is common in mixture modeling. Check out this thread for some additional info and some solutions (imposing an ordered transformation seems to be the currently recommended strategy).
I tried applying the ordered transformation to both the Dirichlet prior and the centers prior with pm.distributions.transforms.ordered, but I run into initial value errors, even if I give them the initval argument.
EDIT: See my follow-up post for the solution.
Looking quickly, my strongest guess is that an ordered transform would solve the problem. I can have a closer look. I believe that you should only apply the transform to centers, though, not to the Dirichlet prior.
Wow, thanks! I tried again with just the transform on centers, and it required that I input different initvals for each index; it would throw an error otherwise.
Here’s the model that worked, with @larryshamalama’s mixture recommendations:
with pm.Model() as model_4:
    p = pm.Dirichlet('p', np.array([1, 1, 1]))
    # the ordered transform needs distinct, increasing initial values, hence the initval argument
    centers = pm.Uniform('centers', 0, 140, shape=3, transform=pm.distributions.transforms.ordered, initval=[1, 2, 3])
    mixed = pm.Mixture('mixed', w=p, comp_dists=pm.Poisson.dist(centers), observed=x)
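Sampling and summarizing then works as expected; something along these lines (using ArviZ, which ships with PyMC, for the summary):

import arviz as az

with model_4:
    idata = pm.sample()

# With the ordered transform on centers, the components keep a consistent order across
# chains, so posterior means of p and centers are no longer scrambled by label switching.
print(az.summary(idata, var_names=['p', 'centers']))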