Hello, I am having difficulty implementing a mixture model (of two distributions). On each of n trials I have 1000 observations. For each trial, I want the model to assume that the observations come from a mixture of a von Mises distribution and a uniform distribution. However, the mu of the von Mises distribution must come from one of four possible uniform priors, and which prior is selected varies by trial.
Below is my attempt at using a categorical distribution to determine which of the four priors to use, but I run into problems indexing mus in this example. ("idx" is a 1d array of length 1000 × ntrials that specifies which trial each observation belongs to, and "X" is a 1d array of observations that is also of length 1000 × ntrials.)
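For concreteness, here is a minimal sketch of how idx and X are laid out (placeholder values only, assuming 1000 observations per trial as described above; the real data are loaded elsewhere):

import numpy as np

ntrials = 20    # hypothetical number of trials
nobs = 1000     # observations per trial

# idx[k] gives the trial that observation k belongs to
idx = np.repeat(np.arange(ntrials), nobs)                    # shape (1000 * ntrials,)
# X holds all observations in one flat vector, e.g. angles in [-pi, pi)
X = np.random.uniform(-np.pi, np.pi, size=ntrials * nobs)    # placeholder data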
with pm.Model() as model:
    # Four candidate uniform priors on mu, each covering one quarter of [-pi, pi)
    mus = [pm.Uniform('mu_%d' % i,
                      lower=(2 * np.pi) / 4 * i - np.pi,
                      upper=((2 * np.pi) / 4 * i + (2 * np.pi) / 4) - np.pi)
           for i in range(4)]

    # Per-trial probabilities over the four priors, and the prior selected on each trial
    mu_p = pm.Dirichlet('mu_p', a=np.ones(4), shape=(ntrials, 4))
    cat = pm.Categorical('cat', p=mu_p, shape=ntrials)

    # Per-trial mixture weights between the von Mises and uniform components
    p = pm.Dirichlet('p', a=np.ones(2), shape=(ntrials, 2))

    dists = [
        pm.VonMises.dist(mu=mus[cat[idx]], kappa=np.ones(1000)),  # <- this indexing is the problem
        pm.Uniform.dist(lower=np.full(1000, -np.pi), upper=np.full(1000, np.pi)),
    ]
    y = pm.Mixture('y', w=p, comp_dists=dists, observed=X)
Any help is greatly appreciated. I have already looked at other posts on the forum and tried approaches such as marginalization, but I keep running into problems.
Edit: the problem is the mus[cat[idx]] part; it raises "TypeError: index must be integers."
A VonMises + Uniform mixture is pretty unidentifiable (https://twitter.com/junpenglao/status/968492643444502528?s=20), and with another mixture on top of the VonMises you would probably have a pretty difficult time fitting the model.
Thanks for the reference! For the sake of my question it's not specific to the von Mises, though; for instance, I can swap the von Mises out for a Normal distribution (this code is just an example to help convey the question):
with pm.Model() as model:
    # Four candidate uniform priors on mu, one per unit interval
    mus = [pm.Uniform('mu_%d' % i,
                      lower=i,
                      upper=i + 1)
           for i in range(4)]

    # Per-trial probabilities over the four priors, and the prior selected on each trial
    mu_p = pm.Dirichlet('mu_p', a=np.ones(4), shape=(ntrials, 4))
    cat = pm.Categorical('cat', p=mu_p, shape=ntrials)

    # Per-trial mixture weights between the Normal and uniform components
    p = pm.Dirichlet('p', a=np.ones(2), shape=(ntrials, 2))

    dists = [
        pm.Normal.dist(mu=mus[cat[idx]], sigma=np.ones(1000)),  # <- this indexing is the problem
        pm.Uniform.dist(lower=np.full(1000, -np.pi), upper=np.full(1000, np.pi)),
    ]
    y = pm.Mixture('y', w=p, comp_dists=dists, observed=X)
I cannot figure out how to make the Normal distribution's mu depend on the output of the categorical distribution; mus[cat[idx]] raises "TypeError: index must be integers."
What is the dtype of idx? Check cat[idx] to see whether it has the intended shape.
Ah, you’re right! idx was the wrong shape. But after fixing that I get the error “TypeError: list indices must be integers or slices, not TensorVariable”.
Sorry, I think my initial question should have been more straightforward: how do I take a list of random variables (here, "mus") and index that list using a tensor variable (here, "cat")? Thanks so much for your time, by the way; I truly appreciate it.
Casting mus to a theano tensor would work, but a better practice is to do:
mus = pm.Uniform('mus', lower=np.asarray([0, 1, 2, 3]), upper=np.asarray([1, 2, 3, 4]))
Casting to a theano tensor seems to work, hooray! Your suggested alternative does not, though; it gives the error "TypeError: For compute_test_value, one input test value does not have the requested type. The error when converting the test value to that variable type: Wrong number of dimensions: expected 0, got 1 with shape (4,)."
For reference to anyone who might come across this post:
import theano.tensor as tt

mus_tensor = tt.real(mus)    # casting the Python list of RVs to a theano tensor
print(mus_tensor[cat[0]])    # works! :)
print(mus[cat[0]])           # errors :(
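As an aside (not from the thread), tt.stack or tt.as_tensor_variable is another way to turn a list of scalar random variables into a single indexable vector; a minimal sketch, reusing the mus, cat, and idx names from above:

import theano.tensor as tt

mus_vec = tt.stack(mus)          # length-4 vector built from the four scalar RVs
trial_mu = mus_vec[cat[idx]]     # one mu per observation, selected by each trial's category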
Thanks so much, Junpeng; you saved me from days of confusion.
You are welcome. For reference, you might need to specify the shape:
mus = pm.Uniform('mus',
                 lower=np.asarray([0, 1, 2, 3]),
                 upper=np.asarray([1, 2, 3, 4]),
                 shape=4)
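Putting the pieces together, here is a hedged sketch of how the Normal-based example from earlier in the thread might look with a single vectorized mus indexed by the categorical variable (ntrials, idx, and X are assumed to be defined as in the original question; the per-observation weights w=p[idx] and the exact shapes are my assumptions and may need adjusting for your PyMC3 version):

import numpy as np
import pymc3 as pm

nobs_total = len(X)    # 1000 * ntrials

with pm.Model() as model:
    # One vector RV holding all four candidate means (unit-interval priors as above)
    mus = pm.Uniform('mus',
                     lower=np.asarray([0, 1, 2, 3]),
                     upper=np.asarray([1, 2, 3, 4]),
                     shape=4)

    # Per-trial probabilities over the four priors, and the prior selected on each trial
    mu_p = pm.Dirichlet('mu_p', a=np.ones(4), shape=(ntrials, 4))
    cat = pm.Categorical('cat', p=mu_p, shape=ntrials)

    # Per-trial weights between the Normal and uniform mixture components
    p = pm.Dirichlet('p', a=np.ones(2), shape=(ntrials, 2))

    dists = [
        pm.Normal.dist(mu=mus[cat[idx]], sigma=np.ones(nobs_total)),
        pm.Uniform.dist(lower=np.full(nobs_total, -np.pi), upper=np.full(nobs_total, np.pi)),
    ]
    # w=p[idx] broadcasts the trial-level weights to one row per observation.
    # Note: keeping cat as a discrete latent means NUTS cannot sample it; in practice,
    # marginalizing it out (as mentioned above) is often preferable.
    y = pm.Mixture('y', w=p[idx], comp_dists=dists, observed=X)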
Is this only the case if the estimated VonMises is rather wide (in which case it approaches a uniform distribution), or is it true more generally?
My take is that it is true in general.
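To illustrate the wide-von-Mises part of the question, here is a small sketch (not from the thread) comparing a von Mises density with decreasing kappa against the circular uniform density; as kappa shrinks the two coincide, which is one way the mixture weight becomes hard to identify:

import numpy as np
from scipy import stats

theta = np.linspace(-np.pi, np.pi, 7)
uniform_density = 1.0 / (2 * np.pi)    # circular uniform on [-pi, pi)

for kappa in (2.0, 0.5, 0.05):
    vm_density = stats.vonmises.pdf(theta, kappa)    # von Mises with mu = 0
    max_gap = np.max(np.abs(vm_density - uniform_density))
    print(f"kappa={kappa:>4}: max |vonmises - uniform| = {max_gap:.4f}")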