Multinomial mixture where the observed values are themselves mixtures

Hi,

I’m new to PyMC3, and it seemed like a great tool for a problem I’m encountering, but I’m having trouble with the last bit of code.
Briefly, I have two multinomial distributions representing disease models (each disease has the same three objects present in imaging data, but in different frequencies). I would like to model an instance where both diseases are present in the imaging data as a mixture. This would be straightforward if each observation (a patch of the image) came from either disease A or disease B, but it is more realistic that each observation will be a mixture itself (some convolution of diseases A and B).

Currently what I have is below:

Setting up our prior and underlying parameters:

import numpy as np

#Number of image patches
N = 100

#Number of objects in each image patch
objPerPatch = 10

#Initial guess for the fraction of the two diseases present
beta = np.array([0.5, 0.5])

#True frequency of our diseases in image patches
beta_true = np.array([0.7, 0.3])

#Frequency for three objects in each disease
alphas_true12 = np.array([[0, 0.2, 0.8],[0.8, 0.2, 0]])

Simulating the data:

observed_data12 = np.zeros([N, 3], dtype=np.int64)
#For simplicity, each image patch is the same combination of the two
#multinomials and always has the same total object count
for m in range(N):
    #n must be an integer, so round the per-disease object counts
    observed_data1 = np.random.multinomial(int(round(objPerPatch * beta_true[0])), alphas_true12[0], size=1)
    observed_data2 = np.random.multinomial(int(round(objPerPatch * beta_true[1])), alphas_true12[1], size=1)
    observed_data12[m] = observed_data1 + observed_data2
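
With this fixed 7/3 split, each simulated patch has expected counts 7*[0, 0.2, 0.8] + 3*[0.8, 0.2, 0] = [2.4, 2.0, 5.6].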

Creating the model:

import pymc3 as pm
import theano.tensor as T

with pm.Model() as simple_comp_model:
    #Underlying mixing probability we want to infer
    p1 = pm.Uniform('p1', 0, 1)
    p2 = 1 - p1
    p = T.stack([p1, p2])

    #Get the distributions of our two multinomial components
    obj_dist = [pm.Multinomial.dist(p=alphas_true12[i], n=objPerPatch, shape=3) for i in range(2)]

    #*This would be the approach if each image patch were assigned to a single disease
    #disease_mix = pm.Mixture('disease_mix', w=p, comp_dists=obj_dist, observed=observed_data12)

    #Now get the distribution of this mixture? Unclear
    disease_mix = pm.Mixture.dist(w=p, comp_dists=obj_dist)

    #Treat the problem as a mixture of mixtures (no bueno)
    x = pm.Mixture('x', w=[1, 0], comp_dists=disease_mix, observed=observed_data12)

When I attempt this, I get: TypeError: ‘Mixture’ object is not iterable

with simple_comp_model:
    trace = pm.sample(draws=3000, chains=1, tune=2000, discard_tuned_samples=True)

I’ve looked at pm.Potential and pm.DensityDist rather than pm.Mixture, but to no avail. I feel that I’m missing one last line to wrap it up, but I can’t figure it out.

Apologies if this problem has already been answered. I looked through other questions but none seemed to be quite the same.

Are you using the current development master branch or version 3.6?

I’m using version 3.5

Try updating to the current development branch on GitHub. We made many changes that will be released in 3.7, and your problem may already be solved there.

Hi, so I got the latest version but am still getting the same error. I’m not entirely convinced that my code should work as written (even in the updated version), but it’s my closest approximation to what I want to do, which is treat my individual observed values as mixtures.

Could you add your entire traceback? It’s true that your problem doesn’t seem to map well onto a mixture of observeds. As you write down your data generation process, you are adding the two multinomials together. A mixture assumes that you draw a sample from one component or the other with some probability; the result will not be distributed like the sum. To get something like the sum you should use a Potential.
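
To make the distinction concrete, here is a minimal numpy sketch (reusing beta_true, alphas_true12, and objPerPatch from your code above) contrasting a single mixture draw with the summed draw your simulation uses:

import numpy as np

#Mixture draw: the whole patch comes from ONE disease, chosen at random
k = np.random.choice(2, p=beta_true)
mixture_patch = np.random.multinomial(objPerPatch, alphas_true12[k])

#Summed draw (what the simulation does): a fixed 7/3 split, so every
#patch blends both diseases
summed_patch = (np.random.multinomial(7, alphas_true12[0])
                + np.random.multinomial(3, alphas_true12[1]))

The two samples have the same expected counts, but different distributions: the mixture patch only ever contains objects drawn from one disease’s frequency profile.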

Going back to your mixture error, I think that it may be caused by not having supplied the shape of disease_mix, so it may not be a problem with the Mixture class implementation. Nevertheless, I’m curious to figure out why you’re getting an exception, so your full traceback should help.

Sure thing, I’ve copied it below. Also, just to verify, I used the ‘pip install git+https://github.com/pymc-devs/pymc3’ command to get the current development master branch; is this correct? I’m still seeing version 3.6, so I just wanted to make sure.

AttributeError                            Traceback (most recent call last)
/miniconda/lib/python3.6/site-packages/pymc3/distributions/mixture.py in _comp_means(self)
    281         try:
--> 282             return tt.as_tensor_variable(self.comp_dists.mean)
    283         except AttributeError:

AttributeError: 'Mixture' object has no attribute 'mean'

During handling of the above exception, another exception occurred:

TypeError                                 Traceback (most recent call last)
<ipython-input-13-239082351f42> in <module>()
     38 
     39     #Treat problem as mixture of mixture (no bueno)
---> 40     x = pm.Mixture('x',w = [1,0], comp_dists = disease_mix,observed=observed_data12)
     41 
     42 

/miniconda/lib/python3.6/site-packages/pymc3/distributions/distribution.py in __new__(cls, name, *args, **kwargs)
     39                 raise TypeError("observed needs to be data but got: {}".format(type(data)))
     40             total_size = kwargs.pop('total_size', None)
---> 41             dist = cls.dist(*args, **kwargs)
     42             return model.Var(name, dist, data, total_size)
     43         else:

/miniconda/lib/python3.6/site-packages/pymc3/distributions/distribution.py in dist(cls, *args, **kwargs)
     50     def dist(cls, *args, **kwargs):
     51         dist = object.__new__(cls)
---> 52         dist.__init__(*args, **kwargs)
     53         return dist
     54 

/miniconda/lib/python3.6/site-packages/pymc3/distributions/mixture.py in __init__(self, w, comp_dists, *args, **kwargs)
    130 
    131             try:
--> 132                 self.mean = (w * self._comp_means()).sum(axis=-1)
    133 
    134                 if 'mean' not in defaults:

/miniconda/lib/python3.6/site-packages/pymc3/distributions/mixture.py in _comp_means(self)
    283         except AttributeError:
    284             return tt.squeeze(tt.stack([comp_dist.mean
--> 285                                         for comp_dist in self.comp_dists],
    286                                        axis=-1))
    287 

TypeError: 'Mixture' object is not iterable

I also looked at Potential, but it’s a little unclear to me. It doesn’t seem to take observed values as an input. Are they somehow incorporated in an earlier step?
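
For reference, a Potential has no observed argument; the observed values enter as plain constants inside the log-probability expression you pass to it. Below is a minimal sketch, under the assumption that each object in a patch is independently drawn from disease A with probability p1 (note this is a different generative story from the fixed 7/3 split simulated above, not the exact likelihood of that simulation):

import pymc3 as pm
import theano.tensor as T

with pm.Model() as potential_model:
    p1 = pm.Uniform('p1', 0, 1)
    #Mixed object frequencies under the per-object assignment assumption
    p_mix = p1 * alphas_true12[0] + (1 - p1) * alphas_true12[1]
    #Hand-written, unnormalised multinomial log-likelihood; observed_data12
    #enters as a constant rather than through observed=
    pm.Potential('loglike', T.sum(T.log(p_mix) * observed_data12))

The multinomial coefficient is constant with respect to p1, so dropping it does not change the posterior.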

Ah, now I see the problem. The first mixture correctly identifies that its components are discrete, so it does not set the mean; the second mixture, however, does not understand that the first mixture should also be discrete. It assumes the first mixture is continuous, looks for its mean attribute, and fails. Could you raise this as an issue on GitHub?

About the Potential, I spoke too soon. I’m not sure that you can write down observations on the sum of two mixtures (you can’t put observations on deterministic RVs). Maybe @junpenglao can give you some advice on how to approximate what you want.

There are a few posts here on the Discourse about mixtures of mixtures; you can have a look and give them a go.

However, I am still not convinced that you can do inference correctly on this kind of model. You might need something like the semi-supervised approach used in HMMs.
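
One simple variant that may be worth trying: if you assume each object in a patch is independently drawn from disease A (with probability p1) or disease B, the latent per-object assignments marginalise out, and each patch’s counts are themselves multinomial with a mixed probability vector, so observed= can be used directly. A minimal sketch under that assumption (again, not the exact likelihood of the fixed 7/3 split simulated above):

with pm.Model() as marginal_model:
    p1 = pm.Uniform('p1', 0, 1)
    #Per-object mixture: each object comes from disease A with probability p1
    p_mix = p1 * alphas_true12[0] + (1 - p1) * alphas_true12[1]
    #Marginalising the assignments gives an ordinary multinomial likelihood
    counts = pm.Multinomial('counts', n=objPerPatch, p=p_mix,
                            observed=observed_data12)
    trace = pm.sample(2000, tune=1000)

Up to the constant multinomial coefficient, this is the same likelihood as the Potential sketch above.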