How to use a DensityDist in a Mixture?

I’m having a hard time creating a Mixture of DensityDist.

To keep the code example short, I will create a very basic DensityDist that is just a wrapper around a Normal (Gaussian) distribution.

import numpy as np
import pymc3 as pm

d1 = np.random.normal(loc=1, scale=1, size=1000)  # toy data set

with pm.Model() as model:
    mu = pm.HalfNormal('mu', sd=1, shape=1)
    sd = pm.HalfNormal('sd', sd=1, shape=1)
    g = pm.Normal.dist(mu=mu, sd=sd, shape=1)
    obs = pm.DensityDist('obs', logp=g.logp, random=g.random, observed=d1)

This works as expected: mu and sd both converge toward 1 (matching the parameters of the toy data set).
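For context, DensityDist just takes an arbitrary logp function, so wrapping a Normal amounts to handing it the Gaussian log-density. A quick NumPy sketch of that log-density (my own illustration, independent of PyMC3), checked against SciPy:

```python
import numpy as np
from scipy.stats import norm

def normal_logp(x, mu, sd):
    # log N(x | mu, sd) written out by hand -- this is the kind of
    # function one would normally pass to pm.DensityDist as `logp`
    return -0.5 * np.log(2 * np.pi) - np.log(sd) - 0.5 * ((x - mu) / sd) ** 2

x = np.linspace(-2, 4, 7)
assert np.allclose(normal_logp(x, mu=1.0, sd=1.0),
                   norm.logpdf(x, loc=1.0, scale=1.0))
```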

Now I try to build a mixture of such DensityDists. I add a dimension (I increase the shape parameter), add the mixture-creation code, and make the mixture the observed variable:

# toy dataset
d1 = np.random.normal(loc=1, scale=1, size=1000)
d2 = np.random.normal(loc=2, scale=2, size=1000)
data = np.concatenate([d1, d2])  # a bimodal toy data set

import matplotlib.pyplot as plt

with pm.Model() as model:
    mu = pm.HalfNormal('mu', sd=1, shape=2)
    sd = pm.HalfNormal('sd', sd=1, shape=2)
    g = pm.Normal.dist(mu=mu, sd=sd, shape=2)
    comp = pm.DensityDist('comp', logp=g.logp, random=g.random, shape=2)
    w = pm.Dirichlet('w', a=np.array([1, 1]))
    mix = pm.Mixture('mix', w=w, comp_dists=comp, observed=data)

    trace = pm.sample(draws=1000, tune=1000, chains=1)
    pm.traceplot(trace)
    plt.show()

This outputs an error message pointing at the mixture line:
ValueError: length not known: comp [id A]

I don’t understand its meaning, but I remember that mixture components need to be created with .dist(), so I change the component line to:

comp = pm.DensityDist.dist(logp=g.logp,random=g.random,shape=2)

The error message then becomes:
TypeError: 'DensityDist' object is not iterable
still pointing at the mixture-creation line.

I’m totally clueless. I don’t understand why this doesn’t work: a DensityDist is a distribution after all, so why doesn’t the Mixture accept it?

The error actually happens earlier:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
~/Documents/Github/pymc3/pymc3/distributions/mixture.py in _comp_means(self)
    118         try:
--> 119             return tt.as_tensor_variable(self.comp_dists.mean)
    120         except AttributeError:

AttributeError: 'DensityDist' object has no attribute 'mean'

I don’t think pm.Mixture was designed to handle DensityDist. As a temporary hack (!!), the code below seems to run:

with pm.Model() as model:
    mu = pm.HalfNormal('mu', sd=1, shape=2)
    sd = pm.HalfNormal('sd', sd=1, shape=2)
    g = pm.Normal.dist(mu=mu, sd=sd, shape=2)
    
    comp = pm.DensityDist('comp', logp=g.logp, random=g.random, shape=2)
    comp_dists = comp.distribution
    comp_dists.mean = comp_dists.mode = np.array([1, 2])
    
    w = pm.Dirichlet('w', a=np.array([1, 1]))
    mix = pm.Mixture('mix', w=w, comp_dists=comp_dists, observed=data)

Thanks a lot!!

But I must confess that I am so puzzled by this answer…

Why does the mixture need the mean and mode of the components?
Does it really use this information in the computation?

Does the following alternative make more sense than yours?

comp_dists.mean = comp_dists.mode = g.mean

Your solution ‘injects’ the answer ([1, 2]), which is exactly what I’m looking for…
I would not have this information in a real-data case.

I don’t think this is used in the computation, but somehow the current implementation requires this information.
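To see why the mean shouldn’t matter for inference, here is a minimal NumPy/SciPy sketch of the mixture log-density (my own illustration of what a mixture computes, not PyMC3’s actual code): it only combines the weights with each component’s log-density, so the component means never enter it directly:

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

x = np.array([0.5, 1.5, 3.0])              # some observed points
w = np.array([0.5, 0.5])                   # mixture weights
mus = np.array([1.0, 2.0])                 # component parameters
sds = np.array([1.0, 2.0])

# log p(x) = logsumexp_k( log w_k + log N(x | mu_k, sd_k) )
comp_logp = norm.logpdf(x[:, None], loc=mus, scale=sds)  # shape (3, 2)
mix_logp = logsumexp(np.log(w) + comp_logp, axis=1)

# cross-check against the direct (non-log) mixture density
direct = np.log((w * norm.pdf(x[:, None], loc=mus, scale=sds)).sum(axis=1))
assert np.allclose(mix_logp, direct)
```

So the mean/mode attributes look like bookkeeping (e.g. for default test values), not something the log-likelihood itself depends on.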

Yes, this should work. The [1, 2] is just a placeholder (hence I called it a hack…), so any number or tensor should work.


OK, can we hope that this will be fixed in a future version of PyMC3?
