I'm very sorry for the late reply.
I used pm.DensityDist, reduced the model to one-dimensional data, and re-ran the experiment, but I ran into an error.
Here is my code:
import pymc3 as pm
import theano.tensor as tt
from pymc3.math import logsumexp

K = 8
n_samples = N  # N (number of observations) is defined earlier

def KMM_logp(weight, aj, bj):
    def logp(value):
        Ncomps = 8  # number of mixture components (same as K)
        logps = []
        for i in range(Ncomps):
            # Reparameterize component i: concentration aj[i], mean bj[i]
            a = aj[i] * bj[i]
            b = aj[i] * (1 - bj[i])
            # Beta(a, b) log-density at value
            result = (tt.gammaln(a + b) - tt.gammaln(a) - tt.gammaln(b)
                      + (a - 1) * tt.log(value) + (b - 1) * tt.log(1 - value))
            logps.append(tt.log(weight[i]) + result)
        # Mixture log-likelihood: log-sum-exp over components, summed over observations
        return tt.sum(logsumexp(tt.stacklists(logps)[:, :n_samples], axis=0))
    return logp
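(stick_breaking isn't shown above; I'm using something like the standard stick-breaking construction from the PyMC3 Dirichlet process mixture example:)

def stick_breaking(beta):
    # Convert the Beta draws into mixture weights via stick-breaking
    portion_remaining = tt.concatenate([[1], tt.extra_ops.cumprod(1 - beta)[:-1]])
    return beta * portion_remaining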
with pm.Model() as model:
    alpha = pm.Gamma('alpha', 1., 1.)
    beta = pm.Beta('beta', 1., alpha, shape=K)
    w = pm.Deterministic('w', stick_breaking(beta))
    Phi = pm.InverseGamma('Phi', alpha=1, beta=1)
    Delte = pm.Exponential('Delte', lam=1/8)
    Mu = pm.InverseGamma('Mu', alpha=1, beta=1)
    Eta = pm.Uniform('Eta', lower=0.0, upper=1.0)
    aj = pm.InverseGamma('aj', alpha=Phi, beta=Delte, shape=K)
    bj = pm.Beta('bj', alpha=Mu*Eta, beta=Mu*(1-Eta), shape=K)
    obs = pm.DensityDist('obs', KMM_logp(w, aj, bj), observed=X)
When I run MCMC sampling, I get this error:

AttributeError: Can't pickle local object 'KMM_logp.<locals>.logp'

Variational inference runs without error, but the results are poor.
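If I understand the traceback, pm.sample tries to pickle the logp closure so it can send it to the worker processes that run parallel chains, and a function defined inside another function can't be pickled. Assuming the pickling only happens when multiple cores are used, one workaround I'm considering is to run the chains in a single process:

with model:
    # cores=1 keeps sampling in one process, so the local logp closure
    # never needs to be pickled; chains run sequentially instead of in parallel.
    # (On older PyMC3 versions the argument is njobs=1 instead of cores=1.)
    trace = pm.sample(2000, tune=1000, cores=1)

Alternatively, moving logp to module level (e.g. binding weight, aj, and bj with functools.partial) should make it picklable, but I haven't verified that.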
