# Converting Edward Model to PyMC

I am very new to PyMC (and probabilistic programming) and I’m trying to follow a tutorial made for Edward:
https://www.ritchievink.com/blog/2018/06/05/clustering-data-with-dirichlet-mixtures-in-edward-and-pymc3/

```python
k = 3  # number of clusters

d = df.shape[1]  # number of variates
n = df.shape[0]  # number of observations

pi = ed.models.Dirichlet(tf.ones(k))
mu = ed.models.Normal(tf.zeros(d), tf.ones(d), sample_shape=k)  # shape (3, 4): 3 Gaussians, 4 variates
sigmasq = ed.models.InverseGamma(tf.ones(d), tf.ones(d), sample_shape=k)
x = ed.models.ParamMixture(pi, {'loc': mu, 'scale_diag': tf.sqrt(sigmasq)},
                           ed.models.MultivariateNormalDiag,
                           sample_shape=n)
z = x.cat
```

I especially do not understand how to create those 2D-shaped distributions. (As far as I understand, they are not multivariate, or am I wrong?)

```python
t = 500  # number of samples

qpi = ed.models.Empirical(tf.get_variable('qpi', shape=[t, k], initializer=tf.constant_initializer(1 / k)))
qmu = ed.models.Empirical(tf.get_variable('qmu', shape=[t, k, d], initializer=tf.zeros_initializer()))
qsigmasq = ed.models.Empirical(tf.get_variable('qsigmasq', shape=[t, k, d], initializer=tf.ones_initializer()))
qz = ed.models.Empirical(tf.get_variable('qz', shape=[t, n], initializer=tf.zeros_initializer(), dtype=tf.int32))

inference = ed.Gibbs({pi: qpi, mu: qmu, sigmasq: qsigmasq, z: qz},
                     data={x: y})
```

What would the inference look like in PyMC?

I hope that’s not too much to ask. It would help my learning process a lot.


Welcome!

I might suggest looking through the Gaussian Mixture Model notebook, which has something very similar but written for version 4 of PyMC (and makes use of the now-default coords/dims rather than shapes). If that doesn’t help, let us know and someone can help get you what you need.
