Hi There,
Over the last few weeks I’ve been trying to translate a (somewhat cryptic) paper from the 80s and implement a version of its problem in PyMC3, leveraging variational inference, but I’m somewhat stuck and would love it if someone could point me in the right direction.
The problem is to use Bayes' theorem to estimate theta ~ MVN(mu, cov=V).
We have prior estimates of mu and V, but also (and more importantly) want to incorporate prior information about how another vector g, of length p, relates to the n elements of theta. That relationship is described by a p x n matrix H whose values give the proportion of each element of theta that makes up the corresponding element of g, so that each entry of g equals the corresponding row of H dotted with theta (i.e. the row sum of the element-wise product). The uncertainty and covariance of the observations g are specified by sigma, so the paper writes this prior information as Y ~ MVN(H * theta, cov = sigma).
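To summarise my (possibly wrong) reading of the setup in notation, with n = 6 and p = 4 in the toy example below:

theta ~ MVN(mu0, cov=V0)                 # prior over the n unknowns
g | theta ~ MVN(H @ theta, cov=sigma)    # p aggregate observations, H is p x n

i.e. a linear-Gaussian model where each observed g[i] is row i of H dotted with the latent theta, with correlated observation noise given by sigma.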
The paper gives a toy problem and its solution. Here is what I’ve got so far in trying to replicate it:
import numpy as np

# Prior mean for theta (all six entries are equal)
mu0 = np.array([5.9433962] * 6)
V0=3.53*np.identity(6)
# H maps the 6 elements of theta onto the 4 aggregate observations g (p x n = 4 x 6)
H = np.array([[0,   0, 1, 0, 1, 1],
              [1,   1, 1, 0, 0, 0],
              [0.3, 1, 1, 0, 0, 0],
              [0.3, 0, 0, 1, 1, 0]])
# Observation covariance for g (4 x 4, symmetric)
sigma = np.array([[7.04,  1.41, 1.65,  0.65],
                  [1.41,  2.84, 1.25, -0.60],
                  [1.65,  1.25, 1.60,  0.10],
                  [0.65, -0.60, 0.10,  0.50]])
from pymc3 import Model, MvNormal, fit

with Model() as Maher_model:
    prior = MvNormal('prior', mu=mu0, cov=V0, shape=6)
    # g is the observed 4-vector from the paper (not reproduced here); this mean term is the bit I'm least sure about (see my questions below)
    likelihood = MvNormal('likelihood', mu=np.dot(H, mu0), cov=sigma, shape=4, observed=g)
    samples2 = fit(random_seed=RANDOM_SEED).sample(1000)
Their answers are a posterior mean of (9.88, 3.43, 4.75, 5.09, 6.41, 7.27).T and posterior variances (the diagonal of the posterior V) of (1.64, 1.76, 1.88, 1.85, 1.76, 2.45).T.
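If my reading of the model is right, this is a linear-Gaussian setup with a conjugate posterior, so the paper's numbers should also be recoverable in closed form. This is just a sketch of my understanding (conjugate_posterior is my own name for it, and g would be the observed vector from the paper, which I haven't typed out here):

import numpy as np

def conjugate_posterior(mu0, V0, H, sigma, g):
    """Closed-form posterior for theta ~ MVN(mu0, V0), g ~ MVN(H theta, sigma)."""
    V0_inv = np.linalg.inv(V0)
    sigma_inv = np.linalg.inv(sigma)
    V_post = np.linalg.inv(V0_inv + H.T @ sigma_inv @ H)      # posterior covariance
    mu_post = V_post @ (V0_inv @ mu0 + H.T @ sigma_inv @ g)   # posterior mean
    return mu_post, V_post

But I'd still like to reproduce this with PyMC3 and variational inference rather than rely on the closed form.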
I realised pretty quickly that there are a few big gaps in my knowledge:
- how would I represent these nested observations, i.e. g and the way g relates (through H) to the latent theta whose posterior I want? My current best guess is sketched after these questions.
- how do I incorporate prior information when it is a complete multivariate distribution over the quantity I want the posterior for? All the examples I’ve seen seem to just put priors on the parameters of the likelihood.
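For reference, this is roughly what I think the model might need to look like. I'm not at all sure that feeding the latent theta through H with pm.math.dot is the right way to express the relationship, and g here would again be the observed vector from the paper:

import pymc3 as pm

with pm.Model() as maher_model:
    # Latent quantity of interest
    theta = pm.MvNormal('theta', mu=mu0, cov=V0, shape=6)
    # Observations: the mean uses the latent theta (H @ theta), not the fixed prior mean mu0
    g_obs = pm.MvNormal('g_obs', mu=pm.math.dot(H, theta), cov=sigma, shape=4, observed=g)
    approx = pm.fit(random_seed=RANDOM_SEED)   # ADVI by default
    trace = approx.sample(1000)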
Sorry if these are stupid questions - I really appreciate any support or advice I can get.
Mitch