I’m working with a Multivariate Normal likelihood function. I want to model the observations as uncorrelated, so the covariance matrix should simplify to just each variable’s pointwise variance along the diagonal.
I’d like to give each variable’s variance its own hyper-prior. How does one best construct a covariance matrix like this?
Here’s some example code for reference:
# Define forward model (wrapped through theano)
with model:
    R, A, M = PDD_forward.forward(z_obs,
                                  f_s_prior,
                                  C_prior,
                                  f_r_prior,
                                  grad_a,
                                  A_m_prior)

    # Observation matrix (Mx3)
    mu_obs = np.array([R_obs, A_obs, M_obs]).T

    # "Prediction" matrix (Mx3)
    mu_pred = tt.transpose(tt.stack([R, A, M]))

    # Unclear how to formulate the covariance matrix
    # cov = ???

    # Likelihood function
    vals = pm.MvNormal('vals', mu=mu_pred, cov=cov, observed=mu_obs)
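For reference, the direction I’ve been leaning is to give each variable its own HalfNormal standard-deviation hyper-prior and stack the variances into a 3x3 diagonal matrix, something like the sketch below (the sigma_* names and the sd=1.0 scales are placeholders, not from my actual model). It would slot into the with model: block in place of the cov = ??? line:

    # Per-variable standard deviation hyper-priors (placeholder names/scales)
    sigma_R = pm.HalfNormal('sigma_R', sd=1.0)
    sigma_A = pm.HalfNormal('sigma_A', sd=1.0)
    sigma_M = pm.HalfNormal('sigma_M', sd=1.0)

    # Uncorrelated covariance: 3x3 diagonal matrix of the per-variable variances
    cov = tt.diag(tt.stack([sigma_R**2, sigma_A**2, sigma_M**2]))

But I’m not sure that’s the right way to set it up.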
If my observation matrix is M \times 3, then what shape should the (uncorrelated) covariance matrix be?
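Alternatively, since the variables are uncorrelated anyway, I’ve wondered whether I could skip MvNormal entirely and use independent Normal likelihoods with a per-column standard deviation (again, sigma and the sd=1.0 scale are just placeholders). This would go in place of the MvNormal line:

    # Independent Normals with one sd per column, broadcast over the M rows
    sigma = pm.HalfNormal('sigma', sd=1.0, shape=3)
    vals = pm.Normal('vals', mu=mu_pred, sd=sigma, observed=mu_obs)

Would that be equivalent to the diagonal-covariance MvNormal?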
For reference, I’m trying to mimic the likelihood function (Eqn. 13) from this paper (in my case tri-variate instead of bi-variate).
If I can provide any more info or clarify anything, please let me know!
Thanks in advance.
Best,
Andrew