Avoiding looping when using GP prior on latent variables

That’s an interesting idea @ckrapu. I will explore it further.

Here’s another idea, though it takes us away from the nice aspects of pm.gp.

with pm.Model() as base_model:

    cov_gamma = 1.0 * pm.gp.cov.ExpQuad(input_dim=1, ls=1.0) + pm.gp.cov.WhiteNoise(1e-6)
    # W holds the 52 input locations; covariance functions expect a 2-D (n, 1) input
    cov_gamma_eval = cov_gamma(W[:, None])
    gamma_all = pm.MvNormal("gamma_all", mu=np.zeros(len(W)),
                            cov=cov_gamma_eval, shape=(len(I), len(W)))

In this scenario, I use the shape argument of MvNormal to draw len(I) independent samples, each a random vector of length len(W) (the multivariate normal density is over a vector of size len(W)).
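A minimal NumPy sketch of those shape semantics, with stand-in values len(I)=3 and len(W)=52 (the kernel is built by hand here just to mirror the ExpQuad-plus-jitter covariance above):

```python
import numpy as np

n_series, n_points = 3, 52          # stand-ins for len(I) and len(W)
W = np.linspace(0.0, 1.0, n_points)

# Squared-exponential kernel (ls=1.0) plus white-noise jitter,
# mirroring 1.0 * ExpQuad + WhiteNoise(1e-6) above
diff = W[:, None] - W[None, :]
K = np.exp(-0.5 * diff**2) + 1e-6 * np.eye(n_points)

# shape=(len(I), len(W)) in pm.MvNormal corresponds to len(I) i.i.d.
# draws from the same len(W)-dimensional multivariate normal
rng = np.random.default_rng(0)
gamma_all = rng.multivariate_normal(np.zeros(n_points), K, size=n_series)
print(gamma_all.shape)  # (3, 52)
```

So each row of gamma_all is one GP realization over the 52 inputs, and there is no Python-level loop over the len(I) series.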

Thoughts on this?