@NateAM Sorry for the late response, I went on break just as you responded.
I looked over the different options you suggested, tried them out, and decided that I am just going to recreate the GP model in PyMC3. As a refresher, I am trying to do level 1 inference on the (13) parameters of my data, and I do not really care about the distributions of my hyperparameters. I have a couple of questions and want to check whether I am heading in the right direction with my code.
In your past tutorial comments, you used a custom likelihood function for Bayesian inference. I am using an MvNormal likelihood (Normal for testing). I have already built my GP model in PyMC3, but I am having problems with the Bayesian inference side. Here is my model:
```python
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as bamodel:
    var1 = pm.Uniform('rhoe', lower=16.0, upper=18.0, testval=16.94)
    var2 = pm.Uniform('alpha', lower=15.8, upper=19.2, testval=17.5)
    ...
    var13 = pm.Uniform('var13', lower=6.0, upper=10.2, testval=8.16)

    # Stack the 13 parameters into a single (1, 13) input point for the GP.
    theta = tt.stack([var1, var2, ..., var13], axis=0).reshape([1, 13])

    # `mp` is the point estimate of the GP hyperparameters.
    mean, cov = gp.predict(theta, point=mp, diag=True)  # gp.predictt(theta, diag=True)

    sigNoise = pm.HalfNormal('sigNoise', sigma=0.5, testval=0.05)
    cov_noise = pm.Deterministic("cov_noise", cov + sigNoise**2)

    # cov_noise is a variance, so pass sigma = sqrt(variance);
    # tau would be a precision (1/variance), not a variance.
    likelihood = pm.Normal("likelihood", mu=mean, sigma=tt.sqrt(cov_noise), observed=yobs)

    trace = pm.sample(1000, chains=2, cores=1, init='adapt_diag', start=start_vals)
```
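One side note on the noise parameterization: in PyMC3, `Normal` interprets `tau` as a precision (1/variance), while `cov + sigNoise**2` is a variance, so passing one where the other is expected changes the density. A quick SciPy check with assumed toy numbers:

```python
import numpy as np
from scipy import stats

# Toy stand-ins (assumed values) for one GP prediction and one observation.
mean, var = 0.3, 0.04   # `var` plays the role of cov_noise = cov + sigNoise**2
yobs = 0.5

# Correct: treat var as a variance -> sigma = sqrt(var) (equivalently tau = 1/var).
logp_sigma = stats.norm.logpdf(yobs, loc=mean, scale=np.sqrt(var))

# Incorrect: passing the variance where a precision is expected
# implies scale = sqrt(1/var), a very different density.
logp_as_tau = stats.norm.logpdf(yobs, loc=mean, scale=np.sqrt(1.0 / var))

print(logp_sigma, logp_as_tau)
```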
I created a separate model for the level 1 inference and called the GP model's predictt function (which outputs the symbolic mean prediction and covariance), but I kept getting a Theano error: “Input 0 of the graph (indices start from 0), used to compute Elemwise…” I replaced it with gp.predict, but that seems to just draw from the prior, since the predict function outputs a deterministic value. Is this code heading in the right direction?
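For context on the predict/predictt distinction: predict evaluates the GP conditional numerically at a fixed point (so inside a model it is a constant, which is why sampling looks like the prior), whereas predictt builds the same expressions symbolically so they can depend on random variables like theta. The underlying computation is just the standard GP conditional; here is a minimal NumPy sketch of it, with an assumed RBF kernel and toy 1-D data (not your actual surrogate):

```python
import numpy as np

def rbf(A, B, ls=0.2, eta=1.0):
    """Squared-exponential kernel: eta^2 * exp(-(a - b)^2 / (2 ls^2))."""
    return eta**2 * np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / ls**2)

# Toy training data standing in for the GP surrogate's design points.
X = np.linspace(0.0, 1.0, 5)
y = np.sin(2.0 * np.pi * X)
noise = 1e-4  # observation noise variance

# Standard GP conditional at new inputs Xnew; this is the (deterministic)
# computation that predict performs once the hyperparameters are fixed.
Xnew = np.array([0.25, 0.75])
K = rbf(X, X) + noise * np.eye(len(X))
Ks = rbf(Xnew, X)
mu = Ks @ np.linalg.solve(K, y)                        # posterior mean
cov = rbf(Xnew, Xnew) - Ks @ np.linalg.solve(K, Ks.T)  # posterior covariance
var = np.diag(cov)                                     # diag=True analogue
```

Since the toy test inputs coincide with training points, the posterior mean recovers the training targets and the posterior variance collapses to roughly the noise level.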
Thank you again for your last comment; all of the information was extremely helpful.