Bayesian calibration on GP model

Thank you for the link. I had already seen this post but couldn’t quite find what was missing in my understanding. After going over it again, I think what I was looking for was the gp.marginal_likelihood function, that “tells” that some values were observed. Giving a free variable to the X argument would then allow to recover it, which is my objective.

However, I now struggle on shapes and theano conversion. In my toy example, I have a 1D GP, with a scalar observed value and a scalar parameter theta to recover. It seems that I can’t find how to properly give theta to the gp.marginal_likelihood function. Either the number of dimensions is too low or theano doesn’t want object typed variables.

Here is the example:

import numpy as np
import pymc3 as pm
import theano.tensor as tt

def f(x):
    return np.sin(10*x)

theta_train = np.linspace(0,1,7)
theta_true = 0.8

f_train = f(theta_train)
f_true = f(theta_true)

with pm.Model() as model:
    
    theta = pm.Uniform("theta", lower = 0, upper = 1)
    sigma = pm.HalfNormal("sigma", sd = 1)
    l_scale = pm.Gamma("l_scale", alpha= 1, beta = 5, shape = 2)
    s_n = pm.HalfNormal("s_n", sd = 1)
    
    cov = pm.gp.cov.Matern52(1, l_scale)
    
    gp = pm.gp.Marginal(cov_func = cov)
    
    theta_all = np.concatenate((theta_train,[theta]))
    f_all = np.concatenate((f_train,[f_true]))
    
    y1 = gp.marginal_likelihood("y1",
                                X=theta_all.reshape(-1,1),
                                y=f_all.reshape(-1,1),
                                noise=s_n)

Again I’m sorry if the answer only relies on basic theano shape handling.
Thank you for your lights !