Using result of Gaussian process within a different model

Hi everyone!

I’m a bit new to PyMC (and to Gaussian processes in general), so sorry in advance for some imprecision.

I have a function f(x, \alpha, \beta) that depends on three parameters and that will be part of my model. However, I only have access to its values on a discrete set of (x, \alpha, \beta) points. My idea is to interpolate it with a Gaussian process, so that whenever I need the value of f, I would sample from the Gaussian process.

My question is then: after having done the “interpolation”, how can I sample it within a different Bayesian model? Something like:

import pymc as pm
import aesara.tensor as at

with pm.Model() as model:
    x = pm.Data('x', X, mutable=True)
    # Priors
    alpha = pm.Uniform('alpha', 0, 20)
    beta  = pm.Uniform('beta', 0, 20)
    sigma = pm.HalfNormal('sigma', 10)
    # Part from the GP (what I don't know how to do...)
    prms_new = at.stack((x, alpha, beta)).T
    with gp_model:
        f_pred = gp.conditional('f_pred', prms_new)
    y = pm.Deterministic('y', f_pred)
    # Likelihood
    pm.Normal('obs', mu=y, sigma=sigma, observed=Y_obs)
    trace = pm.sample()

I’ve come across similar questions in other threads, but if this has been answered before, I’m afraid I didn’t understand the answer…

Thanks in advance!

Is it possible to combine both models into one larger model? You’d have to fit the GP within your second model. I think this would be the most straightforward approach: you’d call conditional like you did, and the code should work as you’d expect.

If you break it into two separate models, you’d have to do this but specialize it for GPs. I haven’t tried doing it, though! I know it’s been a couple of weeks since you posted, but I’d be curious to hear if you tried this and it works.
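For intuition, here is the formula that gp.conditional evaluates under the hood, written as a minimal plain-NumPy sketch (not the PyMC API; the RBF kernel, lengthscale, and the toy function f = x + alpha*beta are illustrative choices): the GP posterior mean at new points is K_*^T (K + noise*I)^{-1} y.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior_mean(X, y, X_new, ls=1.0, noise=1e-4):
    # Conditional (posterior) mean: K_*^T (K + noise*I)^{-1} y
    K = rbf(X, X, ls) + noise * np.eye(len(X))
    K_star = rbf(X, X_new, ls)
    return K_star.T @ np.linalg.solve(K, y)

# Toy setup: f(x, alpha, beta) = x + alpha*beta, known only on scattered points.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 1, size=(200, 3))               # (x, alpha, beta) points
y_train = X_train[:, 0] + X_train[:, 1] * X_train[:, 2]  # f on those points
x_query = np.array([[0.5, 0.5, 0.5]])
f_interp = gp_posterior_mean(X_train, y_train, x_query, ls=0.5)
# f_interp should be close to f(0.5, 0.5, 0.5) = 0.75
```

In the two-model route, you would fit the GP hyperparameters once, then reuse this conditional-mean computation (or the full conditional distribution) inside the second model, instead of calling gp.conditional within it.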


Hi @bwengals. Thanks for the reply!

I have tried joining both models into a larger one. This works fine for a simple model. In the end, though, I would need to perform inference using predictions from two different Gaussian processes at the same time. Is it a good idea to have two Gaussian processes plus the Bayesian inference inside the same model and run everything at once?

As for your second suggestion, I’m currently trying to implement it. I’ll share whether it works (or not).


The only reason not to is speed, I think, but that’s usually a big reason.