Use saved Gaussian model from scikit-learn in PyMC3?

Hi @naitikshukla,
(This is the same as Finding posterior for calibration using saved Gaussian model in pymc3 right? sorry about the non-response).

What you want to do can certainly be done in PyMC3. However, I would suggest building the GP in PyMC3 and calibrating it there instead, so you can perform the inference in one coherent framework.

Otherwise, if you still want to use the fitted GP from scikit-learn, what you can do is extract the parameters from the fitted GP, namely the mean and standard deviation (or standard error) of Q1, Q2, Q3. Then, instead of a Uniform, use a Normal distribution to define each of them in the pm.Model:

with pm.Model() as model:
    # mu_q* and sd_q* are the point estimates and uncertainties
    # extracted from the fitted scikit-learn GP
    Q1 = pm.Normal('Q1', mu_q1, sd_q1)
    Q2 = pm.Normal('Q2', mu_q2, sd_q2)
    Q3 = pm.Normal('Q3', mu_q3, sd_q3)
    gp = ... # use Q1, Q2, Q3 to build a GP in PyMC3, and calibrate it using the new observation
             # more details in http://docs.pymc.io/notebooks/GP-Marginal.html
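As for where mu_q1 etc. come from: here is a minimal sketch of pulling the fitted hyperparameters out of scikit-learn, assuming your saved model is a GaussianProcessRegressor with a Constant × Matern + White kernel (the `k1__...`/`k2__...` parameter names depend on that kernel structure). Note that scikit-learn's MLE fit only gives point estimates, so the sd_q* prior widths are something you have to choose yourself (e.g. from a Hessian-based standard error, or just a reasonable fraction of the mean):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, Matern, WhiteKernel

# toy data standing in for your real training set
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(30, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(30)

# amplitude**2 * Matern(nu=2.5) + observation noise
kernel = ConstantKernel(1.0) * Matern(length_scale=1.0, nu=2.5) + WhiteKernel(0.1)
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)

# point estimates of the fitted hyperparameters -> priors' means
params = gpr.kernel_.get_params()
mu_q1 = np.sqrt(params["k1__k1__constant_value"])  # amplitude (η)
mu_q2 = params["k1__k2__length_scale"]             # lengthscale (ℓ)
mu_q3 = np.sqrt(params["k2__noise_level"])         # noise sd (σ)
```
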

I guess Q1, Q2, and Q3 are the parameters of the kernel function, so it would be similar to:

with pm.Model() as model:  # don't name the model `gp`, it would be shadowed below
    ℓ = pm.Gamma("ℓ", alpha=2, beta=1)
    η = pm.HalfCauchy("η", beta=5)
    cov = η**2 * pm.gp.cov.Matern52(1, ℓ)
    gp = pm.gp.Marginal(cov_func=cov)

    σ = pm.HalfCauchy("σ", beta=5)
    y_ = gp.marginal_likelihood("y", X=X, y=y, noise=σ)
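For reference, the covariance that `η**2 * pm.gp.cov.Matern52(1, ℓ)` evaluates can be written out in plain NumPy. This is just a sketch with fixed numbers standing in for the random variables, to make explicit what the two hyperparameters control:

```python
import numpy as np

def matern52(X, Xs, eta, ell):
    """eta**2 times the Matern 5/2 kernel on 1-D inputs."""
    r = np.abs(X[:, None] - Xs[None, :]) / ell  # scaled pairwise distances
    return eta**2 * (1 + np.sqrt(5) * r + 5 * r**2 / 3) * np.exp(-np.sqrt(5) * r)

X = np.linspace(0, 1, 5)
K = matern52(X, X, eta=2.0, ell=0.5)
# the diagonal equals eta**2, and K is symmetric
```
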

In your case it would go like this:

with model:  # the model defined above with Q1, Q2, Q3
    cov = Q1**2 * pm.gp.cov.Matern52(1, Q2)
    gp = pm.gp.Marginal(cov_func=cov)
    y_ = gp.marginal_likelihood("y", X=X, y=y, noise=Q3)

Of course, you need to make sure that Q1, Q2, Q3 in this case correspond to the right input parameters for pm.gp.

Let me know if there is anything unclear :slight_smile:
