Sampling from a GP posterior with fixed hyperparameters

import numpy as np
import pymc3 as pm

n = 200
X_new = np.linspace(0, 100, n)[:, None]
# mean and cov were defined earlier in the original script; a zero mean and
# ExpQuad kernel matching the fixed hyperparameters below are assumed here
mean = pm.gp.mean.Zero()
cov = 2**2 * pm.gp.cov.ExpQuad(1, 2)
y_true = np.random.multivariate_normal(mean(X_new).eval(), cov(X_new).eval() + 1e-8*np.eye(n), 1).flatten()
noise_obs = 0.5
X_obs = X_new
y_obs = y_true + noise_obs * np.random.randn(n)

with pm.Model() as model1:
    lengthscale = 2
    signal_radial = 2
    noise_obs = 0.5
    covariance = signal_radial**2 * pm.gp.cov.ExpQuad(1, lengthscale)  # ExpQuad assumed
    mean = pm.gp.mean.Zero()
    gp = pm.gp.Marginal(mean_func=mean, cov_func=covariance)
    y = gp.marginal_likelihood("y", X=X_obs, y=y_obs, noise=noise_obs)
    f = gp.conditional("f", X_new)
    mp = pm.find_MAP()
    pred_samples = pm.sample_ppc([mp], vars=[f], samples=500)

Why is this so slow? I understand there are no free variables and that MAP is not a great fit here. But what would be the right way to do it?

You should move f = gp.conditional("f", X_new) out of the model block and add it after inference. It is not needed during sampling, but it significantly increases the computation cost.


Hi, thanks for your comment. You're right, but the model needs some free variable to sample, which is why I have f inside the block. Is there another way of sampling from the GP posterior?

When I separate it from the model, I get:

error: failed in converting 4th argument `u’ of _lbfgsb.setulb to C/Fortran array

If I use NUTS (pm.sample() instead of a MAP estimate), it's still slower than normal, at 4.04 it/s.

I found a workaround for my problem (by setting the s.d. very small), but I would still be interested to know if there is a better way.

Oh I see, you want to sample from the conditional likelihood. Did you try sample_prior_predictive?


If I were you, I'd call the hidden method _build_conditional directly. Each GP implementation has it, and it returns the mean and covariance as Theano variables. Check what inputs it needs; then you should be able to just do mu.eval() and cov.eval(). Since there are no unknown parameters, you are pretty much working directly with Theano here.
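
For reference, with every hyperparameter fixed, the mean and covariance that _build_conditional constructs are the standard GP conditioning formulas, so they can also be computed by hand. A NumPy sketch assuming the ExpQuad kernel and the values from the thread (lengthscale 2, signal 2, noise 0.5), with toy observations standing in for the data in the post:

```python
import numpy as np

def exp_quad(Xa, Xb, ls=2.0, sigma=2.0):
    # ExpQuad kernel: sigma^2 * exp(-0.5 * (x - x')^2 / ls^2)
    return sigma**2 * np.exp(-0.5 * (Xa[:, None] - Xb[None, :])**2 / ls**2)

rng = np.random.default_rng(0)
X_obs = np.linspace(0, 100, 200)
y_obs = np.sin(X_obs / 10.0) + 0.5 * rng.standard_normal(200)
X_new = np.linspace(0, 100, 200)
noise = 0.5

K = exp_quad(X_obs, X_obs) + noise**2 * np.eye(200)  # noisy-observation covariance
K_s = exp_quad(X_obs, X_new)
K_ss = exp_quad(X_new, X_new)

# conditional: mu = K_s^T K^{-1} y,  cov = K_ss - K_s^T K^{-1} K_s
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
mu = K_s.T @ alpha
v = np.linalg.solve(L, K_s)
cov = K_ss - v.T @ v

# posterior draws are just samples from this fixed multivariate normal
samples = rng.multivariate_normal(mu, cov + 1e-6 * np.eye(200), size=500)
```

No Theano graph or sampler is involved at all, which is why this route is fast.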


@junpenglao @bwengals Thanks, I will try both of them.