How to plot prediction results in v5

I want to plot, export, and save the predictive results for my observations. I know how to do this in v3. I use

with model_reg:
    # pred_samples = pm.sample_posterior_predictive(f_pred)
    pred_samples = pm.sample_posterior_predictive(trace_reg, var_names=['f_pred'])

to get the predictions. Then I want to plot f_pred, like this:


what should I do?

What were you doing in v3 to make plots? The arviz API hasn’t changed much (at all?), so my guess is you just need some pointers related to working with xarrays. It would be good to see your starting point.

I want to plot the predicted samples f_pred at the X_new points with error bars. In v3, I used

X_new = np.linspace(np.floor(x.min()), np.ceil(x.max()), 100)[:, None]
with model_reg:
    f_pred = gp.conditional('f_pred', X_new)
with model_reg:
    pred_samples = pm.sample_posterior_predictive(trace_reg, var_names=['f_pred'])
_, ax = plt.subplots(figsize=(12, 5))
ax.plot(X_new, pred_samples.posterior['f_pred'].T, 'C1-', alpha=0.3)

And I got

When changing my code to v5, I think I need to change this part:

pred_samples.posterior['f_pred'].T,
but I’m not sure what to do.

You have only posted partial code, and the plotting code you posted will not produce the plot you posted (there is nothing there to compute the HDI, draw sample trajectories, etc.).
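On the API question itself: in v5, `pm.sample_posterior_predictive` returns an `InferenceData`, and the new draws live in the `posterior_predictive` (or `predictions`) group with separate `chain` and `draw` dimensions, so the old `.T` trick no longer applies directly. A toy sketch of the stacking step, using a stand-in `DataArray` with made-up shapes (the dim name `f_pred_dim` is just illustrative):

```python
import numpy as np
import xarray as xr

# Stand-in for idata.posterior_predictive['f_pred'] in v5: dims are
# (chain, draw, point) instead of v3's flat (sample, point) array
da = xr.DataArray(
    np.zeros((4, 250, 100)),
    dims=("chain", "draw", "f_pred_dim"),
)

# Collapse chain and draw into a single 'sample' dim; the remaining
# point dim comes first, so the result plots directly with ax.plot(X_new, ...)
flat = da.stack(sample=("chain", "draw"))
print(flat.shape)  # (100, 1000)
```

With 4 chains of 250 draws, `flat` has 1000 columns, one sample trajectory per column, which is the shape `ax.plot` expects for spaghetti plots.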

Here is a simple model and a plot of a conditional distribution. Perhaps it will be helpful?

import numpy as np
import pymc as pm
import matplotlib.pyplot as plt

# Generate fake data
rng = np.random.default_rng(123)

true_lengthscale = 0.2
true_eta = 2.0
true_cov = true_eta**2 * pm.gp.cov.ExpQuad(1, true_lengthscale)
# Add white noise to stabilise
true_cov += pm.gp.cov.WhiteNoise(1)

X = np.linspace(0, 2, 200)[:, None]
true_K = true_cov(X).eval()

y = pm.draw(
    pm.MvNormal.dist(mu=np.zeros(len(true_K)), cov=true_K, shape=true_K.shape[0]),
    draws=1,
    random_seed=rng,
)


# Fit GP
with pm.Model() as m:
    ls = pm.Gamma('ls', 2, 1)
    eta = pm.Exponential('eta', 1)
    cov = eta ** 2 * pm.gp.cov.ExpQuad(1, ls)
    
    gp = pm.gp.Marginal(cov_func=cov)
    sigma = pm.Exponential('sigma', 1)
    y_hat = gp.marginal_likelihood('y', X=X, y=y, sigma=sigma)
    idata = pm.sample(nuts_sampler='numpyro')

# Sample conditional distribution from new data
Xnew = np.linspace(0, 2, 100)[:, None]

with m:
    f_star = gp.conditional("f_star", Xnew=Xnew)
    idata = pm.sample_posterior_predictive(idata, var_names=['f_star'], predictions=True)

# Plot the results
fig, ax = plt.subplots(figsize=(14,4), dpi=144)
ax.plot(Xnew.ravel(), 
        idata.predictions.stack(sample=['chain', 'draw'])['f_star'].values,
        alpha=0.1, 
        c='0.5')
ax.plot(X.ravel(), y, c='k')
plt.show()
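If you want the error-bar style band from your v3 plot rather than (or on top of) the spaghetti lines, `az.plot_hdi` can draw an HDI envelope over the same draws. A sketch with fake draws standing in for `idata.predictions['f_star']` (the values are made up; only the `(chain, draw, point)` shape matters):

```python
import numpy as np
import arviz as az
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; drop this when working interactively
import matplotlib.pyplot as plt

# Fake posterior-predictive draws with PyMC v5's (chain, draw, point) layout
rng = np.random.default_rng(0)
Xnew = np.linspace(0, 2, 100)
f_star = np.sin(3 * Xnew) + rng.normal(0, 0.2, size=(4, 250, 100))

fig, ax = plt.subplots(figsize=(12, 4))
# Posterior mean, averaging over both chain and draw
ax.plot(Xnew, f_star.mean(axis=(0, 1)), c="C1", label="posterior mean")
# Shaded 94% HDI band as the "error bar"
az.plot_hdi(Xnew, f_star, hdi_prob=0.94, ax=ax)
ax.legend()
```

With your real data you would pass `idata.predictions['f_star']` (or its `.values`) in place of the fake `f_star` array.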

Thanks very much, that’s helpful. Also, when I use a Gaussian process it takes a very long time to sample. Is there any way to make it faster?

You can try an HSGP approximation, see this thread and the linked PyMCon videos for all the details.