Latent Gaussian Process sampling

I am using a Latent Gaussian Process with a Negative Binomial likelihood. I am running into an issue where sampling from the posterior gives almost identical values of f in every dimension, around ±35, which is the square root of the mean of my observed data. I understand that each dimension of f should converge over time, but I don't think every dimension should converge to the same value. Any ideas why this is happening?

Here is my code and the trace plot it produces:

import pymc3 as pm
import theano.tensor as tt

with pm.Model() as model:
    # Priors on the GP amplitude and length-scale (both constrained to be positive)
    a = pm.TruncatedNormal('amplitude', mu=1, sigma=10, lower=0)
    l = pm.TruncatedNormal('time-scale', mu=10, sigma=10, lower=0)
    cov_func = a**2 * pm.gp.cov.ExpQuad(input_dim=1, ls=l)

    # Latent GP prior over the time points t (t has shape (n, 1))
    gp = pm.gp.Latent(cov_func=cov_func)

    f = gp.prior('f', X=t)

    # Negative Binomial likelihood; f is squared to keep the mean positive
    alpha = pm.TruncatedNormal('alpha', mu=500, sigma=500, lower=0)
    y_ = pm.NegativeBinomial('y', mu=tt.square(f) + 1e-6, alpha=alpha, observed=y)

    trace = pm.sample(500, chains=1, tune=1000, target_accept=0.90)

It's hard to tell exactly without seeing your data, but it's interesting that f is mirrored around zero. What happens if you use tt.exp(f) instead of tt.square(f) as your link function?
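For concreteness, that change would just swap the likelihood line inside the model block above, something like:

    # exp link: mu = exp(f) is strictly positive and one-to-one in f,
    # so f is not mirrored around zero the way it is with tt.square(f)
    y_ = pm.NegativeBinomial('y', mu=tt.exp(f), alpha=alpha, observed=y)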

I tried with tt.exp(f) and f is now clustered around 7.1, which is the log of the mean of my dataset.

It looks like the model is not attributing much of the variation to temporal correlation in the GP. It might be informative to plot the rolling mean of your data across time (or a smoothed version with bandwidth close to time-scale) against the posterior estimates of exp(f), as in the sketch below. If the two are similar, then there may not be much else you can do without changing some prior assumptions.

Also, are you including an intercept term in your GP? It might help us understand what's going on if the values of f were clearly partitioned into a mean level plus deviations about that mean driven by the temporal correlation.
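Something like the following rough sketch could produce that comparison. It assumes t, y, and trace come from the exp-link version of the model above, and that t is roughly unit-spaced; the window size and variable names are only illustrative.

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    # Rolling window roughly equal to the posterior median time-scale
    # (assumes approximately unit spacing in t)
    window = max(int(np.median(trace['time-scale'])), 1)
    rolling_mean = pd.Series(y.ravel()).rolling(window, center=True, min_periods=1).mean()

    # Posterior mean of exp(f) at each time point
    mu_post = np.exp(trace['f']).mean(axis=0)

    plt.plot(t.ravel(), y.ravel(), 'k.', alpha=0.3, label='observed counts')
    plt.plot(t.ravel(), rolling_mean, label='rolling mean of y')
    plt.plot(t.ravel(), mu_post, label='posterior mean of exp(f)')
    plt.xlabel('time')
    plt.legend()
    plt.show()

As for the intercept, one option is to give the GP a constant mean function, e.g. gp = pm.gp.Latent(mean_func=pm.gp.mean.Constant(c), cov_func=cov_func) with a prior on c, so that f decomposes into a mean level plus temporal deviations around it.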


I realized I made a stupid mistake… I converted both my x’s and y’s to column vectors using [:,None], but that’s only necessary for the x’s. Thank you anyway for all your help!
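For anyone else who hits this: the GP input X needs to be a column vector, but the observed counts should stay one-dimensional. A minimal sketch, with t_raw and y_raw as placeholder names for the raw arrays:

    t = t_raw[:, None]   # GP inputs: shape (n, 1)
    y = y_raw            # observed counts: shape (n,), no [:, None] here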
