I am using a latent Gaussian process with a Negative Binomial likelihood. I am running into an issue where sampling from the posterior gives almost identical f values in every dimension (around ±35, which is the square root of the mean of my observed counts). I understand that each dimension of f should converge over time, but I don't think every dimension should end up at the same value. Any ideas why this is happening?
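For concreteness, here is a quick sanity check (a sketch with a made-up observed count of 1225, so that its square root is 35, and scipy standing in for the PyMC likelihood): because mu = f**2, the likelihood is exactly the same for f = +35 and f = -35, so the sign of f is not identified.

```python
from scipy import stats

# Hypothetical check (not my real data): take an observed count of
# 1225, whose square root is 35. Under mu = f**2 the Negative
# Binomial likelihood cannot distinguish f = +35 from f = -35,
# since squaring discards the sign.
y_obs = 1225
alpha = 500  # same order as the mean of my alpha prior

for f in (35.0, -35.0):
    mu = f ** 2
    # scipy parameterizes the NB by (n, p); convert from (mu, alpha)
    p = alpha / (alpha + mu)
    print(f, stats.nbinom.logpmf(y_obs, alpha, p))
```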
Here is my code and the trace plot it produces:
```python
import pymc3 as pm
import theano.tensor as tt

# t (GP inputs, shape (n, 1)) and y (observed counts) are defined elsewhere
with pm.Model() as model:
    a = pm.TruncatedNormal('amplitude', mu=1, sigma=10, lower=0)
    l = pm.TruncatedNormal('time-scale', mu=10, sigma=10, lower=0)
    cov_func = a**2 * pm.gp.cov.ExpQuad(input_dim=1, ls=l)
    gp = pm.gp.Latent(cov_func=cov_func)
    f = gp.prior('f', X=t)
    alpha = pm.TruncatedNormal('alpha', mu=500, sigma=500, lower=0)
    y_ = pm.NegativeBinomial('y', mu=tt.square(f) + 1e-6, alpha=alpha, observed=y)
    trace = pm.sample(500, chains=1, tune=1000, target_accept=0.90)
```