Hi folks,

I’m fitting a periodic time series using a Gaussian process with a linear mean and a Periodic covariance function:

alpha = pm.Normal('alpha', 0, 1000)    # intercept
beta = pm.Normal('beta', 0, 1000)      # slope
sigmasq = pm.Gamma('sigmasq', .1, .1)  # variance
phi = pm.Gamma('phi', 1., 1.)          # lengthscale
cov = pm.gp.cov.Periodic(1, 365., phi)
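For reference, the Periodic kernel is k(x, x') = exp(−2 sin²(π|x − x'| / T) / ℓ²), so with T = 365 the covariance repeats exactly at yearly lags. A quick NumPy sketch (the variance multiplier and the parameter values here are illustrative assumptions, not draws from the priors above) checks that behaviour:

```python
import numpy as np

def periodic_kernel(x1, x2, variance, period, ls):
    """Periodic covariance, matching pm.gp.cov.Periodic up to the
    variance scaling, which PyMC applies by multiplying the kernel."""
    r = np.pi * np.abs(x1 - x2) / period
    return variance * np.exp(-2.0 * np.sin(r) ** 2 / ls ** 2)

# hypothetical values standing in for sigmasq and phi
variance, period, ls = 1.5, 365.0, 0.7

k_same = periodic_kernel(0.0, 0.0, variance, period, ls)
k_lag = periodic_kernel(0.0, 365.0, variance, period, ls)  # one full period apart

assert np.isclose(k_same, variance)  # k(x, x) equals the variance
assert np.isclose(k_lag, k_same)     # exactly periodic: k(x, x + T) = k(x, x)
```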

In terms of fitting, should there be a difference between:

gp = pm.gp.Latent(cov_func=cov)
s = gp.prior('s', X=X)
eta = alpha + beta*X + s
y_ = pm.Binomial('y', N, pm.invlogit(eta), observed=y)


mean_f = pm.gp.mean.Linear(coeffs=beta, intercept=alpha)
gp = pm.gp.Latent(mean_f, cov)
eta = gp.prior('eta', X=X)
y_ = pm.Binomial('y', N, pm.invlogit(eta), observed=y)


NUTS has trouble with the first option, but not the second. Digging through the PyMC3 source code, it looks as though they should be identical specifications in terms of the shape of the posterior…
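The equivalence can be checked numerically: a zero-mean GP on s with eta = alpha + beta*X + s assigns the same density to a given eta as a GP whose mean function is alpha + beta*X. A pure-NumPy sketch (the squared-exponential kernel and the fixed alpha, beta values are stand-ins for the model above, not taken from it):

```python
import numpy as np

def mvn_logpdf(x, mu, K):
    """Log-density of a multivariate normal, computed via the Cholesky factor."""
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L, x - mu)
    return -0.5 * a @ a - np.log(np.diag(L)).sum() - 0.5 * len(x) * np.log(2 * np.pi)

rng = np.random.default_rng(42)
X = np.linspace(0.0, 10.0, 6)
alpha, beta = 1.0, 0.5          # hypothetical fixed parameter values
m = alpha + beta * X            # the linear mean

# toy covariance (squared-exponential stand-in for the Periodic kernel)
K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2) + 1e-9 * np.eye(len(X))

f = rng.standard_normal(len(X))  # an arbitrary latent function value

# Option 1: zero-mean GP on s, with eta = alpha + beta*X + s, so s = f - m
lp1 = mvn_logpdf(f - m, np.zeros(len(X)), K)
# Option 2: GP with the linear mean built into the mean function
lp2 = mvn_logpdf(f, m, K)

assert np.isclose(lp1, lp2)      # identical log-densities, as expected
```

So the two specifications define the same distribution over eta; any sampling difference would have to come from how the parameterization is implemented, not from the math.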

Any thoughts?



In terms of the mathematical formulation they seem to be the same, but it could be that in the second model the Cholesky decomposition of the GP already accounts for the linear mean, which makes the computation more efficient.
