Vanilla implementation of Gaussian Process in PyMC3?

Hi there, thanks- this was pretty helpful!

Don’t know if you had a chance to skim through the blog post I linked, but I think I’ve got a pretty decent grasp of the mechanics of GPs from a ~frequentist perspective. The author arrives at the optimal kernel hyperparameters (L and sigma for the ExpQuad covariance function) by optimizing the log marginal likelihood.
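To make sure I’m following, here’s a toy sketch of what I understand that frequentist recipe to look like in plain NumPy/SciPy (made-up data, a fixed known noise level, and my own variable names — not the blog author’s actual code):

```python
# Toy sketch: pick the ExpQuad hyperparameters (length-scale ell, signal
# amplitude sigma) by maximizing the log marginal likelihood of a GP
# regression model. Data and initial values are made up for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = np.linspace(0.0, 10.0, 30)
y = np.sin(X) + 0.1 * rng.standard_normal(30)
noise = 0.1  # assume the observation noise std is known, for simplicity

def neg_log_marginal_likelihood(params):
    log_ell, log_sigma = params            # optimize on the log scale
    ell, sigma = np.exp(log_ell), np.exp(log_sigma)
    sq = (X[:, None] - X[None, :]) ** 2
    K = sigma**2 * np.exp(-0.5 * sq / ell**2) + noise**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    # log p(y | X, ell, sigma) = -0.5 y^T K^-1 y - 0.5 log|K| - n/2 log(2 pi)
    lml = (-0.5 * y @ alpha
           - np.log(np.diag(L)).sum()
           - 0.5 * len(X) * np.log(2.0 * np.pi))
    return -lml

res = minimize(neg_log_marginal_likelihood, x0=[0.0, 0.0])
ell_hat, sigma_hat = np.exp(res.x)  # point estimates of the hyperparameters
```

The output is a single point estimate for each hyperparameter, which is exactly the part I’m asking about below.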

From the SR code snippet and your explanation, I can see that the author chose to supply a Poisson likelihood and correspondingly selected the Latent GP. I have a few thoughts/questions to this end:

  • How is the Bayesian/PyMC3 approach different from the frequentist one? (It seems that rather than optimizing the kernel hyperparameters, they’re sampled via HMC.)
  • How does the decision logic to supply your own likelihood (and use the Latent GP) work? (I haven’t come across this idea before.)

Big picture: I’m decently familiar with GPs; the Bayesian/PyMC3 connection seems to be where I’m most confused, since log marginal likelihood optimization has either been abstracted away or replaced with something else (say, sampling), and in some cases an explicit likelihood isn’t necessary at all (which is new to me).
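To check my understanding of the "optimize vs. sample" distinction: instead of a single length-scale that maximizes the log marginal likelihood, the Bayesian approach gives a whole posterior over it. Here’s a bare-bones random-walk Metropolis sketch over log(ell) using the same marginal likelihood (toy data; I know PyMC3 uses NUTS, not this — just illustrating the idea with a flat prior on log(ell)):

```python
# Sampling the length-scale of a GP regression model instead of optimizing
# it. Toy data; sigma and noise are held fixed for simplicity.
import numpy as np

rng = np.random.default_rng(2)
X = np.linspace(0.0, 10.0, 25)
y = np.sin(X) + 0.1 * rng.standard_normal(25)

def log_marginal_likelihood(ell, sigma=1.0, noise=0.1):
    sq = (X[:, None] - X[None, :]) ** 2
    K = sigma**2 * np.exp(-0.5 * sq / ell**2) + noise**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.log(np.diag(L)).sum()
            - 0.5 * len(X) * np.log(2.0 * np.pi))

log_ell = 0.0
logp = log_marginal_likelihood(np.exp(log_ell))
samples = []
for _ in range(2000):
    prop = log_ell + 0.3 * rng.standard_normal()   # random-walk proposal
    logp_prop = log_marginal_likelihood(np.exp(prop))
    if np.log(rng.uniform()) < logp_prop - logp:   # Metropolis accept step
        log_ell, logp = prop, logp_prop
    samples.append(np.exp(log_ell))

posterior_ell = np.array(samples[500:])  # drop burn-in; a distribution, not a point
```

So rather than one ell_hat, you end up with a cloud of plausible length-scales — is that the right mental model for what PyMC3 is doing under the hood?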