[PyMCon Web Series 03a] Introduction to Hilbert Space GPs (HSGPs) in PyMC (Mar 15, 2023) (Bill Engels)

Not sure, it’ll be part of the next release. As of now, there’s no other PyMC HSGP material out there – I’m working on some for the next event and for the docs. If you want to give it a try sooner, see the examples in the docstrings of HSGP.prior, HSGP.conditional, and HSGP.prior_linearized.

@ScroobiusTrip I wish I’d mentioned this in the talk. You don’t really need to worry about whether something is specifically a mean function if you’re using HSGPs. Just add the (mean zero) HSGP to the piecewise linear part of your model – this is equivalent to using the piecewise linear model as the mean function within a GP. For the rest of the GP implementations in PyMC, you need to at least wrap things in a Mean function object so that prediction via gp.conditional works correctly.

For your case though, just add them! If you use HSGP.prior_linearized you can use pm.set_data and it’ll work for both parts of the model.
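To see why the two formulations are equivalent, here’s a small NumPy sketch (the inputs, trend, and ExpQuad kernel values are made up for illustration): a draw from a GP whose mean function is the trend is constructed from exactly the same pieces as a zero-mean GP draw added to the trend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inputs, a piecewise-linear trend with a knot at x = 1, and an
# ExpQuad Gram matrix (all numbers are illustrative)
x = np.linspace(0, 2, 50)
trend = np.where(x < 1.0, 0.5 * x, 0.5 + 2.0 * (x - 1.0))
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.3**2)
Lc = np.linalg.cholesky(K + 1e-9 * np.eye(len(x)))

z = rng.standard_normal(len(x))

# (a) mean-zero GP draw, added to the trend after the fact
f_zero_mean = Lc @ z
y_additive = trend + f_zero_mean

# (b) draw from a GP whose mean function *is* the trend:
#     y = mean(x) + chol(K) @ z -- the same expression
y_mean_fn = trend + Lc @ z
```

Both constructions produce the same random function, which is the sense in which “add the mean-zero HSGP to the trend” and “use the trend as the GP mean function” coincide.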

@murphyk

  • It’s \mathcal{O}(m^2 n) to generate the basis (m basis functions, n data points). But you don’t need to know anything about the kernel hyperparameters, or even the kernel, to do it.
  • No plans now, but the implementation of the tricky parts is here.
  • I think so! f = phi @ (beta * sqrt_psd), with each \beta_i \sim N(0, 1). So you should be able to calculate the marginal likelihood by integrating out beta, just as you can with a vanilla Bayesian linear regression model. Sorry, I didn’t quite get this live.
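To make the linear-model view concrete, here’s a NumPy sketch of the 1D HSGP construction (the lengthscale, half-domain L, and number of basis functions m are illustrative choices, not PyMC defaults). It builds f = phi @ (beta * sqrt(psd)) and checks that integrating beta out analytically gives cov(f) = phi @ diag(psd) @ phi.T, which approximates the exact ExpQuad Gram matrix:

```python
import numpy as np

# Illustrative hyperparameters: ExpQuad lengthscale, half-domain, basis size
ell, L, m = 0.3, 3.0, 100
x = np.linspace(-1, 1, 40)

# Laplacian eigenvalues/eigenfunctions on [-L, L] (sine basis)
j = np.arange(1, m + 1)
sqrt_eigvals = j * np.pi / (2 * L)                              # sqrt(lambda_j)
phi = np.sqrt(1 / L) * np.sin(sqrt_eigvals * (x[:, None] + L))  # shape (n, m)

# Power spectral density of the 1D ExpQuad kernel, evaluated at sqrt(lambda_j)
psd = np.sqrt(2 * np.pi) * ell * np.exp(-0.5 * ell**2 * sqrt_eigvals**2)

# One function draw: f = phi @ (beta * sqrt(psd)), beta_i ~ N(0, 1)
rng = np.random.default_rng(1)
beta = rng.standard_normal(m)
f = phi @ (beta * np.sqrt(psd))

# Marginalizing beta gives a Gaussian with covariance phi diag(psd) phi^T,
# which should closely match the exact kernel Gram matrix
K_approx = (phi * psd) @ phi.T
K_exact = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell**2)
```

Because f is linear in beta with a known Gaussian prior, the marginal likelihood is available in closed form exactly as in Bayesian linear regression, just with phi * sqrt(psd) as the design matrix.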

@michaelosthege
Does a function like this do the trick?

import pymc as pm
import pytensor.tensor as pt

def one_gp_function(X_new, beta_posterior_sample, cov_func, L, m, X_mean):
    # Center the new inputs the same way the training inputs were centered
    Xs = X_new - X_mean
    L = pt.as_tensor_variable(L)

    # Rebuild the HSGP basis at the new input locations
    eigvals = pm.gp.hsgp_approx.calc_eigenvalues(L, m, tl=pt)
    phi = pm.gp.hsgp_approx.calc_eigenvectors(Xs, L, eigvals, m, tl=pt)

    # Spectral density of the kernel at the eigenfunction frequencies.
    # Note: .eval() assumes the kernel hyperparameters are fixed numbers
    # (e.g. plugged in from a posterior draw), not free random variables.
    omega = pt.sqrt(eigvals)
    psd = cov_func.power_spectral_density(omega).eval()

    # One function draw: weight the basis by this beta draw
    f = phi @ (beta_posterior_sample * pt.sqrt(psd))
    return f

I’m sure this could be cleaner, but one “function” from the GP posterior corresponds to one draw from the posterior of beta. Then you can regenerate the phi basis at arbitrary X, and that’s your function.
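Here’s a NumPy analogue of that function for the 1D case (the eigenvalue and eigenfunction formulas are the standard sine-basis ones; the lengthscale, L, m, and centering value are made-up numbers). The point is that a single beta draw defines one deterministic function you can evaluate at any inputs by rebuilding phi:

```python
import numpy as np

def one_gp_function_np(X_new, beta, ell, L, m, X_mean):
    """Evaluate one posterior function draw at arbitrary 1D inputs."""
    Xs = X_new - X_mean                        # center as during fitting
    j = np.arange(1, m + 1)
    sqrt_eigvals = j * np.pi / (2 * L)
    phi = np.sqrt(1 / L) * np.sin(sqrt_eigvals * (Xs[:, None] + L))
    # ExpQuad spectral density stands in for cov_func.power_spectral_density
    psd = np.sqrt(2 * np.pi) * ell * np.exp(-0.5 * ell**2 * sqrt_eigvals**2)
    return phi @ (beta * np.sqrt(psd))

rng = np.random.default_rng(2)
beta = rng.standard_normal(200)                # one "posterior" draw of beta

# Evaluating the same beta draw on a coarse and a fine grid gives the
# same underlying function -- the fine grid just resolves it better
x_train = np.linspace(0, 1, 10)
x_dense = np.linspace(0, 1, 91)                # contains the coarse points
f_train = one_gp_function_np(x_train, beta, ell=0.2, L=2.0, m=200, X_mean=0.5)
f_dense = one_gp_function_np(x_dense, beta, ell=0.2, L=2.0, m=200, X_mean=0.5)
```

Looping this over posterior draws of beta (and of the kernel hyperparameters, if they were sampled) gives you a set of posterior functions evaluated wherever you like.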
