Thanks for the responses during the Q&A, @bwengals and @fonnesbeck! Using your advice, I got PyMC’s GP functionality to work with custom distance matrices. It was as simple as subclassing a covariance kernel and having it return a precomputed covariance matrix instead of computing the distances itself. It feels “hacky”, but it works when the distance function is difficult to express as a kernel. Here’s the notebook with examples, in case you’re interested: stat_rethinking_2023/GPs-as-prior-for-parameter (islands & tools).ipynb at main · kamicollo/stat_rethinking_2023 · GitHub
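In case it’s useful, here is roughly what that subclass looks like. This is a minimal sketch (assuming PyMC v5 with pytensor; the class name `PrecomputedCov` and the input handling are illustrative), so see the notebook for the exact implementation:

```python
import pymc as pm
import pytensor.tensor as pt

class PrecomputedCov(pm.gp.cov.Covariance):
    """A 'kernel' that ignores the inputs and returns a precomputed
    covariance matrix (e.g. one built from a custom distance matrix)."""

    def __init__(self, K):
        super().__init__(input_dim=1)
        self.K = pt.as_tensor_variable(K)

    def full(self, X, Xs=None):
        # X must have as many rows as K; its entries are ignored
        return self.K

    def diag(self, X):
        return pt.diag(self.K)
```

Since the GP classes only interact with the kernel through `full` and `diag`, returning a fixed matrix is enough for `pm.gp.Latent` to use it.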
While trying to fully replicate the Statistical Rethinking examples, I came across one other issue that I was not able to resolve: how would I go about capturing the covariance matrix computed by the kernel in the trace? I thought simply doing this would work, but it does not; the trace contains an identity matrix instead.
```python
with pm.Model():
    ...
    cov_func = pm.gp.cov.Exponential(input_dim=1, ls=ls)
    gp = pm.gp.Latent(cov_func=cov_func)
    gp.prior("f", X=myVariable)
    pm.Deterministic("K", cov_func(myVariable))
    ...
```
How would I go about making this example work? I know I could simply replicate the formula of the Exponential kernel myself (roughly as in the sketch below), but I wondered whether there’s a better way, especially for situations where the kernel is more complex, e.g. additive.
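For reference, here’s the kind of manual replication I mean. A sketch only: it assumes 1-D inputs and that this PyMC version’s Exponential kernel is k(x, x') = exp(-|x - x'| / (2 * ls)), which is worth double-checking against the `pm.gp.cov.Exponential` docstring:

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt

X = np.linspace(0, 1, 10)[:, None]  # toy 1-D inputs
d = np.abs(X - X.T)                 # pairwise distance matrix

with pm.Model():
    ls = pm.Exponential("ls", 1.0)
    # Hand-written Exponential kernel, assuming k = exp(-d / (2 * ls))
    K = pm.Deterministic("K", pt.exp(-0.5 * d / ls))
```

This is fine for a single simple kernel, but re-deriving the formula by hand for additive or product kernels is exactly what I’d like to avoid.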
Thank you!