I am trying to perform inference on the following model to obtain posterior distributions over the parameters of a black-box function and the length-scale of a Gaussian process. The function eta(theta, x) can be an arbitrary black-box function, where theta are the parameters to be inferred and x are the input variables.
I have considered two different ways of doing this:

1. Using the marginal Gaussian process implementation and creating a custom mean function.
2. Not using the Gaussian process library at all: specifying the appropriate covariance matrix directly, then defining a Gaussian likelihood with MvNormal.
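Concretely, the model I have in mind (in my own notation, with ell the GP length-scale) is:

y | theta, ell ~ MvNormal( eta(theta, X), K_ell(X, X) )

where K_ell is a GP covariance matrix, e.g. from an ExpQuad kernel with length-scale ell, and eta(theta, X) enters as the mean.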
I tried the second option and found constructing the covariance matrix quite challenging, so now I am reading up on custom mean functions to try option 1. How would you recommend I tackle this problem?
Thank you so much
@bwengals , after some reading on this forum, I can see you are extremely knowledgeable about Gaussian Processes, so I would be super grateful for your input
Hi, in the meantime, I would suggest you try this notebook, which shows how to fit a GP on a simple dataset. Then you can try it on your own dataset and experiment with different mean and covariance functions.
In addition, the PyMC example notebooks on GPs could be useful. For example, this notebook shows how to define custom mean and covariance functions in PyMC.
Thanks for the resources! I think I was able to define a custom mean function and infer theta; I will post my code here soon. However, I would still like to figure out how to solve the problem without the Gaussian process library (option 2) and compare the results to option 1. Since the two approaches should give the same answer, it would be nice to confirm this.
Sorry I’m late, yup, totally agree with @DanhPhan’s suggestions. You can definitely use the GP covariance functions without any of the other machinery in the GP library, like this:
```python
with pm.Model():
    cov = pm.gp.cov.ExpQuad(1, ls=lengthscale)
    Sigma = cov(X)  # full covariance matrix evaluated at the inputs X
    likelihood = pm.MvNormal("likelihood", mu=mu, cov=Sigma, observed=y)
```
Is this what you meant?
Yes, this is what I meant. Thanks!