PyMC for Bayesian Optimization

Hi Guys,

Has anyone used PyMC for Bayesian Optimization? If so, are there any documents or tutorials I could look at?

Thanks in advance!

You can check out this blog post from @twiecki.

Thanks @cluhmann.

Just to clarify though, Bayesian Optimization usually refers to trying to optimize an objective function using GPs, which this blog post does not address. I don’t think we have any materials on that, but it should be pretty straightforward to do with our GP capabilities.
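
For example, a minimal loop could look roughly like the sketch below: fit a `pm.gp.Marginal` surrogate, compute Expected Improvement on a grid of candidates, evaluate the best candidate, and repeat. The toy objective, kernel, priors, and candidate grid are all just illustrative.

```python
# A minimal Bayesian-optimization loop with PyMC's GP module.
# The objective, kernel, priors, and candidate grid are all illustrative.
import numpy as np
import pymc as pm
from scipy.stats import norm

def objective(x):
    # toy black-box function we pretend is expensive to evaluate
    return np.sin(3 * x) + 0.1 * np.random.randn(*x.shape)

X_obs = np.random.uniform(0, 2, size=(5, 1))   # initial design
y_obs = objective(X_obs).ravel()
X_cand = np.linspace(0, 2, 200)[:, None]       # candidate grid

for _ in range(10):                            # BO iterations
    with pm.Model():
        ell = pm.Gamma("ell", alpha=2, beta=1)
        eta = pm.HalfNormal("eta", sigma=1)
        cov = eta**2 * pm.gp.cov.ExpQuad(1, ls=ell)
        gp = pm.gp.Marginal(cov_func=cov)
        noise = pm.HalfNormal("noise", sigma=0.5)
        gp.marginal_likelihood("y", X=X_obs, y=y_obs, sigma=noise)
        mp = pm.find_MAP()                     # point estimate of the hyperparameters
        mu, var = gp.predict(X_cand, point=mp, diag=True)

    # Expected Improvement (for maximization) over the candidate grid
    best = y_obs.max()
    sd = np.sqrt(var)
    z = (mu - best) / sd
    ei = (mu - best) * norm.cdf(z) + sd * norm.pdf(z)

    x_next = X_cand[np.argmax(ei)][None, :]    # evaluate the most promising candidate
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.append(y_obs, objective(x_next).ravel())
```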

Thank you @twiecki for the information. What I am actually trying to do is use a Bayesian NN instead of a GP to perform Bayesian optimization, because the nature of my data is not compatible with a GP. So I am wondering if you know of any sources on that? Or have you or others tried using PyMC with Bayesian optimization platforms such as Ax or BoTorch? Thank you!

There are a few papers that do that, but I don’t know of any software that does it. If you’re using an NN instead of a GP, however, I’m not sure you need a Bayesian NN, in which case you wouldn’t get much benefit from using PyMC.

I’m looking to do something similar, using neither GPs nor Bayesian NNs, and am struggling to find an elegant solution. I have a hierarchical forward model of controlled_variables → latent_variables_modulated_by_observed_covariates → observed_outcome in PyMC, trained using HMC, and I want to optimize controlled_variables using some acquisition function, say Expected Improvement. PyMC/Aesara should have everything I need for that under the hood, but I wonder whether I am missing a simple trick to make the best use of it given a defined model and a trace of MCMC samples.
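
For concreteness, the pattern I have in mind looks roughly like the sketch below. The forward model is a toy stand-in for my actual hierarchical model, and the "acquisition" is just the posterior-mean predicted outcome rather than a proper Expected Improvement.

```python
# Toy version of the pattern: fit a forward model with HMC, pull out the
# posterior draws, and optimize the controlled variable under bounds.
# The model and the "acquisition" (posterior-mean outcome) are stand-ins.
import numpy as np
import pymc as pm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x_obs = rng.uniform(-2, 2, 30)                     # observed settings of the controlled variable
y_obs = -(x_obs - 0.7) ** 2 + rng.normal(0, 0.1, 30)

with pm.Model():
    peak = pm.Normal("peak", 0, 2)                 # latent parameters of the forward model
    scale = pm.HalfNormal("scale", 2)
    sigma = pm.HalfNormal("sigma", 1)
    pm.Normal("y", mu=-scale * (x_obs - peak) ** 2, sigma=sigma, observed=y_obs)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

peak_s = idata.posterior["peak"].values.ravel()    # flattened posterior draws
scale_s = idata.posterior["scale"].values.ravel()

def neg_expected_outcome(x):
    # posterior-mean predicted outcome at controlled variable x (negated for minimize)
    return -np.mean(-scale_s * (x - peak_s) ** 2)

# hard constraint: the controlled variable must stay in [-1, 1]
res = minimize(neg_expected_outcome, x0=np.array([0.0]), bounds=[(-1.0, 1.0)])
print("optimal controlled variable:", res.x)
```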

Hey!
I have a problem which sounds very similar, @ssteinuk. Did you come up with something?

Thank you!
Best
N

Could you expand on the problem? The descriptions in this post are a bit too vague to provide actual advice.

Hi @ricardoV94 , thanks for replying!

First, I’m not interested in using BO to optimize the parameters of a Neural Network (which appears to be a common use case), but in something similar in spirit to the original blog post by Thomas (and, I think, related to what Sebastian had in mind).

I want to optimize the parameters of a black-box model (preferably a Bayesian NN) to produce a given output under some (hard) constraints. So let’s say I gathered some data from a machine that produces some output and has multiple adjustable parameters (think temperature, pressure, etc.). The inner workings of this machine are a black box, but we can reasonably model output = f(input) with, e.g., a (Bayesian) Neural Network. So if I change the temperature input, I get a good approximation of how the output of the machine would react, plus the uncertainty in my predictions.
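
To make that concrete, here is roughly the kind of surrogate I mean: a tiny one-hidden-layer Bayesian NN on made-up machine data. The layer width, priors, and variable names below are all just placeholders.

```python
# A tiny one-hidden-layer Bayesian NN surrogate for output = f(inputs).
# The data, layer width, and priors are all placeholders.
import numpy as np
import pymc as pm
import pymc.math as pmm

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(100, 3))   # e.g. temperature, pressure, flow, scaled to [0, 1]
y = np.sin(X @ np.array([3.0, 1.0, -2.0])) + rng.normal(0, 0.05, 100)

n_hidden = 8
with pm.Model() as bnn:
    w1 = pm.Normal("w1", 0, 1, shape=(X.shape[1], n_hidden))
    b1 = pm.Normal("b1", 0, 1, shape=n_hidden)
    w2 = pm.Normal("w2", 0, 1, shape=n_hidden)
    b2 = pm.Normal("b2", 0, 1)
    hidden = pmm.tanh(pmm.dot(X, w1) + b1)      # (n_obs, n_hidden)
    mu = pmm.dot(hidden, w2) + b2               # predicted machine output
    sigma = pm.HalfNormal("sigma", 0.1)
    pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.95, progressbar=False)
```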

What I would like to do is somehow use the BNN as part of an optimization loop: let’s say I can only change a selection of the parameters, each within a given range (so there are hard constraints). What would be the optimal combination of changes to the parameters I can adjust to produce a given output, and what is the uncertainty of that output? As the BNN is differentiable, this should be possible, but I haven’t seen any examples that go in a similar direction (beyond the blog post), and I’d be glad for any pointers!
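
The optimization step I am imagining, continuing from the BNN snippet above, would be something like the sketch below. The target output, the bounds, and which inputs are adjustable are made up, and I just use the posterior draws numerically here rather than exploiting the gradients.

```python
# Continuing from the BNN above: choose the adjustable inputs so the predicted
# output hits a target, under box constraints, using the posterior draws directly.
import numpy as np
from scipy.optimize import minimize

post = idata.posterior
w1_s = post["w1"].values.reshape(-1, *post["w1"].shape[2:])     # (n_draws, 3, n_hidden)
b1_s = post["b1"].values.reshape(-1, post["b1"].shape[-1])
w2_s = post["w2"].values.reshape(-1, post["w2"].shape[-1])
b2_s = post["b2"].values.reshape(-1)

def predict(x):
    """Posterior predictive mean and sd of the output at input vector x."""
    hidden = np.tanh(np.einsum("i,dij->dj", x, w1_s) + b1_s)
    out = np.einsum("dj,dj->d", hidden, w2_s) + b2_s
    return out.mean(), out.std()

target = 0.8                       # desired machine output (made up)
fixed = np.array([0.5])            # input(s) we are not allowed to change

def loss(free):
    mean, sd = predict(np.concatenate([free, fixed]))
    # squared distance to the target, mildly penalizing predictive uncertainty
    return (mean - target) ** 2 + 0.1 * sd

# hard constraints: the two adjustable inputs must stay in [0, 1]
res = minimize(loss, x0=np.array([0.5, 0.5]), bounds=[(0.0, 1.0), (0.0, 1.0)])
print("suggested settings:", res.x)
print("predicted output (mean, sd):", predict(np.concatenate([res.x, fixed])))
```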

Thank you!
Best
N

Here is a workflow we have been using at work:

The idea is that we extract the predictive functions from the model definition and replace the RVs with the posteriors. From this we can create arbitrary PyTensor cost functions and get their gradients for optimization as well. Does that seem useful?
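
In very rough form, the pattern looks something like this. It is not the notebook's code, just a toy one-parameter illustration that plugs in the posterior mean rather than full draws.

```python
# Rough sketch: build a symbolic prediction from the model's RVs, swap the RVs
# for posterior summaries, and compile the cost plus its gradient with PyTensor.
# The toy model, target, and use of the posterior mean are all simplifications.
import numpy as np
import pymc as pm
import pytensor
import pytensor.tensor as pt
from scipy.optimize import minimize

rng = np.random.default_rng(2)
x_obs = rng.uniform(0, 2, 50)
y_obs = 1.5 * np.sin(x_obs) + rng.normal(0, 0.1, 50)

x_new = pt.dvector("x_new")        # controlled variable we want to optimize (length 1)

with pm.Model() as model:
    amp = pm.HalfNormal("amp", 2)
    sigma = pm.HalfNormal("sigma", 1)
    pm.Normal("y", mu=amp * pt.sin(x_obs), sigma=sigma, observed=y_obs)
    idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

# symbolic prediction at x_new; it still contains the model RV `amp`
pred = amp * pt.sin(x_new[0])

# replace the RV with a posterior summary (per-draw replacement also works, with more bookkeeping)
amp_mean = idata.posterior["amp"].mean().item()
pred_fixed = pytensor.clone_replace(pred, replace={amp: pt.constant(amp_mean)})

# arbitrary PyTensor cost: squared distance of the prediction to a target output
target = 1.0
cost = (pred_fixed - target) ** 2
cost_and_grad = pytensor.function([x_new], [cost, pytensor.grad(cost, x_new)])

def fun(x):
    c, g = cost_and_grad(x)
    return float(c), np.asarray(g)

res = minimize(fun, x0=np.array([0.5]), jac=True, bounds=[(0.0, 2.0)])
print("optimal controlled variable:", res.x)
```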

Here is a worked example of BO with a noisy Expected Improvement acquisition function.

https://github.com/drsstein/pymc_bo/blob/main/pymc_bo.ipynb
