Custom likelihood from a Python function

Hello,

I have generated some data D (blue dots) from a non-linear model (red) that takes a set of model parameters M = (m1, m2, …). I want to estimate the posterior over some of these parameters, for example P(m1, m2 | D).

[Figure: noisy data D (blue dots) overlaid on the non-linear model curve (red).]

How should the likelihood function be defined when the forward model has no simple closed-form expression?

Should the following code be valid in principle? (Here “bg” refers to the external library that implements the forward model I use.)

K_data = ...  # The "K-noisy" data in the plot above

with pm.Model() as model_cem:
    phi = pm.Data('phi', phi_data, mutable=True)

    # Prior on the model parameter
    m1 = pm.Normal("Kf", mu=0, sigma=1)

    # Prior on the noise standard deviation - note HalfNormal!
    s = pm.HalfNormal("sigma", sigma=1)

    # This function calls the forward model used to generate K_data
    def testfunc(phi, m1):
        K_d_cem, MU_d_cem = bg.rockphysics.contact_cement(
            K_qz, MU_qz, phi, phi_c=phic, Cn=Cn, Kc=K_qz, Gc=MU_qz, scheme=1
        )
        return bg.rockphysics.fluidsub.vels(
            K_d_cem, MU_d_cem, K_qz, RHO_qz, x1, RHO_b, phi
        )[-1]

    # Likelihood: observed data, with mean given by the forward model
    likelihood = pm.Normal("y", mu=testfunc(phi, m1), sigma=s, observed=K_data)

    step = pm.NUTS()

    trace = pm.sample(1000, tune=500, init=None, step=step, cores=2)

Thanks in advance,
Kenneth


Hi, at a glance, I think I have had to do something similar to this. I found the “custom likelihood” tutorial in the docs very useful. I’m not sure how easy it would be to get the gradient of testfunc, so if you just want to get things running, you may need to use a Metropolis-Hastings sampling step rather than NUTS. Hope this helps.
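To illustrate that suggestion, here is a minimal sketch, assuming the model_cem context from the question above is in scope: pm.Metropolis does not need gradients of the forward model, so it can sample even when mu comes from a black-box Python function.

import pymc as pm  # assumes model_cem from the question is already defined

with model_cem:
    # Metropolis-Hastings needs no gradient information from testfunc
    step = pm.Metropolis()
    trace = pm.sample(2000, tune=1000, step=step, cores=2)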

Here is the custom PyTensor Op documentation that @BrynWalton mentioned.

It might be that you don’t need to do this at all. It depends entirely on what is happening inside contact_cement and fluidsub.vels. If they only do vectorized numpy-style operations (that is, no control flow, no loops, no exotic linear algebra), you can use the function out of the box by just passing symbolic tensors as inputs. If there is some optional control flow (perhaps related to the scheme argument?), you could rip out only the bits you need and re-implement them in native PyTensor. Making an Op should be your last resort.
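If an Op does turn out to be necessary, here is a minimal sketch of the wrapper pattern from that documentation. It assumes testfunc accepts and returns NumPy arrays with the same shape and dtype as phi; no gradient is defined, so it pairs with the Metropolis step suggested above.

import numpy as np
import pytensor.tensor as pt
from pytensor.graph import Apply, Op

class ForwardModelOp(Op):
    """Wrap a black-box Python function so it accepts symbolic tensors."""

    def make_node(self, phi, m1):
        phi = pt.as_tensor_variable(phi)
        m1 = pt.as_tensor_variable(m1)
        # Assumption: the output has the same type (shape/dtype) as phi
        return Apply(self, [phi, m1], [phi.type()])

    def perform(self, node, inputs, output_storage):
        phi_val, m1_val = inputs
        # Evaluate the ordinary Python function on concrete NumPy values
        output_storage[0][0] = np.asarray(testfunc(phi_val, m1_val))

In the model above, mu=testfunc(phi, m1) would then become mu=ForwardModelOp()(phi, m1).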


Hello!
I have a similar problem: I would like to use the result of an FMU (Functional Mock-up Unit) calculation as the mean of my likelihood function. I have already wrapped the FMU in a Python function, but it doesn’t accept PyTensor tensors.
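For what it’s worth, once the FMU call is wrapped in a custom Op like the one sketched above, using it as the likelihood mean would look roughly like this (fmu_op, theta, and y_obs are hypothetical placeholders, not names from any FMU library):

import pymc as pm

with pm.Model():
    theta = pm.Normal("theta", mu=0, sigma=1)
    sigma = pm.HalfNormal("sigma", sigma=1)
    # fmu_op is a custom Op wrapping the (gradient-free) FMU simulation
    mu = fmu_op(theta)
    pm.Normal("y", mu=mu, sigma=sigma, observed=y_obs)
    idata = pm.sample(step=pm.Metropolis())  # no gradients available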

I think this is similar to this discussion:

I tried to find the tutorials listed here:
“Specifically, the tutorial on using a blackbox likelihood function, the tutorial on custom distributions, the tutorial on wrapping jax functions into PyTensor Ops.”
from Likelihood Approximations with Neural Networks in PyMC - PyMC Labs

The one you mention, @jessegrabowski, is among them, but all the links seem to be broken.

Do you have any idea where I should look?

Thank you!

Please find the correct links here:


Thank you!

Using a “black box” likelihood function is particularly useful.
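As a minimal sketch of the pattern that tutorial covers: a black-box log-likelihood can be attached to the model with pm.Potential. Here loglike_op is a hypothetical custom Op (built like the one above) that returns the summed log-likelihood of the data given the parameters:

import pymc as pm

with pm.Model():
    m = pm.Normal("m", mu=0, sigma=1)
    c = pm.Normal("c", mu=0, sigma=1)
    # Add the black-box log-likelihood term to the model's joint logp
    pm.Potential("loglike", loglike_op(m, c))
    idata = pm.sample(step=pm.Metropolis())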