Is there a way to call NUTS directly to sample a customized likelihood together with its gradient?

I have a log-prob function and its gradient. Because an FFT is involved in both the log-prob function and its gradient, autodiff is not possible, so I compute the derivative by hand.

Is there a (documented) way to call the NUTS sampler directly?

I did notice there is a way to create a custom distribution, but the documentation does not mention how to supply the gradient.

Thanks.

The recommended approach is to create a custom Theano Op with its gradient. You can see an example here: https://docs.pymc.io/Advanced_usage_of_Theano_in_PyMC3.html#writing-custom-theano-ops and a more complex example here: https://docs.pymc.io/notebooks/blackbox_external_likelihood.html
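Before wrapping anything in a Theano Op, it helps to have the black-box log-likelihood and its hand-derived gradient as plain functions, with the gradient sanity-checked against a finite difference. The sketch below is a hypothetical, simplified stand-in for the poster's problem (all names and the FFT-convolution model are assumptions, not from the thread): the model template is built with an FFT, so only the dependence on the amplitude parameter is differentiated by hand. These two functions are what the linked notebook wraps in a custom Op's `perform` and `grad` methods.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64

# Hypothetical setup: a signal smoothed by an FFT-based convolution.
# The FFT step is the part autodiff cannot see through.
signal = rng.normal(size=n)
kernel = np.exp(-0.5 * (np.arange(n) - n / 2) ** 2 / 4.0)
kernel /= kernel.sum()
template = np.fft.irfft(
    np.fft.rfft(signal) * np.fft.rfft(np.fft.ifftshift(kernel)), n=n
)

sigma = 0.5
true_theta = 2.0
data = true_theta * template + rng.normal(scale=sigma, size=n)

def loglike(theta):
    """Gaussian log-likelihood of `data` given amplitude theta."""
    resid = data - theta * template
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

def grad_loglike(theta):
    """Hand-derived gradient of loglike with respect to theta."""
    resid = data - theta * template
    return np.sum(resid * template) / sigma ** 2

# Always verify the analytic gradient against a finite difference
# before handing it to NUTS.
eps = 1e-6
fd = (loglike(1.3 + eps) - loglike(1.3 - eps)) / (2 * eps)
assert np.isclose(grad_loglike(1.3), fd, rtol=1e-5)
```

Once both functions agree with the finite-difference check, wrapping them in a Theano Op (returning `grad_loglike` from the Op's `grad` method) lets NUTS use the hand-coded gradient exactly as in the black-box likelihood notebook.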


I am not sure, but perhaps this is what you are after: Isolate NUTS into a new library
