Is there a way to call NUTS directly to sample a customized likelihood together with its gradient?

I have a log-prob function and its gradient. Because an FFT is involved in the log-prob function and its gradient, it is not possible to perform autodiff, so I do the differentiation by hand.

Is there a (documented) way to call the NUTS sampler directly?

I did notice there is a way to create a customized distribution, which however does not mention how to supply the gradient.


The recommended approach is to create a custom Theano Op with its gradient. You can see an example here: and a more complex example here:


I am not sure, but perhaps this is what you are after: Isolate NUTS into a new library
