Auto-diff when using NUTS sampler with blackbox likelihoods

I want to use a black-box likelihood function with the NUTS sampler. Is there a way to use libraries like Autograd for automatic differentiation to obtain the gradients that the NUTS sampler requires?

In summary, I want to know whether there is a feasible way to use automatic differentiation so that the NUTS sampler can work with black-box likelihoods.
TensorFlow Probability implements such an automatic differentiation mechanism for black-box likelihoods.

All of PyMC3 uses automatic differentiation. :slight_smile:
We use Theano, so you can write arbitrary Theano code in your model. If you wanted to use Autograd, you would have to link Theano and Autograd together. That is quite possible, but it takes some work and requires knowledge of both Theano and Autograd.