I know how to use an external model with pymc3, and I also know the derivatives of its output with respect to my model's parameters.

I want to use the NUTS sampler, not Metropolis, but I do not know how to let NUTS know that the derivatives are also available externally.

Any suggestions?

The general route is to wrap the external model in theano and implement a `grad` method for it as well.

For example, this post shows you how to call an external model (from scipy) to do numerical integration:

And here are two more examples from @aseyboldt:

@aseyboldt is currently writing a blog post on the latter one, but I think you should have enough information to give it a go.

@junpenglao I apologize for the late reply, and I appreciate your help. Let me explain in more detail:

This is how I define my external model:

```
#%% defining model
import numpy as np
import theano as th
import theano.tensor as t
from subprocess import call

@th.compile.ops.as_op(itypes=[t.dscalar], otypes=[t.dscalar, t.dscalar])
def dismodel(E):
    # write the input file for my external program
    with open("input.xxx", "w") as f:
        f.write("set ee " + str(E) + ";\n")
    # run my external program via python
    call("MyexternalProgram final.xxx", shell=True)
    # first output is the model value for E, second is its gradient w.r.t. E
    return np.loadtxt("enddisp.out"), np.loadtxt("ddmdisp.out")
```

Now I do not know how to tell NUTS that this is the gradient it should use in its algorithm. I can run the same model with Metropolis (which needs no gradient) without any problem.

You cannot use `@th.compile.ops.as_op` for this, as it is just a small wrapper. In this case you have to create a new `theano.Op` with a gradient method inside. Have a look at the above post, and also the theano doc: http://deeplearning.net/software/theano/extending/extending_theano.html

Thank you. I am working on that and will update here.