How to differentiate (auto-diff) the log likelihood in PyMC3

I am wondering how to take the derivative of the likelihood function defined in PyMC3 with respect to its parameters. For example, consider the following:

import pymc3 as pm

# data is a 1-D array of 0/1 observations (not shown here)
with pm.Model() as our_first_model:
    θ = pm.Beta('θ', alpha=1., beta=1.)
    y = pm.Bernoulli('y', p=θ, observed=data)
    trace = pm.sample(1000, random_seed=123)

The y = pm.Bernoulli(...) line defines the likelihood, and I would like to take the derivative of its logarithm (the log-likelihood) with respect to the parameters. Does anyone know of an example of how this can be done? Or of how to take the derivative of θ with respect to alpha?
I could not find any example of this on the internet or in the documentation.
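
For a Bernoulli likelihood the derivative I am after has a simple closed form, so I can at least write down by hand what I would expect any auto-diff result to match (dlogp_bernoulli_wrt_theta is just a helper name I made up for this check):

import numpy as np

def dlogp_bernoulli_wrt_theta(theta, data):
    # d/dθ of sum_i [ y_i*log(θ) + (1 - y_i)*log(1 - θ) ]
    data = np.asarray(data)
    return data.sum() / theta - (1 - data).sum() / (1 - theta)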

Something like (writing from memory):

import theano.tensor as tt

logp_y = y.logpt                             # symbolic log-likelihood tensor of y
dlogp_y_wrt_theta = tt.grad(logp_y, wrt=θ)   # symbolic gradient with respect to θ
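
In case it is useful to anyone answering, here is a fuller sketch of what I imagine the whole thing could look like, with made-up data and with transform=None so that θ itself stays a free variable instead of being replaced by its log-odds transform (I am not sure this is the right way to deal with PyMC3's automatic transform):

import numpy as np
import pymc3 as pm
import theano
import theano.tensor as tt

# made-up 0/1 data, only for illustration
data = np.array([0, 1, 1, 0, 1])

with pm.Model() as m:
    # transform=None keeps θ itself as the free variable,
    # so gradients and compiled functions can use θ directly
    θ = pm.Beta('θ', alpha=1., beta=1., transform=None)
    y = pm.Bernoulli('y', p=θ, observed=data)

logp_y = y.logpt                     # symbolic log-likelihood of the observed y
dlogp_dθ = tt.grad(logp_y, wrt=θ)    # symbolic gradient with respect to θ

f = theano.function([θ], dlogp_dθ)   # compile the gradient to a callable
print(f(0.5))                        # should equal sum(data)/0.5 - sum(1 - data)/0.5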