I have a reference notebook for this question.
As you can see, I have a hierarchical model in which a Dirichlet process is the prior on the mixture weights of a Gaussian mixture model.
When I calculate the gradient of logp_elemwiset with respect to beta, the result is independent of the value of alpha.
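Roughly, the setup looks like this. This is only a simplified sketch of the notebook: the priors, truncation level, and data below are placeholders, and the variable names are my own.

```python
import numpy as np
import pymc3 as pm
import theano.tensor as tt

K = 20                       # truncation level for the stick-breaking approximation
data = np.random.randn(200)  # stand-in for the real data

def stick_breaking(beta):
    # Turn stick-breaking fractions into mixture weights.
    portion_remaining = tt.concatenate([[1.], tt.extra_ops.cumprod(1. - beta)[:-1]])
    return beta * portion_remaining

with pm.Model() as model:
    alpha = pm.Gamma('alpha', 1., 1.)                # DP concentration parameter
    beta = pm.Beta('beta', 1., alpha, shape=K)       # stick-breaking fractions
    w = pm.Deterministic('w', stick_breaking(beta))  # mixture weights
    mu = pm.Normal('mu', 0., 10., shape=K)           # component means
    obs = pm.NormalMixture('obs', w, mu, sd=1., observed=data)

# The gradient I am looking at, more or less. PyMC3 stores beta on the logodds
# scale, so the corresponding free RV is named 'beta_logodds__'.
beta_rv = model['beta_logodds__']
dlogp = tt.grad(beta_rv.logp_elemwiset.sum(), beta_rv)
dlogp_fn = model.fn(dlogp)         # compiled as a function of a full point dict
print(dlogp_fn(model.test_point))  # this is where the result seems not to change with alpha
```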
This is a bit puzzling because, according to https://docs.pymc.io/theano.html, in the section “How PyMC3 uses Theano”, the logp value of a variable includes its prior, and the prior of beta is a function of alpha. Since alpha is a free RV in this model, I would expect beta.logp to raise a missing input error when alpha is not supplied.
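To make the missing-input point concrete, this is what I would have expected (continuing with the names from the sketch above; this reflects my understanding, not something stated in the docs):

```python
import theano

# alpha (via its transformed free RV 'alpha_log__') is still a symbolic input of
# beta's logp graph, so I would expect compiling it with only beta as an input
# to raise theano.gof.MissingInputError.
theano.function([beta_rv], beta_rv.logp_elemwiset)
```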
Even if a value for beta is supplied, the prior term logP(beta) that appears in logp(beta) is still a function of alpha.
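For reference, if the stick-breaking fractions follow the standard construction beta_k ~ Beta(1, alpha) (which is what I assume from the DP prior; the exact parameterization in the notebook may differ), then

log p(beta_k | alpha) = log(alpha) + (alpha - 1) * log(1 - beta_k),

and both this log-density and its derivative with respect to beta_k, -(alpha - 1) / (1 - beta_k), clearly depend on alpha.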
Can someone explain what value of alpha is being assumed under the hood?