Just to follow up: I've got this working nicely, and have been able to stick with pytensor to implement the integral using Gauss quadrature, which is computationally efficient and sufficiently accurate. For now the script uses a single Gauss point per element of length dc (i.e. equivalent to the midpoint rule), but I will now extend this to higher-order quadrature.
The advantage of this approach over using scipy is that logp is still defined entirely in terms of operations whose derivatives can be evaluated, so efficient gradient-based MCMC methods such as NUTS can be used.
A code example is in the attached Python script. The key ideas are:
- The PDFs of the X ~ Normal and C ~ Beta distributions are defined using pytensor functions, along with the PDF of Y = C·X, implementing the above integral.
- The log-likelihood then follows directly from its definition, i.e. sum(log(PDF)) (a minimal sketch follows this list).
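For illustration, here is a minimal sketch of the idea (not the attached script itself; the parameterisation and the quadrature resolution `n_elem` are assumptions for the example):

```python
import numpy as np
import pytensor.tensor as pt

def normal_pdf(x, mu, sigma):
    # Normal density, written with pytensor ops so gradients are available
    return pt.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * pt.sqrt(2 * np.pi))

def beta_pdf(c, a, b):
    # Beta density on (0, 1); gammaln keeps the normalising constant stable
    log_norm = pt.gammaln(a + b) - pt.gammaln(a) - pt.gammaln(b)
    return pt.exp(log_norm + (a - 1) * pt.log(c) + (b - 1) * pt.log(1 - c))

# One Gauss point per element of length dc (midpoint rule); increase n_elem
# or the quadrature order for more accuracy
n_elem = 200
dc = 1.0 / n_elem
c_mid = np.linspace(dc / 2, 1 - dc / 2, n_elem)

def y_pdf(y, mu, sigma, a, b):
    # f_Y(y) = int_0^1 f_C(c) * f_X(y / c) / c dc, for Y = C * X
    c = c_mid[None, :]                     # quadrature nodes, shape (1, n_elem)
    y_col = pt.as_tensor_variable(y)[:, None]
    integrand = beta_pdf(c, a, b) * normal_pdf(y_col / c, mu, sigma) / c
    return pt.sum(integrand * dc, axis=1)  # quadrature sum over c

def loglik(y, mu, sigma, a, b):
    # Log-likelihood per its definition: sum of log-densities
    return pt.sum(pt.log(y_pdf(y, mu, sigma, a, b)))
```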
It can be seen that this performs well on simulated data. Now to move on to conducting inference on the real dataset…
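As a usage sketch (again with assumed priors and illustrative "true" parameter values, not the ones in the attached script), the log-likelihood can be registered with pm.Potential and sampled with the default NUTS sampler:

```python
import numpy as np
import pymc as pm

# Simulate Y = C * X data; the "true" parameter values here are illustrative
rng = np.random.default_rng(42)
y_obs = rng.beta(2.0, 5.0, size=500) * rng.normal(10.0, 2.0, size=500)

with pm.Model():
    mu = pm.Normal("mu", 0.0, 20.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    a = pm.HalfNormal("a", 5.0)
    b = pm.HalfNormal("b", 5.0)
    # loglik() from the sketch above is pure pytensor, so its gradient is
    # available and pm.sample() can use NUTS directly
    pm.Potential("loglik", loglik(y_obs, mu, sigma, a, b))
    idata = pm.sample()
```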
reduction_factor_test.py (15.1 KB)
Thanks again for this input, much appreciated. Hope this is useful to others wishing to do something similar.