Implementing an ExGaussian with a complementary error function

Hey fellow PyMC3'ers :wave:

I have been working on implementing an Exponentially modified Gaussian distribution ([ExGaussian](https://en.wikipedia.org/wiki/Exponentially_modified_Gaussian_distribution)) as part of a model I am building. For the ExGaussian part of the model I need the complementary error function `erfc`, which Theano seems to have implemented:

```python
# ExG(x; m, sigma, lamb)
(lamb / 2.0) * tt.exp(lamb * (m + (lamb * sigma**2) / 2.0 - x)) \
    * tt.erfc((m + lamb * sigma**2 - x) / (tt.sqrt(2.0) * sigma))
```
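As a sanity check on the formula itself (independent of the Theano gradient question), here is the same expression transcribed into NumPy and compared against `scipy.stats.exponnorm`, which is the same distribution under the parametrization `K = 1 / (sigma * lamb)`, `loc = m`, `scale = sigma` — a quick sketch, not part of my model:

```python
import numpy as np
from scipy.special import erfc
from scipy.stats import exponnorm

def exg_pdf(x, m, sigma, lamb):
    """ExGaussian density, written exactly like the Theano expression above."""
    return (lamb / 2.0) * np.exp(lamb * (m + (lamb * sigma**2) / 2.0 - x)) \
        * erfc((m + lamb * sigma**2 - x) / (np.sqrt(2.0) * sigma))

x = np.linspace(-2.0, 8.0, 101)
m, sigma, lamb = 0.5, 1.2, 0.8

# scipy's exponnorm shape parameter is K = 1 / (sigma * lamb)
ref = exponnorm.pdf(x, 1.0 / (sigma * lamb), loc=m, scale=sigma)
print(np.allclose(exg_pdf(x, m, sigma, lamb), ref))  # True
```

So the density itself matches the reference implementation; the trouble only shows up once it goes through the sampler.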

But when running the full model I observe some strange behavior. Initialising with ADVI returns NaN during optimization, even though `model.check_test_point()` shows nothing odd like NaN or inf. Sampling with `init='auto'` or `init='adapt_diag'` runs, but every single sample after tuning is divergent; surprisingly, the test case I created still recovers the model parameters fairly well.

When searching for models that have used `erfc` (and `erf`), it seems there may be a problem with the gradient calculation?
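One thing I noticed while digging: even before gradients, the product `exp(big) * erfc(big)` in the direct parametrization can hit `inf * 0 = nan` when `lamb` (or the distance from the mean) gets large during sampling. The usual fix is to move to log space with the scaled complementary error function `erfcx(z) = exp(z**2) * erfc(z)`, which Theano also appears to expose as `tt.erfcx`; I believe PyMC3's own ExGaussian does something similar internally. A NumPy/SciPy sketch of the idea (not my actual model code):

```python
import numpy as np
from scipy.special import erfc, erfcx  # erfcx(z) = exp(z**2) * erfc(z)

def exg_logpdf_naive(x, m, sigma, lamb):
    # direct transcription: exp(...) overflows and erfc(...) underflows,
    # giving inf * 0 = nan for large lamb
    return np.log((lamb / 2.0)
                  * np.exp(lamb * (m + (lamb * sigma**2) / 2.0 - x))
                  * erfc((m + lamb * sigma**2 - x) / (np.sqrt(2.0) * sigma)))

def exg_logpdf_stable(x, m, sigma, lamb):
    # log erfc(z) = log erfcx(z) - z**2, so the huge exponent cancels
    z = (m + lamb * sigma**2 - x) / (np.sqrt(2.0) * sigma)
    return np.log(lamb / 2.0) + lamb * (m + (lamb * sigma**2) / 2.0 - x) \
        - z**2 + np.log(erfcx(z))

with np.errstate(over='ignore', invalid='ignore'):
    print(exg_logpdf_naive(0.0, 0.0, 1.0, 50.0))   # nan
print(exg_logpdf_stable(0.0, 0.0, 1.0, 50.0))      # finite
```

If the gradient issue is really just this overflow/underflow, rewriting the logp with `tt.erfcx` might be enough to kill the divergences.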

I know PyMC3 has an ExGaussian implemented, but as far as I know the continuous distributions are meant for RVs and are not intended to be used as functions evaluated over a linear array of x points.

Does anyone have more info on whether or not `erf` and `erfc` work in Theano/PyMC3? I would be happy to share my full model if it helps; I just think I have narrowed the problem down to the error function, since I ran a similar model with a lognormal before and did not see these divergences.

Thanks for the help :smile: