Bambi MUCH faster than my pymc implementation of Negative Binomial regression

Since the Negative Binomial likelihood uses a log link, the mean is the exponential of the linear predictor, so you have to be a bit careful with your priors. It’s very easy to blow up the sampler by ending up with values that are too large, leading to overflows and bad times. Basically, the quantity b0 + b[0] * Y + b[1] * H needs to be less than about 700. This also means you need to think about the scale of your data, Y and H.
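
As a quick sanity check on where that ~700 figure comes from: float64 tops out near 1.8e308, and exp() crosses that just past 709.

    import numpy as np

    # exp() overflows float64 just past 709; anything larger becomes inf,
    # which is what kills the sampler
    print(np.exp(709.0))  # ~8.2e307, still finite
    print(np.exp(710.0))  # inf, with a RuntimeWarning about overflow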

So the two dangers that jumped out at me in this regard were 1) the intercept’s prior has a standard deviation of 500, and 2) the values of “years” are in the thousands, so something like 0.4 * 1000 = 400 is already quite large. Two easy fixes, then, are to tighten the prior on the intercept, and to re-scale “years” so it runs from 0 to 1 rather than from 1981 to 2020 (since it’s just a deterministic trend anyway). Here is what I changed:

    b0 = pm.Normal("Intercept", mu=0, sigma=100)  # was sigma=500
    b = pm.Normal("slopes", mu=0, sigma=0.4, dims="regressors")
    alpha_p = pm.Exponential("alpha", lam=1/50)

    # re-scale the trend to [0, 1] so b[0] * Y stays small
    Y = pm.ConstantData("year", (eagles.year - eagles.year.min()) / (eagles.year.max() - eagles.year.min()), dims="obs_idx")

After these changes, the model samples in ~10 seconds without divergences.
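
For context, here is a minimal sketch of how those changed lines might slot into the full model. This is a reconstruction, not the original code: the eagles data is faked, the second regressor H is a made-up “hours” column, and the likelihood wiring is guessed from the b0 + b[0] * Y + b[1] * H expression above.

    import numpy as np
    import pandas as pd
    import pymc as pm

    # stand-in for the real dataset: yearly counts for 1981-2020 plus a
    # hypothetical second regressor
    rng = np.random.default_rng(0)
    eagles = pd.DataFrame({
        "year": np.arange(1981, 2021),
        "hours": rng.normal(0, 1, 40),
        "count": rng.poisson(5, 40),
    })

    coords = {"regressors": ["year", "hours"], "obs_idx": eagles.index}

    with pm.Model(coords=coords) as model:
        b0 = pm.Normal("Intercept", mu=0, sigma=100)
        b = pm.Normal("slopes", mu=0, sigma=0.4, dims="regressors")
        alpha_p = pm.Exponential("alpha", lam=1/50)

        # re-scaled deterministic trend, runs from 0 to 1
        Y = pm.ConstantData(
            "year",
            (eagles.year - eagles.year.min()) / (eagles.year.max() - eagles.year.min()),
            dims="obs_idx",
        )
        H = pm.ConstantData("hours", eagles.hours, dims="obs_idx")

        # exponential of the linear predictor: this is the quantity that
        # has to stay well below ~700
        mu = pm.math.exp(b0 + b[0] * Y + b[1] * H)
        pm.NegativeBinomial("obs", mu=mu, alpha=alpha_p,
                            observed=eagles["count"], dims="obs_idx")

        idata = pm.sample()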
