# Gamma regression

Hi everyone,

I have a 2D dataset containing a linear relationship plus gamma noise, like the following:

```python
import numpy as np

length = 200

rng = np.random.default_rng(1337)
x = np.linspace(1e-2, 1, num=length)
true_regression_line = 5 * x + 4
y = true_regression_line + rng.gamma(0.5, 0.2, size=length)
```

I can fit an ordinary linear regression with Gaussian noise using the following code:

```python
import pymc as pm

with pm.Model() as gaussian_model:
    sigma = pm.HalfCauchy("sigma", beta=10)
    intercept = pm.Normal("intercept", 0, sigma=20)
    slope = pm.Normal("slope", 0, sigma=20)

    likelihood = pm.Normal("y", mu=intercept + slope * x, sigma=sigma, observed=y)

    map_estimate = pm.find_MAP()
```

resulting in the following fit:

So far so good. I’m now trying to improve the fit by recognising that the noise is Gamma-distributed, and replacing the Gaussian likelihood from the model above. I found the following code snippet that claims to achieve this:

```python
import pymc as pm

with pm.Model() as gamma_model:
    a = pm.Normal("a", 0, 0.5)
    b = pm.Normal("b", 0, 0.5)
    shape = pm.Uniform("shape", 0, 50)
    μ = pm.math.exp(a + b * x)

    likelihood = pm.Gamma("y", alpha=shape, beta=shape / μ, observed=y)

    map_estimate = pm.find_MAP(progressbar=False)
```

which results in the following fit. I find it a bit disappointing, although I may be missing the point:

I was expecting a line closer to the true regression line, and suspect that there’s something odd with `gamma_model`, as the Gamma noise must be >0 and that doesn’t seem to be the case with the fitted line in the plot. In all honesty, I don’t follow all the details of `gamma_model`—is it using the appropriate link function for the Gamma distribution?

Thanks for your time, any tips appreciated!

`pm.Gamma` offers a parameterization using `mu` and `sigma`, so I would just replace your:

```python
likelihood = pm.Normal("y", mu=intercept + slope * x, sigma=sigma, observed=y)
```

with

```python
likelihood = pm.Gamma("y", mu=intercept + slope * x, sigma=sigma, observed=y)
```

Your priors might not be quite what you want, but it might get you close.
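For reference, the `mu`/`sigma` parameterization is just a reparameterization of the usual `alpha`/`beta` one: a Gamma has mean `alpha / beta` and variance `alpha / beta**2`, so `alpha = mu**2 / sigma**2` and `beta = mu / sigma**2`. A quick standalone check with SciPy (outside PyMC; the numbers are arbitrary):

```python
import numpy as np
from scipy import stats

# Convert a (mu, sigma) Gamma parameterization to (alpha, beta):
# mean = alpha / beta, variance = alpha / beta**2
mu, sigma = 5.0, 2.0
alpha = mu**2 / sigma**2
beta = mu / sigma**2

# SciPy parameterizes the Gamma by shape `a` and scale = 1 / beta.
dist = stats.gamma(a=alpha, scale=1.0 / beta)
print(dist.mean(), dist.std())  # 5.0 2.0
```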

Thanks for the response.

I’ve tried the parametrisation that you suggest and I’m getting exactly the same result as I posted above.

My intention is that, by recognising that the noise is actually gamma-distributed, I would get a line fit that is closer to the bottom of the scatter plot, rather than the centred Gaussian fit, where the noise is allowed to be both positive and negative. Perhaps my intuition is completely wrong, but I’ve seen some evidence of this working in Stan, for example: Extend model to other distributions by rostyboost · Pull Request #1492 · facebook/prophet · GitHub.

A gamma GLM models the expected value of a gamma distribution as a linear function of the predictors, but only on the link scale: the exponential (inverse-log) link is non-linear, so you end up with that curving in the fitted line. I guess what you want is a linear mean with additive gamma noise, which is different. You could try something like this:

```python
import pymc as pm
import arviz as az

def gamma_noise_regression(mu, alpha, beta, size=None):
    return mu + pm.Gamma.dist(alpha=alpha, beta=beta, size=size)

with pm.Model() as model:
    intercept = pm.Normal('intercept', sigma=10)
    slope = pm.Normal('slope', sigma=10)
    mu = intercept + slope * x

    alpha = pm.Exponential('alpha', 1)
    beta = pm.Exponential('beta', 1)

    y_obs = pm.CustomDist('y_obs', mu, alpha, beta, dist=gamma_noise_regression, observed=y)
    idata = pm.sample(init='jitter+adapt_diag_grad')
    idata = pm.sample_posterior_predictive(idata, extend_inferencedata=True)
```

I’m using PyMC’s automatic logp inference to turn `mu` into the location parameter of a shifted (three-parameter) gamma, and using that to model the data. I had to add some normal noise to `mu` when I generated the data, otherwise it was divergence city (this is common in models with noiseless components). I got this as my output:
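For intuition about what that `CustomDist` implies: the inferred logp of `y_obs` is just a gamma log-density evaluated at `y - mu`. A rough standalone sketch in SciPy terms (not PyMC's actual code, and the parameter values are made up for illustration):

```python
import numpy as np
from scipy import stats

def shifted_gamma_logp(y, mu, alpha, beta):
    """Log-density of y = mu + Gamma(alpha, beta) noise.

    For y <= mu the density is zero, so the logp is -inf.
    SciPy's Gamma uses shape `a` and scale = 1 / beta.
    """
    return stats.gamma.logpdf(y - mu, a=alpha, scale=1.0 / beta)

print(shifted_gamma_logp(5.0, 4.0, 2.0, 1.0))  # -1.0: y lies above the line
print(shifted_gamma_logp(3.0, 4.0, 2.0, 1.0))  # -inf: y lies below the line
```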


Ahh, that’s exactly what I wanted. Many thanks!

If I may ask a follow-up question, how would you go about fitting the same sort of regression with Gamma noise, but if the spikes went down, rather than up?

I’ve tried the following modification to `gamma_noise_regression()`, where I subtract the Gamma noise:

```python
import pymc as pm

def inverse_gamma_noise_regression(mu, alpha, beta, size=None):
    return mu - pm.Gamma.dist(alpha=alpha, beta=beta, size=size)

with pm.Model() as model:
    intercept = pm.Normal('intercept', sigma=10)
    slope = pm.Normal('slope', sigma=10)
    mu = intercept + slope * x

    alpha = pm.Exponential('alpha', 1)
    beta = pm.Exponential('beta', 1)

    y_obs = pm.CustomDist('y_obs', mu, alpha, beta, dist=inverse_gamma_noise_regression, observed=y)
    idata = pm.sample(init='jitter+adapt_diag_grad')
    idata = pm.sample_posterior_predictive(idata, extend_inferencedata=True)
```

but I always run into sampling errors:

```text
Starting values:
{'intercept': array(0.), 'slope': array(0.)}

Logp initial evaluation results:
{'intercept': -2.53, 'slope': -2.53, 'y': -inf}
You can call `model.debug()` for more details.
```

Many thanks!

When the logp of `y` is `-inf`, it means that at your starting values (intercept = 0, slope = 0), you have data that could not possibly have been generated by your model. In this case, I suspect you have positive values of y. I suggest you do some prior predictive simulations and make sure that your priors imply a model that could plausibly have generated your data.
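To make that concrete with the data from the top of the thread: at the starting point intercept = slope = 0, `mu` is zero everywhere, so `mu - Gamma(...)` can only produce non-positive values, while every observed `y` is positive. A standalone SciPy check (with an arbitrary alpha = beta = 1):

```python
import numpy as np
from scipy import stats

# Recreate the data from the top of the thread.
rng = np.random.default_rng(1337)
x = np.linspace(1e-2, 1, num=200)
y = 5 * x + 4 + rng.gamma(0.5, 0.2, size=200)

# At the sampler's starting values, intercept = slope = 0, so mu = 0.
mu = np.zeros_like(x)

# y = mu - noise means noise = mu - y must be positive, but mu - y < 0
# for every point, so each observation has zero density under the model.
total_logp = stats.gamma.logpdf(mu - y, a=1.0, scale=1.0).sum()
print(total_logp)  # -inf
```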

You can also just force it to work by manually specifying the sampler’s initial values for `y` using the `initval` argument, see here. I don’t recommend this, though – if you need to manually set starting values you probably have other problems that need to be addressed.

Thanks for the detailed answer, will look into your suggestions!