Hello,

I have a hierarchical regression model that seems to follow a Student t distribution, but I can't seem to get a high enough "spike" in the middle, nor have the tails spread out as much as the data seems to point to. See my PPC below:

When I facet the PPC by coords, they all seem to follow this pattern. My likelihood is as follows:

```
eaches = pm.StudentT('predicted_eaches',
                     mu=mu,
                     sigma=sigma,
                     nu=15,
                     # lower=0,
                     observed=observed_eaches)
```

I will admit I'm not sure what `nu` is doing. Looking at this: pymc.StudentT — PyMC dev documentation, should I be adding a generous sigma to the likelihood?

nu controls the normality or, if you prefer, the “fatness” of the tails. As nu approaches infinity, the Student t approaches a normal distribution. As nu decreases (nu must be > 0), the tails become much fatter than would be expected under a normal distribution.
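To make the tail behavior concrete, here's a small scipy sketch (not part of the model above) comparing the probability of landing more than three scale units from the center under a normal versus Student t distributions with different `nu`:

```python
from scipy import stats

# P(|X| > 3) for a standard normal vs. Student t with small and large nu.
# Smaller nu -> fatter tails -> more mass far from the center.
normal_tail = 2 * stats.norm.sf(3)
t_tail_nu30 = 2 * stats.t.sf(3, df=30)
t_tail_nu3 = 2 * stats.t.sf(3, df=3)

print(normal_tail, t_tail_nu30, t_tail_nu3)
```

With `nu=3` the tail mass beyond three scale units is more than an order of magnitude larger than under a normal, while `nu=30` is already close to the normal case.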

Whether you adjust sigma or nu depends on what exactly you are trying to achieve and adjusting one likely means adjustments in the “opposite” direction of the other (e.g., fatter tails mean you can shrink sigma). Often (but certainly not always), both sigma and nu are left as free parameters so that the data can determine how this tradeoff is resolved.

Part of the reason for the large spike might be that your data are zero inflated, or cannot be negative. If I try to imagine constraining your posterior predictive to be positive and pile all the area in the negative numbers up at zero, it might look close to the spike in your data? An experiment might be to make this plot again, but clip negative values to zero (it’d be something like a poor man’s Tobit Regression). More formally, you could try using a distribution that respects the zero lower bound, or a mixture of two distributions, one that generates the body of small values, and one that generates the tail values.
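The clipping experiment might look something like this, where `ppc_draws` stands in for the flattened posterior predictive samples from your trace:

```python
import numpy as np

# Hypothetical posterior predictive draws; substitute your own
# flattened samples from idata.posterior_predictive.
ppc_draws = np.random.standard_t(df=15, size=10_000)

# Poor man's Tobit: pile all the negative mass up at zero before plotting.
clipped = np.clip(ppc_draws, 0, None)
```

If the resulting histogram of `clipped` shows a spike at zero resembling the one in your observed data, that's evidence the spike comes from a bound rather than from the likelihood's shape.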

This one is definitely zero inflated but not bound at zero. I have some large negative numbers in the target. Is there an example doc using a mixture of distributions in the likelihood?

When I try to leave them as free parameters, I get an error message saying `mu` is required. How do I leave them as free parameters?

`pm.Mixture` can have an `observed` argument, like any other distribution, if you want to try that.

By "free parameter", I think @cluhmann meant to assign a prior to it and let the model estimate. The Stan team recommends a Gamma(2, 0.1) for the prior on `nu`.
