Skewed metric, SkewedStudentT attempt

Hi Andras

I don’t have much experience with the skewed Student’s t distribution, so I didn’t dig into the .logp formula. But am I correct in thinking that when lam=0 it should reduce to a standard Student’s t?
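For reference, if your SkewedStudentT follows Hansen’s (1994) standardized skew-t parameterization (an assumption on my part, suggested by the lam in [-1, 1] and q > 2 constraints), then at lam=0 it reduces to a Student’s t rescaled to unit variance, not the usual t — which would also make sigma the standard deviation rather than the t scale. A quick numeric check of that reduction:

```python
import numpy as np
from scipy import stats
from scipy.special import gammaln

def hansen_skewt_logpdf(z, lam, eta):
    """Hansen (1994) standardized skew-t log-density (assumed parameterization)."""
    c = np.exp(gammaln((eta + 1) / 2) - gammaln(eta / 2)) / np.sqrt(np.pi * (eta - 2))
    a = 4 * lam * c * (eta - 2) / (eta - 1)
    b = np.sqrt(1 + 3 * lam**2 - a**2)
    # density switches branch at the mode shift point z = -a/b
    sign = np.where(z < -a / b, -1.0, 1.0)
    core = 1 + ((b * z + a) / (1 + sign * lam)) ** 2 / (eta - 2)
    return np.log(b * c) - (eta + 1) / 2 * np.log(core)

# With lam=0 this should match a Student-t rescaled to unit variance,
# i.e. t with df=eta evaluated at z*sqrt(eta/(eta-2)), plus the Jacobian.
eta = 7.0
z = np.linspace(-4, 4, 9)
s = np.sqrt(eta / (eta - 2))
ref = stats.t.logpdf(z * s, df=eta) + np.log(s)
assert np.allclose(hansen_skewt_logpdf(z, 0.0, eta), ref)
```

If your logp matches this form, that scaling difference alone could explain why sigma and q come back shifted relative to the true t parameters.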

When I use your custom distribution with data generated from a standard Student’s t, I don’t recover the expected parameter values; I do recover them when I use the built-in PyMC3 StudentT.

N, loc, df, scale = 8000, 4, 7, 2
obs_all = loc + scale*np.random.standard_t(df, size=N)

with pm.Model() as ab_model:
    lam = pm.Uniform('lam', -0.99, 0.99)  # skew parameter in [-1, 1]
    q = pm.Uniform('q', 2.001, 30)  # degrees of freedom, q > 2
    sigma = pm.Uniform('sigma', 0.5, 1000)  # sigma > 0
    mu = pm.StudentT('mu', nu=4, mu=obs_all.mean(), sigma=max(obs_all.std() * 3, 15))

    obs = SkewedStudentT('obs', lam, q, sigma, mu, observed=obs_all)
    #obs = pm.StudentT('obs', nu=q, mu=mu, sigma=sigma, observed=obs_all)

    idata = pm.sample(return_inferencedata=True)
az.plot_posterior(idata, ref_val=[loc, 0, df, scale]);

With your custom distribution, the true values of both q and sigma fall outside their posterior credible intervals.

With PyMC3’s built-in StudentT, they are recovered.

A few more general points:

  • Uniform priors are generally discouraged; it is better to use more informative priors.
    ** For the scale this can be an Exponential, HalfNormal, or HalfCauchy.
    ** For the StudentT df parameter you can use Gamma(2.5, 0.1).
    ** For mu you can probably just use a Normal rather than a StudentT; your sigma setting is already wide enough.
    ** You can get more information here.

  • You are changing the default settings in pm.sample(); unless you have a strong reason to do so, it is best to keep the defaults. In particular, use the NUTS sampler whenever possible.
