PyMC sampler does not converge

I have the following model, in which I use the horseshoe prior for regularization:

with pm.Model() as re_horse:

    # Level 2 intercept
    v_2 = pm.Normal(name="v_2", mu=0.0, sigma=1.0, shape=())
    # Level 2 error term
    eps_2 = pm.Normal(name="eps_2", mu=0.0, sigma=1.0, shape=(n_matches, 1))
    # Random intercept
    gamma = pm.Deterministic(name="gamma", var=v_2 + eps_2)

    lam_1 = pm.HalfCauchy(name="lam_1", beta=1.0, shape=n_vars_1)  # lam allows parameter to escape shrinkage
    lam_2 = pm.HalfCauchy(name="lam_2", beta=1.0, shape=(n_vars_2, 1))
    tau_1 = pm.InverseGamma(name="tau_1", alpha=3.0, beta=1.0, shape=())  # shrinkage param, large tau means low shrinkage
    tau_2 = pm.InverseGamma(name="tau_2", alpha=3.0, beta=1.0, shape=())

    # Level 2 variables
    K_2 = pm.Normal(name="K_2", mu=0.0, sigma=tau_2**2 * lam_2**2, shape=(n_vars_2, 1))

    # Level 1 intercept
    v_1 = pm.Deterministic(name="v_1", var=pm.math.dot(X_2, K_2) + gamma)

    # Level 1 variables
    K_1 = pm.Normal(name="K_1", mu=0.0, sigma=tau_1**2 * lam_1**2, shape=n_vars_1)

    # Model error
    eps_1 = pm.Gamma(name="eps_1", alpha=9.0, beta=4.0, shape=())

    # Model mean
    y_hat = pm.Deterministic(name="y_hat", var=pm.math.dot(X_1, K_1) + v_1)

    # Likelihood
    y_like = pm.Normal(name="y_like", mu=y_hat, sigma=eps_1, observed=y)

with re_horse:
    step = pm.NUTS(target_accept=0.9)
    trace_re_horse = pm.sample(draws=20000, chains=2, tune=20000, init="auto", return_inferencedata=True,
                               random_seed=SEED, cores=1, step=step)  

However, I cannot get rid of the sampling issues reported below:

There were 4210 divergences after tuning. Increase target_accept or reparameterize.
The chain reached the maximum tree depth. Increase max_treedepth, increase target_accept or reparameterize.
There were 10420 divergences after tuning. Increase target_accept or reparameterize.
The chain reached the maximum tree depth. Increase max_treedepth, increase target_accept or reparameterize.
The rhat statistic is larger than 1.4 for some parameters. The sampler did not converge.
The estimated number of effective samples is smaller than 200 for some parameters.
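These warnings can be inspected in more detail through the returned `InferenceData` object; this is roughly how I would locate where the sampler struggles. The snippet below uses a synthetic stand-in trace (built with `az.from_dict`) since the real trace is the object of this question; with an actual run you would pass `trace_re_horse` instead of `idata`:

```python
import numpy as np
import arviz as az

rng = np.random.default_rng(0)

# Synthetic stand-in for trace_re_horse: 2 chains x 500 draws.
idata = az.from_dict(
    posterior={"tau_1": rng.lognormal(size=(2, 500))},
    sample_stats={"diverging": rng.random((2, 500)) < 0.05},
)

# Count divergent transitions per chain.
n_div = idata.sample_stats["diverging"].sum(dim="draw")
print(n_div.values)

# High r_hat and low ess_bulk flag the parameters the sampler struggles with.
summary = az.summary(idata, var_names=["tau_1"])
print(summary[["r_hat", "ess_bulk"]])
```

With the real trace, `az.plot_pair(trace_re_horse, divergences=True)` additionally shows whether the divergences concentrate in one region of the posterior (e.g. a funnel near small `tau`) rather than being spread randomly.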

I have already tried increasing the number of tuning steps from 2,000 to 20,000, and did the same for the number of draws, without success.
Moreover, I tried different (hyper)parameter combinations, e.g. for alpha and beta, and increased target_accept to 0.9, likewise without success.

How can I fix the problem?

Many thanks in advance!

Do you see any particular parameter that is difficult to sample from? Tweaking the priors for those is usually what I would try first.

Nothing in particular; there are many, and it looks random.
Tweaking the priors is what I tried first.

It is hard to provide further advice without knowing the data. For example, what is the size of your dataset?