PyMC Sampling Error: Latent GP for GEV distribution

Hi, I’m using latent GPs for a hierarchical spatial extremes model whose marginals are GEV distributions, so I’m also using pymc_experimental for the GEV distribution.

My code is as follows.

with pm.Model() as model:
    ell = pm.InverseGamma('ell', alpha=2, beta=1, shape=(3, 20))
    # eta = pm.HalfNormal('eta', sigma=1, shape=3)

    # input_dim inferred from the 20 ARD lengthscales per field
    cov_shape = pm.gp.cov.ExpQuad(input_dim=20, ls=ell[0])
    cov_loc = pm.gp.cov.ExpQuad(input_dim=20, ls=ell[1])
    cov_scale = pm.gp.cov.ExpQuad(input_dim=20, ls=ell[2])

    gp_shape = pm.gp.Latent(cov_func=cov_shape)
    gp_loc = pm.gp.Latent(cov_func=cov_loc)
    gp_log_scale = pm.gp.Latent(cov_func=cov_scale)

    xi = gp_shape.prior('xi', X=X_)
    mu = gp_loc.prior('mu', X=X_)
    log_sigma = gp_log_scale.prior('log_sigma', X=X_)
    # sigma = pm.Deterministic('sigma', pm.math.exp(log_sigma))
    yy = pmx.distributions.GenExtreme('yy', mu=mu, sigma=pm.math.exp(log_sigma), xi=xi, observed=ys_maxima.T.ravel())
    trace = pm.sample(1000, target_accept=0.95, random_seed=10)

Then, it returns the following error.

Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
---------------------------------------------------------------------------
SamplingError                             Traceback (most recent call last)
Cell In[74], line 2
      1 with model:
----> 2     mp = pm.sample(1000, target_accept=0.95, random_seed=10)

File c:\XXX\Lib\site-packages\pymc\sampling\mcmc.py, in sample(draws, tune, chains, cores, random_seed, progressbar, step, nuts_sampler, initvals, init, jitter_max_retries, n_init, trace, discard_tuned_samples, compute_convergence_checks, keep_warning_stat, return_inferencedata, idata_kwargs, nuts_sampler_kwargs, callback, mp_ctx, model, **kwargs)
    617 ip: Dict[str, np.ndarray]
    618 for ip in initial_points:
--> 619     model.check_start_vals(ip)
    620     _check_start_shape(model, ip)
    622 # Create trace backends for each chain

File c:\XXX\Lib\site-packages\pymc\model.py, in Model.check_start_vals(self, start)
   1776 initial_eval = self.point_logps(point=elem)
   1778 if not all(np.isfinite(v) for v in initial_eval.values()):
-> 1779     raise SamplingError(
   1780         "Initial evaluation of model at starting point failed!\n"
   1781         f"Starting values:\n{elem}\n\n"
   1782         f"Logp initial evaluation results:\n{initial_eval}"
   1783     )

SamplingError: Initial evaluation of model at starting point failed!
Starting values:
{'ell_log__': array([[ 0.01571654, -0.34856599, -0.49873131, -0.57287988,  0.95534123,
       -0.89331143,  0.72452945,  0.37501074, -0.41410939,  0.51900205,
        0.98783747, -0.82823462, -0.35654576, -0.91625861, -0.41124514])}

Logp initial evaluation results:
{'ell': -66.43, 'xi_rotated_': -1089.17, 'mu_rotated_': -1090.66, 'log_sigma_rotated_': -1084.57, 'yy': -inf}

I applied the exponential to log_sigma in a few different ways:
i) defining sigma = pm.Deterministic('sigma', pm.math.exp(log_sigma)) and passing that sigma to GenExtreme;
ii) assigning sigma = pm.math.exp(log_sigma) and passing it to GenExtreme’s sigma; and
iii) inline, as shown in the code above.
All of these return the same error.
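For reference, the 'yy': -inf entry means the GEV likelihood itself evaluates to -inf at the (jittered) starting point. Here is a scipy sketch of one way that can happen: the GEV support depends on the parameters, so observations can fall outside it. (This uses scipy's `genextreme`, which follows the c = -xi sign convention; it is illustrative only, not the pymc_experimental implementation.)

```python
import numpy as np
from scipy.stats import genextreme

# Illustrative only: GEV support depends on its parameters.
# In scipy's convention (c = -xi), for xi > 0 the support is
# x > mu - sigma / xi, so observations below that line get logpdf = -inf.
xi, mu, sigma = 0.5, 0.0, 1.0
x = np.array([-3.0, 0.0, 3.0])   # -3.0 lies below mu - sigma/xi = -2
logp = genextreme.logpdf(x, c=-xi, loc=mu, scale=sigma)
print(logp)  # first entry is -inf, the rest are finite
```

A single -inf term like this is enough to make the total model logp -inf, which is what check_start_vals rejects.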

FYI, I specified random_seed in pm.sample to try different initializations, just in case. Interestingly, inference for the same hierarchical model works on a simple 1-dimensional simulation.

Can you share a reproducible example (with the data - or code that generates the data)?

No problem. This is the data.
X_.npy (156.4 KB)
ys_maxima.npy (7.9 KB)

The problem is that xi must be between -1 and 1. Your prior goes outside this domain when jitter is applied at the beginning of sampling.


@ricardoV94 is there a way to give a better error message here when parameters fall outside their domains? I can update the GEV code if so.

Thanks! But I think that is quite weird, because xi can (theoretically) be any real value… although, practically, it should be within that range.

I think the message is fine; I used the new model.debug to find the issue, and it was pretty clear.

The hardest part was finding the point that caused it to fail because it was not the initial point but the jittered initial point.
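To see why the jittered (rather than the raw) initial point is the culprit, here is a rough numpy sketch. The grid, lengthscale, and sizes here are invented, not taken from the model above: the Latent GP parameterizes a rotated field, roughly xi = L @ xi_rotated_, and jitter+adapt_diag perturbs xi_rotated_ by Uniform(-1, 1), so the rotation can map a jittered start to |xi| >= 1 even when the unjittered start is 0.

```python
import numpy as np

# Invented example: squared-exponential covariance on a 1-D grid.
rng = np.random.default_rng(10)
n, ls = 20, 5.0
d = np.subtract.outer(np.arange(n), np.arange(n))
K = np.exp(-0.5 * (d / ls) ** 2)
L = np.linalg.cholesky(K + 1e-6 * np.eye(n))

rotated = rng.uniform(-1, 1, size=n)  # jitter around a zero start
xi = L @ rotated                      # what the GP actually evaluates
print(xi.min(), xi.max())             # entries can land outside (-1, 1)
```

With strongly correlated points, the rotation sums several jittered components, so the field values can be noticeably larger in magnitude than any single jittered coordinate.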

I am not familiar with the distribution but @ccaprani should have an idea why the constraint is set as such

Thanks @ricardoV94 - will leave it alone so.

@thisistheway you’re right that it theoretically can be any real value, but in practice it is not. The +/-1 bounds could be expanded, but I’ve never come across a practical problem where that is the case. Indeed, anything outside +/-0.5 is very rare. This paper gives a good discussion of the shape parameter and of setting priors for it, and points to other papers where the practical values are made clear:

TLDR - if you have no preference on tail behaviour, then xi ~ N(0, 0.5) is a very reasonable prior.
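For scale, a quick check (a sketch using scipy, not from the post) of how much mass a N(0, 0.5) prior puts in those tail regions:

```python
from scipy.stats import norm

# Tail mass of a N(0, 0.5) prior on the GEV shape parameter xi.
prior = norm(loc=0, scale=0.5)
p_beyond_half = 2 * prior.sf(0.5)  # P(|xi| > 0.5), the "very rare" zone
p_beyond_one = 2 * prior.sf(1.0)   # P(|xi| > 1), outside GenExtreme's bound
print(round(p_beyond_half, 3), round(p_beyond_one, 3))  # → 0.317 0.046
```

Note that the residual ~4.6% beyond |xi| = 1 is exactly the prior mass that collides with the hard constraint at initialization.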

But if the logp is not undefined outside of (-1, 1), we shouldn’t hardcode it as a constraint.

@ricardoV94 I agree with that. Although the boundary condition is plausible in practice, it should be incorporated as a prior, not as a hard constraint.
@ccaprani I feel a bit guilty saying so, but I can’t understand why the constraint is required in the code. The hierarchical model in my code uses a GP for xi, and it is a valid model, also implemented in the R package SpatialExtremes. The boundary condition may prevent implementing this latent variable model with PyMC, which I love.
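One possible workaround while the constraint stands (a sketch; `squash_xi` and the 0.95 bound are my own invention, not from the thread): pass the unconstrained latent field through a scaled tanh, so xi can never leave (-1, 1), even at jittered starting points.

```python
import numpy as np

# Hypothetical reparameterization: map an unconstrained latent field f
# (e.g. a Latent GP draw) into (-0.95, 0.95), safely inside |xi| < 1.
def squash_xi(f, bound=0.95):
    return bound * np.tanh(f)

f = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
xi = squash_xi(f)
print(xi)  # all values strictly inside (-0.95, 0.95)
```

In the model above this would look roughly like `xi = pm.Deterministic('xi', 0.95 * pm.math.tanh(gp_shape.prior('f_xi', X=X_)))`; the field name `f_xi` is made up for illustration.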

Thanks, all!

@thisistheway happy to look at it further, but a large xi is very numerically unstable. I’ll have a dig around and see if other packages have a treatment for it that I haven’t seen yet.