Truncated Normal

I also get divergence problems when sampling from a HalfNormal:

with pm.Model() as m: 
  x = pm.HalfNormal('x', 1) 
  trace = pm.sample() 
  # Divergences
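The claim that the mode sits on the boundary is easy to verify without PyMC: the HalfNormal density is strictly decreasing on [0, ∞), so its peak is exactly at x = 0. A pure-Python sketch:

```python
import math

def halfnormal_pdf(x, sigma=1.0):
    # HalfNormal density on [0, inf): sqrt(2/pi)/sigma * exp(-x^2 / (2 sigma^2))
    return math.sqrt(2.0 / math.pi) / sigma * math.exp(-x * x / (2.0 * sigma * sigma))

# Strictly decreasing on the support, so the mode is pinned to the boundary.
print(halfnormal_pdf(0.0), halfnormal_pdf(0.5), halfnormal_pdf(2.0))
```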

I think NUTS is struggling to sample because the peak of the distribution sits right at the boundary of the support. The problems seem to go away if the mean is far enough from the boundary:

with pm.Model() as m: 
  x = pm.TruncatedNormal('x', lower=0, mu=3, sigma=1) 
  trace = pm.sample() 
  # No Divergences
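Same check for this case: with mu=3 the (unnormalized) truncated-normal density has its mode in the interior of the support, well away from lower=0. A minimal sketch (the normalizing constant is omitted since it doesn't move the mode):

```python
import math

def truncnormal_pdf(x, mu, sigma, lower):
    # Unnormalized TruncatedNormal density on [lower, inf); the normalizing
    # constant is a positive scalar and does not change where the mode is.
    if x < lower:
        return 0.0
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))

# Mode at x = mu = 3, in the interior of the support rather than on the boundary.
vals = {x: truncnormal_pdf(x, mu=3.0, sigma=1.0, lower=0.0) for x in (0.0, 3.0, 6.0)}
print(vals)
```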

If you still want NUTS to sample near the boundary, you can give it extra legroom by increasing tune and target_accept:

with pm.Model() as m: 
  x = pm.TruncatedNormal('x', lower=0, mu=0, sigma=5) 
  trace = pm.sample(tune=2000, target_accept=.95) 
  # No divergences
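Note that with mu=0 the mode is pinned to the boundary again, just like the HalfNormal, which is presumably why this case needs the extra tuning rather than working out of the box. A quick check with the same unnormalized density:

```python
import math

def truncnormal_pdf(x, mu, sigma, lower):
    # Unnormalized TruncatedNormal density on [lower, inf).
    if x < lower:
        return 0.0
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))

# With mu == lower == 0 the density is again maximized at the boundary.
boundary = truncnormal_pdf(0.0, mu=0.0, sigma=5.0, lower=0.0)
interior = [truncnormal_pdf(x, mu=0.0, sigma=5.0, lower=0.0) for x in (1.0, 5.0, 10.0)]
print(boundary, interior)
```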

If anyone else can chime in and confirm this is reasonable behavior from NUTS, that would be really helpful.

This might be related: Zero-excluding priors are probably a bad idea for hierarchical variance parameters « Statistical Modeling, Causal Inference, and Social Science
