Truncated normal distribution with optimal truncation thresholds via DensityDist

Three things, which I'm not sure are helpful, but here goes:

  1. This example exposes a bug in sample_prior_predictive, which is easy enough to fix later, so thanks!

  2. Here’s your prior. It’s worth noting that the truncated normal is sometimes returning `inf`, which might have to do with extreme values or badly scaled priors?

  3. It wouldn’t be hard to combine my code snippet below with yours to make a proper TruncatedNormal distribution, if anyone’s interested in a PR…

For what it’s worth, this is the code I used to generate the random samples:

import pymc3 as pm
import scipy.stats as st

def truncated_normal_random(mu, std, a, b):
    def gen_random(point=None, size=None):
        # Resolve the model variables to concrete values for this draw
        mu_v, std_v, a_v, b_v = pm.distributions.draw_values(
            [mu, std, a, b], point=point, size=size)
        # scipy's truncnorm expects the truncation bounds standardized
        # by loc/scale, hence the (a - mu) / std transformation
        return pm.distributions.generate_samples(
            st.truncnorm.rvs,
            a=(a_v - mu_v) / std_v,
            b=(b_v - mu_v) / std_v,
            loc=mu_v,
            scale=std_v,
            size=size,
        )
    return gen_random
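The `(a_v - mu_v) / std_v` step above is the key detail: `scipy.stats.truncnorm` takes its truncation bounds in units of standard deviations from `loc`, not on the original data scale. A quick standalone check (the concrete numbers here are just made up for illustration):

```python
import numpy as np
import scipy.stats as st

mu, std = 2.0, 1.5
a, b = 0.0, 5.0  # truncation bounds on the original scale

# truncnorm wants the bounds standardized to the loc=0, scale=1 frame
a_std, b_std = (a - mu) / std, (b - mu) / std

rng = np.random.default_rng(42)
samples = st.truncnorm.rvs(a=a_std, b=b_std, loc=mu, scale=std,
                           size=10_000, random_state=rng)

# every draw lands inside [a, b]; passing a and b unstandardized would not do this
```

If you pass `a` and `b` on the original scale instead, the draws quietly come from the wrong interval, which is an easy mistake to make with this API.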

...

    x = pm.DensityDist('dist', logp_tr,
                       random=truncated_normal_random(mu, std, a, b),
                       observed={'x': truncated, 'mu': mu, 'std': std, 'a': a, 'b': b})
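The `logp_tr` passed to `DensityDist` isn't shown here, but presumably it is the truncated-normal log density: the normal log pdf minus the log of the probability mass the untruncated normal puts on `[a, b]`. A NumPy sketch of that density (my own reconstruction, not the original `logp_tr`; the real model version would use Theano ops instead of NumPy/SciPy), checked against SciPy:

```python
import numpy as np
import scipy.stats as st

def truncated_normal_logp(x, mu, std, a, b):
    """Log density of N(mu, std) truncated to [a, b] (sketch, not the original logp_tr)."""
    # Normalizing constant: mass the untruncated normal assigns to [a, b]
    log_z = np.log(st.norm.cdf(b, mu, std) - st.norm.cdf(a, mu, std))
    logp = st.norm.logpdf(x, mu, std) - log_z
    # Outside the support the density is zero, so the log density is -inf
    return np.where((x >= a) & (x <= b), logp, -np.inf)

x = np.linspace(0.5, 4.5, 9)
mine = truncated_normal_logp(x, mu=2.0, std=1.5, a=0.0, b=5.0)
ref = st.truncnorm.logpdf(x, a=(0.0 - 2.0) / 1.5, b=(5.0 - 2.0) / 1.5,
                          loc=2.0, scale=1.5)
# mine and ref should agree to floating-point precision
```

Forgetting the `log_z` term (or letting `x` outside `[a, b]` through) is my first guess at where the `inf` values mentioned above could be coming from.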
