Reparametrizing Gelman's rat tumour example

If you don't apply the transformation, NUTS will give you divergences whenever the proposed value falls outside the support of the parameters. I guess that's why BDA3 also gives a parameterization in terms of the unbounded quantities log(a/b) and log(a+b) (eq. 5.10); there is a sketch of that version after the code below.

Here the transformation added a Jacobian term, which biased the estimate. To get a model with no transformation, you can first create the free parameter with HalfFlat and then add the log-probability with a Potential:

import numpy as np
import pymc3 as pm
import theano.tensor as tt


def logp_ab(value):
    """Prior p(a, b) ∝ (a + b)^(-5/2), evaluated on the log scale."""
    return tt.log(tt.pow(tt.sum(value), -5 / 2))


with pm.Model() as model:
    # Uninformative prior for alpha and beta: HalfFlat gives the positive
    # support with no transformation, the Potential adds the log-density
    ab = pm.HalfFlat('ab',
                     shape=2,
                     testval=np.asarray([1., 1.]))
    pm.Potential('p(a, b)', logp_ab(ab))

    # n_rat and n_tumour hold the number of rats and tumour counts per experiment
    theta = pm.Beta('theta', alpha=ab[0], beta=ab[1], shape=len(n_rat))
    y = pm.Binomial('y', p=theta, observed=n_tumour, n=n_rat)

    trace = pm.sample(1000, tune=2000, nuts_kwargs={'target_accept': .95})
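
For completeness, here is a minimal sketch of the eq. 5.10 parameterization mentioned above, assuming the same n_rat / n_tumour data; the variable names log_ab_ratio and log_ab_sum are mine. On that scale the prior p(a, b) ∝ (a + b)^(-5/2) becomes p(log(a/b), log(a+b)) ∝ a·b·(a + b)^(-5/2), so the Potential has to include the extra log(a) + log(b) Jacobian term.

import numpy as np
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as model_unbounded:
    # Unbounded parameters from BDA3 eq. 5.10
    log_ab_ratio = pm.Flat('log_ab_ratio')  # log(a / b)
    log_ab_sum = pm.Flat('log_ab_sum')      # log(a + b)

    # Recover a and b, using a / (a + b) = sigmoid(log(a / b))
    ab_sum = tt.exp(log_ab_sum)
    a = pm.Deterministic('a', ab_sum * tt.nnet.sigmoid(log_ab_ratio))
    b = pm.Deterministic('b', ab_sum - a)

    # p(a, b) ∝ (a + b)^(-5/2) on the transformed scale picks up
    # the Jacobian factor a * b
    pm.Potential('p(a, b)', tt.log(a) + tt.log(b) - 5 / 2 * tt.log(a + b))

    theta = pm.Beta('theta', alpha=a, beta=b, shape=len(n_rat))
    y = pm.Binomial('y', p=theta, observed=n_tumour, n=n_rat)
    trace = pm.sample(1000, tune=2000, nuts_kwargs={'target_accept': .95})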