I have a question regarding a change of variables on the observed space.
I am setting up a pymc3 model to perform a regression on the standardized log-odds of a success-rate variable, and then apply the inverse transformation to the posterior so that I can interpret the results in the original variable space. Formally, I have an observed variable \theta \in (0, 1) and I apply the transformation h = f \circ g, where g(\theta) = \text{logit}(\theta) and f(x) = \frac{x - m}{s}, with m = \frac{1}{n}\sum_{i=1}^n \text{logit}(\theta_i) and s^2 = \frac{1}{n}\sum_{i=1}^n \left(\text{logit}(\theta_i) - m\right)^2. I then perform inference with the model h(\theta) \sim \mathcal{N}(X\beta, \sigma) and obtain a posterior distribution, to which I apply h^{-1} directly.
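Written out explicitly from the definitions above, the forward and inverse maps are h(\theta) = \frac{\text{logit}(\theta) - m}{s} and h^{-1}(y) = \text{expit}(s y + m), which is what the expit call at the end of the code below computes.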
That is, my code looks something like this:
import numpy as np
import scipy.special
import pymc3 as pm

success_rate = np.array([0.01, 0.02])
logodds = scipy.special.logit(success_rate)
m = logodds.mean()
s = logodds.std()
observed = (logodds - m) / s
design_matrix = np.array([[1, 0, 0], [0, 1, 0]])

with pm.Model() as model:
    # Declare regression coefficients beta, e.g.:
    beta = pm.Normal('beta', mu=0, sd=1, shape=design_matrix.shape[1])
    # design_matrix is a dense array here, so use a dense dot product
    mu = pm.math.dot(design_matrix, beta)
    sigma = pm.HalfNormal('sigma', sd=1)
    likelihood = pm.Normal('likelihood', mu=mu, sd=sigma, observed=observed)

with model:
    trace = pm.sample()
    posterior = pm.sample_posterior_predictive(trace)['likelihood']

# logit inverse is expit.
predictions = scipy.special.expit(s * posterior + m)
I was following along with the Stan case study here: https://mc-stan.org/users/documentation/case-studies/mle-params.html, which mentions that when a change of variables occurs, a Jacobian adjustment is needed to preserve the probability mass under the change.
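If I understand the case study correctly, for a monotone transform y = h(\theta) the densities are related by p_\theta(\theta) = p_Y(h(\theta)) \left|\frac{dh}{d\theta}\right|, so in my case the adjustment would involve \frac{dh}{d\theta} = \frac{1}{s}\cdot\frac{1}{\theta(1-\theta)}, since \frac{d}{d\theta}\text{logit}(\theta) = \frac{1}{\theta(1-\theta)}.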
My question is, how can I correct my above procedure to take the Jacobian adjustment into account?