Rat tumor: question of prior choice

Hi,

I am new to Bayesian statistics and have been going through the BDA book along with the PyMC3 examples. I am a little stuck on the prior choice for the rat tumor example (BDA p. 110 and the PyMC3 notebook).

Specifically, what is the math (ideally with the Jacobians) that takes us from (\alpha, \beta) to equation 5.9, and then to equation 5.10 (the log transformation)? And what is the reason for choosing the log transform when we already transform (\alpha, \beta) to \left(\frac{\alpha}{\alpha + \beta}, (\alpha + \beta)^{-1/2}\right)?

Thanks in advance.

From the Stan Manual:

Suppose X is a K-dimensional random variable with probability density function p_X(x). A new random variable Y = f(X) may be defined by transforming X with a suitably well-behaved function f. It suffices for what follows to note that if f is one-to-one and its inverse f^{-1} has a well-defined Jacobian, then the density of Y is

p_Y(y) = p_X\left(f^{-1}(y)\right) \left| \det J_{f^{-1}}(y) \right|

Here we have x \sim \text{Uniform}, with x[0] = \frac{\alpha}{\alpha + \beta} and x[1] = (\alpha + \beta)^{-1/2}. This means the density is p_X(x) = c for some constant c, and f^{-1}(\alpha, \beta) = \left[\frac{\alpha}{\alpha + \beta}, (\alpha + \beta)^{-1/2}\right], so p(\alpha, \beta) \propto \left| \det J_{f^{-1}}(\alpha, \beta) \right|.
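For reference, the Jacobian that SymPy computes below can also be written out by hand:

\displaystyle J_{f^{-1}}(\alpha, \beta) = \begin{pmatrix} \frac{\beta}{(\alpha + \beta)^{2}} & \frac{-\alpha}{(\alpha + \beta)^{2}} \\ -\frac{1}{2} (\alpha + \beta)^{-3/2} & -\frac{1}{2} (\alpha + \beta)^{-3/2} \end{pmatrix}

so that

\displaystyle \left| \det J_{f^{-1}}(\alpha, \beta) \right| = \left| -\tfrac{1}{2} (\alpha + \beta)^{-3/2} \cdot \frac{\alpha + \beta}{(\alpha + \beta)^{2}} \right| = \tfrac{1}{2} (\alpha + \beta)^{-5/2}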

Putting it together in SymPy, we get:

from sympy import symbols, Matrix, simplify, solve, Eq, Abs, log

alpha, beta = symbols('alpha, beta', positive=True)
a, b = symbols('a b')

# f maps x = (a, b) back to (alpha, beta); f_inv maps (alpha, beta) to
# x = (alpha / (alpha + beta), (alpha + beta)**(-1/2)).
f = solve((Eq(alpha / (alpha + beta), a),
           Eq((alpha + beta)**(-1/2), b)),
          alpha, beta)
f_inv = solve((Eq(alpha / (alpha + beta), a),
               Eq((alpha + beta)**(-1/2), b)),
              a, b)

# Jacobian of f_inv with respect to (alpha, beta), with the determinant
# re-expressed purely in terms of alpha and beta.
X = Matrix([f_inv[a], f_inv[b]])
Y = Matrix([alpha, beta])
det = X.jacobian(Y).det()
det = det.subs([
    (alpha, f[alpha]),
    (beta, f[beta])]).subs(b, (alpha + beta)**(-1/2))

# Since p_X(x) is constant, p(alpha, beta) is proportional to |det|.
p_a_b = simplify(Abs(det))
p_a_b
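As a quick numerical sanity check of the SymPy result, we can compare the closed form against a central finite-difference Jacobian at an arbitrary test point (\alpha, \beta) = (2, 3) (my choice, not from the book):

import numpy as np

def f_inv_num(alpha, beta):
    return np.array([alpha / (alpha + beta), (alpha + beta) ** -0.5])

# Central finite-difference Jacobian at the test point.
alpha0, beta0, h = 2.0, 3.0, 1e-6
J = np.column_stack([
    (f_inv_num(alpha0 + h, beta0) - f_inv_num(alpha0 - h, beta0)) / (2 * h),
    (f_inv_num(alpha0, beta0 + h) - f_inv_num(alpha0, beta0 - h)) / (2 * h),
])
print(abs(np.linalg.det(J)))          # ~0.0089443
print(0.5 / (alpha0 + beta0) ** 2.5)  # 0.0089443, matches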

which evaluates to \displaystyle \frac{0.5}{\left(\alpha + \beta\right)^{2.5}}, i.e. p(\alpha, \beta) \propto (\alpha + \beta)^{-5/2}, which is equation 5.9.

And similarly for the log transformation of equation 5.10:

x, y = symbols('x y')

# Invert the log transformation: x = log(alpha / beta), y = log(alpha + beta).
rewritten = solve((Eq(x, log(alpha / beta)),
                   Eq(y, log(alpha + beta))),
                  alpha, beta)

# Jacobian of (alpha, beta) with respect to (x, y), with the determinant
# re-expressed in terms of alpha and beta.
X = Matrix([rewritten[alpha], rewritten[beta]])
Y = Matrix([x, y])
det = X.jacobian(Y).det().subs([
    (x, log(alpha / beta)),
    (y, log(alpha + beta))])

# Change of variables again: multiply p(alpha, beta) by |det|.
simplify(Abs(det)) * p_a_b

which evaluates to \displaystyle \frac{0.5 \alpha \beta}{\left(\alpha + \beta\right)^{2.5}}, i.e. p\left(\log(\alpha/\beta), \log(\alpha + \beta)\right) \propto \alpha \beta (\alpha + \beta)^{-5/2}, which is equation 5.10.
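To connect this back to the PyMC3 notebook: one way to impose the unnormalized prior p(\alpha, \beta) \propto (\alpha + \beta)^{-5/2} in PyMC3 (a sketch, not necessarily the notebook's exact code) is through a Potential term that adds the log-prior to the model:

import numpy as np
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as model:
    # Improper flat priors on alpha, beta > 0 (ab[0] = alpha, ab[1] = beta).
    ab = pm.HalfFlat('ab', shape=2, testval=np.asarray([1., 1.]))
    # Add the log of the unnormalized prior density, -5/2 * log(alpha + beta).
    pm.Potential('p_ab', -5 / 2 * tt.log(tt.sum(ab)))

Note that PyMC3 applies a log transform to positive-constrained variables like HalfFlat under the hood, so sampling effectively happens on an unconstrained log scale in any case.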