From the Stan Manual:
Suppose X is a K-dimensional random variable with probability density function p_X(x). A new random variable Y = f(X) may be defined by transforming X with a suitably well-behaved function f. It suffices for what follows to note that if f is one-to-one and its inverse f^{-1} has a well-defined Jacobian, then the density of Y is
p_Y(y) = p_X\left(f^{-1}(y)\right) \left| \det J_{f^{-1}}(y) \right|
Here we have x \sim \text{Uniform}, with x[0] = \frac{\alpha}{\alpha + \beta} and x[1] = (\alpha + \beta)^{-1/2}. This means the density is constant, p_X(x) = c, and f^{-1}(\alpha, \beta) = \left[ \frac{\alpha}{\alpha + \beta},\ (\alpha + \beta)^{-1/2} \right].
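Writing the Jacobian of f^{-1} out by hand (this derivation is not in the manual quote, but it follows directly from the definitions above) gives

J_{f^{-1}}(\alpha, \beta) =
\begin{bmatrix}
\frac{\beta}{(\alpha + \beta)^{2}} & -\frac{\alpha}{(\alpha + \beta)^{2}} \\
-\frac{1}{2}(\alpha + \beta)^{-3/2} & -\frac{1}{2}(\alpha + \beta)^{-3/2}
\end{bmatrix},
\qquad
\left| \det J_{f^{-1}}(\alpha, \beta) \right| = \frac{1}{2}(\alpha + \beta)^{-5/2},

which is what the SymPy computation below should reproduce.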
Putting it together in SymPy, we get:
from sympy import symbols, Matrix, simplify, solve, Eq, Abs, log

alpha, beta = symbols('alpha, beta', positive=True)
a, b = symbols('a b')

# f maps (a, b) to (alpha, beta): solve the defining equations for alpha and beta
f = solve((Eq(alpha / (alpha + beta), a),
           Eq((alpha + beta)**(-1/2), b)),
          alpha, beta)

# f^{-1} maps (alpha, beta) to (a, b): solve the same equations for a and b
f_inv = solve((Eq(alpha / (alpha + beta), a),
               Eq((alpha + beta)**(-1/2), b)),
              a, b)

# Jacobian of f^{-1} with respect to (alpha, beta)
X = Matrix([f_inv[a], f_inv[b]])
Y = Matrix([alpha, beta])
det = X.jacobian(Y).det()

# express the determinant purely in terms of alpha and beta
det = det.subs([(alpha, f[alpha]),
                (beta, f[beta])]).subs(b, (alpha + beta)**(-1/2))

p_a_b = simplify(Abs(det))
p_a_b
which evaluates to \displaystyle \frac{0.5}{\left(\alpha + \beta\right)^{2.5}}
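As a quick numerical sanity check (not part of the original derivation), the symbolic result can be compared against a central-difference approximation of the Jacobian of f^{-1} at an arbitrary test point, here (\alpha, \beta) = (2, 3):

import numpy as np

def f_inv_num(theta):
    # f^{-1}(alpha, beta) = (alpha / (alpha + beta), (alpha + beta)**(-1/2))
    alpha, beta = theta
    return np.array([alpha / (alpha + beta), (alpha + beta) ** -0.5])

def numerical_jacobian(func, theta, eps=1e-6):
    # central differences, one column per input variable
    theta = np.asarray(theta, dtype=float)
    J = np.zeros((len(theta), len(theta)))
    for j in range(len(theta)):
        step = np.zeros_like(theta)
        step[j] = eps
        J[:, j] = (func(theta + step) - func(theta - step)) / (2 * eps)
    return J

alpha0, beta0 = 2.0, 3.0
print(abs(np.linalg.det(numerical_jacobian(f_inv_num, [alpha0, beta0]))))  # ~0.008944
print(0.5 / (alpha0 + beta0) ** 2.5)                                       # 0.008944...

The two printed values should agree to several decimal places.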
And similarly, for the further change of variables x = \log(\alpha / \beta), y = \log(\alpha + \beta):
x, y = symbols('x y')

# solve for (alpha, beta) in terms of the new variables x and y
reparam = solve((Eq(x, log(alpha / beta)),
                 Eq(y, log(alpha + beta))),
                alpha, beta)

# Jacobian of (alpha, beta) with respect to (x, y),
# expressed back in terms of alpha and beta
X = Matrix([reparam[alpha], reparam[beta]])
Y = Matrix([x, y])
det = X.jacobian(Y).det().subs([(x, log(alpha / beta)),
                                (y, log(alpha + beta))])

# multiply by the previous density to get the density on (x, y)
simplify(Abs(det)) * p_a_b
which evaluates to \displaystyle \frac{0.5 \alpha \beta}{\left(\alpha + \beta\right)^{2.5}}
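This second Jacobian factor can also be checked by hand (again, not in the original, but immediate from the definitions of x and y): differentiating the forward map and inverting the determinant,

\frac{\partial(x, y)}{\partial(\alpha, \beta)} =
\begin{bmatrix}
\frac{1}{\alpha} & -\frac{1}{\beta} \\
\frac{1}{\alpha + \beta} & \frac{1}{\alpha + \beta}
\end{bmatrix},
\qquad
\det \frac{\partial(x, y)}{\partial(\alpha, \beta)} = \frac{1}{\alpha \beta},

so \left| \det \frac{\partial(\alpha, \beta)}{\partial(x, y)} \right| = \alpha \beta, and the density on (x, y) is \frac{0.5\, \alpha \beta}{(\alpha + \beta)^{2.5}}, i.e. proportional to \alpha \beta (\alpha + \beta)^{-5/2}, matching the SymPy output above.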