Greetings,
I am referring to Haslett and Parnell (2008) to build an age-depth model for radiocarbon determinations. I wanted help with both the statistics and with how I should approach implementing it in PyMC.
Suppose a dataset of points (x, y) that is monotonically increasing. We assume that the sizes of the individual jumps in x follow a gamma distribution, and that the number of jumps (knots) within a segment of the y-axis follows a Poisson distribution. We need to infer \lambda and \beta (\alpha is fixed at 4). The process is defined as x(y).
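To check my understanding of the generative side, here is a minimal NumPy simulation of what I mean (all parameter values are made up, and I am assuming \beta is a rate, so NumPy's gamma scale is 1/\beta):

```python
import numpy as np

rng = np.random.default_rng(42)

alpha = 4.0   # fixed gamma shape
beta = 1.5    # gamma rate (to be inferred); made-up value
lam = 0.7     # Poisson knot intensity per unit of y (to be inferred); made-up value

# Segment lengths on the y-axis (made-up grid)
y = np.sort(rng.uniform(0.0, 100.0, size=20))
omega = np.diff(y)

# Compound Poisson-Gamma increments: for each segment draw the number of
# knots N_j ~ Poisson(lam * omega_j), then sum N_j gamma(alpha, beta) jumps.
n_knots = rng.poisson(lam * omega)
increments = np.array([rng.gamma(alpha, 1.0 / beta, size=n).sum() for n in n_knots])

# x is the cumulative, monotonically non-decreasing process x(y)
x = np.concatenate([[0.0], np.cumsum(increments)])
```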
Formally, we write \Omega_j = x_s(y_{s,j}) - x_s(y_{s,j-1}) and \omega_j = y_{s,j} - y_{s,j-1}, so that

\Omega_j = \sum_{i=1}^{N(\omega_j)} G_i,

with G_i \sim gamma(\alpha, \beta) and N(\omega_j) \sim Poisson(\lambda\omega_j). The posterior for \lambda and \beta is obtained from

\pi(\lambda, \beta \vert \Omega, \omega) \propto \pi(\Omega \vert \alpha, \beta, \lambda, \omega)\,\pi(\lambda)\,\pi(\beta),
with \{\Omega, \omega\} denoting the complete set of \{\Omega_j, \omega_j\} pairs. The likelihood \pi(\Omega \vert \alpha, \beta, \lambda, \omega) = \prod_{j=1}^{m-1}\pi(\Omega_j \vert \alpha, \beta, \lambda, \omega_j) is then of the form given in the equation below (the probability density of a CPG), with \Omega_j = x and \lambda y = \lambda\omega_j.
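For reference, this is the density I mean (I am writing the gamma with rate \beta, so that a sum of n jumps is gamma(n\alpha, \beta); the n = 0 term is a point mass at zero):

f(x \vert \alpha, \beta, \lambda y) = \sum_{n=1}^{\infty} \frac{e^{-\lambda y}(\lambda y)^n}{n!}\,\frac{\beta^{n\alpha} x^{n\alpha - 1} e^{-\beta x}}{\Gamma(n\alpha)} \quad \text{for } x > 0, \qquad \Pr(x = 0) = e^{-\lambda y}.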
The infinite sum is not available analytically, but I have found a fast evaluation routine in Dunn and Smyth (2005), which for now I have implemented using just NumPy, approximating the derivative numerically.
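To be concrete, this is roughly what my NumPy version looks like, except that here I use a naive fixed truncation of the series instead of Dunn and Smyth's adaptive scheme (the function name and n_max are my own, and it only handles x > 0):

```python
import numpy as np
from scipy.special import gammaln, logsumexp

def cpg_logpdf(x, alpha, beta, lam, omega, n_max=200):
    """Log-density of the compound Poisson-gamma increments, truncating
    the infinite series at n_max terms. beta is the gamma rate."""
    x = np.atleast_1d(x)
    omega = np.atleast_1d(omega)
    n = np.arange(1, n_max + 1)[:, None]            # series index, one column per data point
    log_terms = (
        n * np.log(lam * omega) - gammaln(n + 1)    # Poisson weight without the e^{-lam*omega} factor
        + n * alpha * np.log(beta)                  # gamma(n*alpha, beta) normalising constant
        + (n * alpha - 1) * np.log(x)
        - gammaln(n * alpha)
    )
    # Pull the factors that do not depend on n outside the log-sum-exp
    return -lam * omega - beta * x + logsumexp(log_terms, axis=0)
```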
My first question is: what do we actually observe here? We definitely observe \omega_j = y_{s,j} - y_{s,j-1}, but do we also observe \Omega_j? For now we assume that x has no uncertainty, but later x will have noise and we would then also have to infer x. My second question is: do I need to write a CustomDist for the CPG? Because, as I said, once x has noise we will use the CPG as a joint prior over it.
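For the second question, this is the direction I had in mind. It is only a sketch: the priors, the dummy data, and the fixed truncation are placeholders, \Omega is treated as observed and strictly positive, and I am not sure this is how pm.CustomDist should be wired up here.

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt

def cpg_logp(value, alpha, beta, lam, omega, n_max=200):
    # Same truncated series as the NumPy version, written with PyTensor ops
    # so that PyMC can differentiate through it.
    n = pt.arange(1, n_max + 1).astype("float64")[:, None]
    log_terms = (
        n * pt.log(lam * omega) - pt.gammaln(n + 1)
        + n * alpha * pt.log(beta)
        + (n * alpha - 1) * pt.log(value)
        - pt.gammaln(n * alpha)
    )
    m = pt.max(log_terms, axis=0)                    # log-sum-exp, written out by hand
    series = m + pt.log(pt.sum(pt.exp(log_terms - m), axis=0))
    return -lam * omega - beta * value + series

# Placeholder observed data: segment lengths omega_j and increments Omega_j
omega_obs = np.array([2.0, 3.5, 1.2, 4.0])
Omega_obs = np.array([5.1, 9.8, 2.9, 11.3])

with pm.Model() as model:
    lam = pm.Exponential("lam", 1.0)    # placeholder prior
    beta = pm.Exponential("beta", 1.0)  # placeholder prior
    pm.CustomDist(
        "Omega",
        4.0, beta, lam, omega_obs,      # alpha fixed at 4
        logp=cpg_logp,
        observed=Omega_obs,
    )
    idata = pm.sample()
```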
Thank you