Thanks for making the notebook @ricardoV94.
My understanding is that the Wald distribution is the inverse Gaussian distribution, and PyMC already has a Wald distribution. This distribution has several possible parameterizations. The shifted Wald implementation you proposed in the notebook is, as I understand it, another parameterization of the Wald distribution.
In the shifted Wald, there is an additional parameter, theta, that gets subtracted from the observed data. In case you are interested, equation 4 in this paper gives the shifted Wald equation: "Psychological interpretation of the ex-Gaussian and shifted Wald parameters: A diffusion model analysis" (SpringerLink).
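If it helps, here is a minimal sketch of how I understand the shifted Wald could be written with PyMC's existing Wald, using its `alpha` argument as the shift (theta); the data and priors below are just placeholders, not a recommendation.

```python
import numpy as np
import pymc as pm

# Placeholder reaction-time data (seconds); replace with real observations
rt = np.array([0.45, 0.52, 0.61, 0.38, 0.70, 0.55])

with pm.Model() as shifted_wald_model:
    mu = pm.HalfNormal("mu", sigma=1.0)    # mean of the underlying Wald
    lam = pm.HalfNormal("lam", sigma=1.0)  # shape parameter
    # theta is the shift; it must stay below the smallest observed value
    theta = pm.Uniform("theta", lower=0.0, upper=rt.min())

    # PyMC's Wald takes a shift/location parameter `alpha`, which here plays
    # the role of theta (equivalent to subtracting theta from the data)
    pm.Wald("rt_obs", mu=mu, lam=lam, alpha=theta, observed=rt)

    idata = pm.sample()
```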
Also, thank you for creating the request to implement the new distribution. If the community finds it interesting, I would also request a look into the Wiener distribution. It is a heavily studied distribution in decision-making research. Stan has an implementation of the Wiener distribution, and the HDDM package, built on an older version of PyMC, also has it, but it is not available in the latest PyMC. I tried to replicate the Stan and HDDM Wiener code in the latest PyMC, but I could never bring the r-hat below 2–4 for several of my datasets, and I have been trying since the beginning of this year.