Andrew Gelman answers a question about how to include prior information on quantities of interest, not just on the parameters of your model, with a bit of Stan code.
Link: https://statmodeling.stat.columbia.edu/2019/08/23/yes-you-can-include-prior-information-on-quantities-of-interest-not-just-on-parameters-in-your-model/
model {
  target += normal_lpdf(y | a + b*x, sigma);    // data model
  target += normal_lpdf(a | 0, 10);             // weak prior on a
  target += normal_lpdf(b | 0, 10);             // weak prior on b
  target += normal_lpdf(a + 5*b | 4.5, 0.2);    // informative prior on a + 5*b
}
How would one code the last line in PyMC3? I know how to put an expression as the mu parameter of a Normal distribution, but how do you put these strong experiential priors on that parameter?
There would actually be no problem in allowing pm.Normal('qoi', mu=4.5, sd=0.2, observed=a + 5 * b), but we explicitly check for that and throw an error to prevent people from doing really strange things. If you want, you can always just use a potential: pm.Potential('qoi_prior', pm.Normal.dist(mu=4.5, sd=0.2).logp(a + 5 * b))
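For context, a minimal end-to-end sketch of this potential approach; the synthetic data and the names m1, y_obs, and qoi_prior here are my own illustration, not from the thread:

import numpy as np
import pymc3 as pm

# hypothetical synthetic data, just for illustration
x = np.random.randn(100)
y = 1.0 + 2.0 * x + 0.5 * np.random.randn(100)

with pm.Model() as m1:
    a = pm.Normal('a', 0., 10.)          # weak prior on a
    b = pm.Normal('b', 0., 10.)          # weak prior on b
    sigma = pm.HalfNormal('sigma', 5.)
    pm.Normal('y_obs', mu=a + b * x, sd=sigma, observed=y)  # data model
    # informative prior on the derived quantity a + 5*b, added as a potential
    pm.Potential('qoi_prior', pm.Normal.dist(mu=4.5, sd=0.2).logp(a + 5 * b))
    trace = pm.sample()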
I think even right now you can do it this way, see: https://nbviewer.jupyter.org/github/junpenglao/All-that-likelihood-with-PyMC3/blob/master/Notebooks/Regression%20with%20a%20twist.ipynb
But I agree that using a potential is better, so don't try this at home :grimacing:
import pymc3 as pm
import theano.tensor as tt

with pm.Model() as m2:
    beta = pm.Normal('beta', 0, 10)
    a = pm.Normal('a', 0, 10)
    sd = pm.HalfNormal('sd', 5)
    # treat the standardized residuals as the observed quantity
    pm.Normal('eps', 0., 1., observed=(y - X*beta - a)/sd)
    # Jacobian correction for the transform y -> (y - X*beta - a)/sd
    pm.Potential('jacob_det', -tt.log(sd)*len(y))
    trace2 = pm.sample()
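To spell out where the jacob_det term comes from (my gloss, not the poster's): the model puts a standard normal density on the standardized residual, so the implied density on y needs a change-of-variables correction,

p(y_i) = p(\epsilon_i) \left| \frac{d\epsilon_i}{dy_i} \right| = \frac{1}{\sigma}\, p(\epsilon_i),
\qquad \epsilon_i = \frac{y_i - X_i \beta - a}{\sigma},

which contributes -\log\sigma per observation. Summed over the N data points, that is exactly the -tt.log(sd)*len(y) potential in the model above.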
Oh interesting, cool, thanks!