# How does prior predictive work when parameters depend on 'input data', x?

Conceptually, if your data is split into input x, and output y, and you set up a Bayesian model, for example, like

b ~ N(0, 10)
a ~ N(0, 10)
sigma ~ HalfNormal(0, 10)
y ~ N(a*x + b, sigma)

I don’t understand conceptually how you do a prior predictive check, because you can sample a, b, and sigma, but then y lies in a distribution whose mean depends on x, so we would need some x values as well to get a distribution of y values.

My guess is: you have to provide the inputs x, then you can sample a, b, and sigma, and then you can sample y to get, for any fixed input x, a distribution of possible y values from N(a*x + b, sigma).
(But then this gives a distribution for every input x in your data set, which seems too complicated…)
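For what it's worth, the procedure described in the guess can be sketched in plain NumPy. The x values and prior scales below are hypothetical stand-ins, not from any real data set; half-normal draws are taken as the absolute value of normal draws.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical observed inputs (stand-in for your real x values)
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])

n_draws = 1000

# Step 1: sample parameters from the priors
a = rng.normal(0, 10, size=n_draws)
b = rng.normal(0, 10, size=n_draws)
sigma = np.abs(rng.normal(0, 10, size=n_draws))  # HalfNormal(0, 10) via |N(0, 10)|

# Step 2: for each prior draw, sample y at every x
# (broadcasting gives an array of shape n_draws x len(x))
mu = a[:, None] * x[None, :] + b[:, None]
y_sim = rng.normal(mu, sigma[:, None])

# Each column of y_sim is the prior predictive distribution of y
# at the corresponding fixed input x
print(y_sim.shape)  # (1000, 5)
```

So yes, you do get one distribution per x value; in practice people summarize these, e.g. by plotting a few simulated regression lines or predictive intervals over the range of x, rather than inspecting every column separately.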

1. Is this guess right? (if no, can you correct me please)
2. Is it better (more robust) to model the inputs x as well (add more parameters that model the x inputs), so that your prior predictive check produces a distribution over pairs (x, y)? What is the standard practice for this?

Welcome!

1. That is correct.
2. Better is a highly context-specific thing. But whether you model the “inputs” is typically more of a modeling decision than a decision about how to conduct a prior predictive check (i.e., sometimes you just want to take your predictors as given and not model the generative process by which they come about, and other times you do).

That being said, you would ultimately need to figure out which question you want your prior predictive check to inform:

- “Do my priors match my intuition about my model and my observed input/predictor values?” — in which case you should just use your observed inputs/predictors; or
- “Do my priors match my intuition about my model and plausible input/predictor values?” — in which case you may choose to figure out what plausible values look like, generate some, and then conduct your prior predictive check.
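The second option (checking against plausible rather than observed inputs) can be sketched the same way, by drawing x from an assumed generative model before drawing y. Everything about the x model below, including its hyperprior scales, is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n_draws = 1000

# Assumption (illustrative only): inputs modeled as x ~ N(mu_x, sd_x),
# with hypothetical hyperpriors on mu_x and sd_x
mu_x = rng.normal(0, 5, size=n_draws)
sd_x = np.abs(rng.normal(0, 2, size=n_draws))
x_sim = rng.normal(mu_x, sd_x)

# Parameter priors as in the model above
a = rng.normal(0, 10, size=n_draws)
b = rng.normal(0, 10, size=n_draws)
sigma = np.abs(rng.normal(0, 10, size=n_draws))

# Prior predictive draws are now (x, y) pairs
y_sim = rng.normal(a * x_sim + b, sigma)
pairs = np.column_stack([x_sim, y_sim])
print(pairs.shape)  # (1000, 2)
```

A scatter plot of these pairs is then one way to judge whether the joint prior predictive looks plausible.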

Thanks, great to be here.
Thanks so much, that’s great feedback. Really appreciate it.
