Intuition behind different priors in this spline-like knot scheme for time-variant data

I guess my first piece of advice is to make the models and sample some prior predictive draws. Here’s what I got for a Gaussian random walk and your knotting scheme, with x_t = 1 \quad \forall t. The left plot is the prior predictive distribution; the right plot is the mean of that distribution (I made y_t normally distributed, with \sigma \sim \text{HalfNormal}(1)):


As you can see, the knots build in some periodicity based on the day of the week. I think your weighting scheme would just smooth this periodicity; it looks something like a weighted moving average.
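A minimal numpy sketch of this kind of prior predictive check. The knot scheme here (one knot per day of the week, with knot values following a Gaussian random walk) is my assumption standing in for your exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

T = 28            # four weeks of daily data
n_draws = 500     # number of prior predictive draws
x = np.ones(T)    # x_t = 1 for all t, as in the plots above

day = np.arange(T) % 7                             # day-of-week index for each t
steps = rng.normal(0.0, 1.0, (n_draws, 7))
beta = steps.cumsum(axis=1)                        # knot values as a Gaussian random walk
sigma = np.abs(rng.normal(0.0, 1.0, n_draws))      # sigma ~ HalfNormal(1)

mu = beta[:, day] * x                              # each t picks up its day's knot
y = rng.normal(mu, sigma[:, None])                 # prior predictive draws of y_t

# any single draw repeats with period 7, even though the mean over draws is smooth
print(y.mean(axis=0).round(2))
```

Plotting a few rows of `y` against `t` shows the built-in weekly periodicity directly.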

If you added a prior correlation matrix, you would be assuming that large/small values of parameters co-occur: for example, if \beta_1 is large then \beta_2 is likely to be large as well (or the reverse, under negative correlation).
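To make that concrete, here is a sketch (with a made-up correlation of 0.8) of what such a prior implies before seeing any data:

```python
import numpy as np

rng = np.random.default_rng(0)

# assumed prior correlation matrix for (beta_1, beta_2)
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])
betas = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=10_000)

# when beta_1 is large, beta_2 tends to be large too, purely by assumption
large_b1 = betas[:, 0] > 1.0
print(betas[large_b1, 1].mean())   # clearly positive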

But you do not need to build correlations into the priors for there to be correlations in the posterior. This can be seen easily by fitting a simple slope-intercept model, y_i = \alpha + \beta x_i. If \beta is large, \alpha must be small, because that’s how lines work: if the slope is steeper, the line has to cross the y-axis lower to still pass through the data. This correlation will be captured by the posterior regardless of whether you explicitly model it in the priors.
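You can see this without even running a sampler. With a flat prior and known noise scale, the posterior covariance of (\alpha, \beta) is proportional to (X^\top X)^{-1}, and its off-diagonal term is negative whenever the x_i are mostly positive:

```python
import numpy as np

rng = np.random.default_rng(0)

# positive predictor values, e.g. temperatures
x = rng.uniform(10, 30, size=100)
X = np.column_stack([np.ones_like(x), x])   # design matrix: intercept + slope

# flat prior, known sigma: posterior covariance of (alpha, beta) ∝ (X'X)^{-1}
post_cov = np.linalg.inv(X.T @ X)
corr = post_cov[0, 1] / np.sqrt(post_cov[0, 0] * post_cov[1, 1])
print(corr)   # strongly negative: a steeper slope forces a lower intercept
```

Independent priors on \alpha and \beta, a strongly correlated posterior — the data imposes the correlation on its own.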

Knowing nothing about your specific application, it’s harder to give more advice, but in general I am leery of putting time dynamics like seasonality into the parameters rather than into latent components of the model. Thinking about the effect of temperature on demand for ice cream, for example: why would I expect the strength of that connection to vary over time? I more just expect there to be a seasonal pattern in ice cream sales, which is captured by changes in the temperature.

My point is you could probably just include day-of-week effects in whatever model you’re considering and get a more interpretable result.
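As a sketch of what I mean (on made-up data, with a hypothetical temperature regressor): a fixed coefficient for the driver plus one dummy per weekday recovers both cleanly, and the day-of-week effects are directly interpretable:

```python
import numpy as np

rng = np.random.default_rng(0)

T = 140
day = np.arange(T) % 7
temp = rng.normal(20, 5, T)                       # hypothetical regressor

# simulated data: one fixed temperature effect plus additive day-of-week offsets
true_dow = np.array([0.0, 1.0, 2.0, 3.0, 2.0, 5.0, 6.0])
y = 0.5 * temp + true_dow[day] + rng.normal(0, 0.5, T)

# design: temperature column plus a full set of weekday dummies
# (the dummies sum to one, so they absorb the intercept)
X = np.column_stack([temp, (day[:, None] == np.arange(7)).astype(float)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef.round(2))   # first entry recovers the temperature effect (~0.5)
```

The same structure drops straight into a Bayesian model: priors on the single temperature coefficient and on the seven day-of-week effects, no time-varying coefficients needed.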
