Hierarchical Model with Time-Varying Parameters

Suppose we take the radon example but collect annual measurements. Let’s also ignore the floor part of the problem and just think about this model:

radon_{i,c,t} = \mu_t + \alpha_{c,t} + \epsilon_{i,c,t}

so that \mu_t reflects the changing mean across counties over time, and \alpha_{c,t} captures changing county offsets. We want to encode our prior assumption that the \alpha_{c,t} won’t change much year-to-year. What is the best way to do this?

I’ve thought of a couple options, but neither is perfect.

  1. Model \mu_t and \alpha_{c,t} as Gaussian random walks. We can constrain \alpha_{c,0} to be centered around zero using the init argument (see the sketch after this list). The issue here: is setting drift=0 enough to keep the mean of \alpha_{c,t} from drifting over time? My intuition is that the mean of \alpha_{c,t} could drift in one direction while \mu_t drifts in the opposite direction, leaving their sum, and hence the likelihood, unchanged.

  2. Model \alpha_{c,t} as an MvNormal. This ensures that each year will have mean zero, but using the covariance matrix to encode our assumption of minimal year-to-year change seems awkward.
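Here's a minimal sketch of option 1, assuming PyMC v5 (where the init argument is called init_dist); the sizes, scale values, and the commented-out index arrays are hypothetical placeholders, not recommendations:

```python
import pymc as pm

n_counties, n_years = 85, 10  # hypothetical sizes

with pm.Model() as grw_model:
    # Shared across-county mean, evolving year to year.
    mu = pm.GaussianRandomWalk(
        "mu",
        sigma=0.5,
        init_dist=pm.Normal.dist(0.0, 1.0),
        steps=n_years - 1,
    )
    # One walk per county; init_dist centers alpha_{c,0} near zero,
    # and the small sigma encodes "offsets change little year to year".
    alpha = pm.GaussianRandomWalk(
        "alpha",
        sigma=0.1,
        init_dist=pm.Normal.dist(0.0, 0.5),
        shape=(n_counties, n_years),
    )
    sigma_obs = pm.HalfNormal("sigma_obs", 1.0)
    # county_idx, year_idx, and log_radon would come from the data:
    # radon = pm.Normal(
    #     "radon",
    #     mu=mu[year_idx] + alpha[county_idx, year_idx],
    #     sigma=sigma_obs,
    #     observed=log_radon,
    # )
```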

A related question: if this were a single-year model with no time variation, we would use a non-centered parameterization for \alpha_{c}. Is there a corresponding non-centered parameterization for the time-varying \alpha_{c,t}?
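For concreteness, here's my guess at what that might look like (a sketch, assuming PyMC v5 and pytensor; sizes and scales are placeholders): non-center the walk's innovations rather than its values, then cumulatively sum them.

```python
import pymc as pm
import pytensor.tensor as pt

n_counties, n_years = 85, 10  # hypothetical sizes

with pm.Model() as noncentered:
    # Single-year non-centered version for comparison:
    # alpha_c = sigma_alpha * z_c, with z_c ~ Normal(0, 1).
    sigma_alpha = pm.HalfNormal("sigma_alpha", 0.1)
    # Time-varying analogue: standard-normal innovations per county-year...
    z = pm.Normal("z", 0.0, 1.0, shape=(n_counties, n_years))
    # ...scaled and cumulatively summed along time to form the walk.
    # The first column doubles as the (scaled) initial offset alpha_{c,0}.
    alpha = pm.Deterministic("alpha", pt.cumsum(sigma_alpha * z, axis=1))
```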

My next step is to generate some simulated data and run some tests, but I thought I’d ask first to see how other folks have solved this problem.


I’ve used Gaussian random walks for both coefficients and intercepts before, and they work pretty well, IMO. You could also represent the time-varying effects as splines.

With regard to the drift, you are right to recognize that drift=0 isn’t enough to prevent anticorrelated deviations in the \mu_t and the \alpha_{c,t} terms. What about placing priors on the scale of the GRW jumps that are smaller for the \alpha and larger for the \mu?
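Something like this sketch, say, assuming PyMC v5 (the HalfNormal scales and sizes are purely illustrative):

```python
import pymc as pm

n_counties, n_years = 85, 10  # hypothetical sizes

with pm.Model() as scaled_jumps:
    # Looser prior on the shared mean's jump scale...
    sigma_mu = pm.HalfNormal("sigma_mu", 1.0)
    # ...and a tighter one on the county offsets' jump scale, so
    # year-to-year variation is absorbed by mu rather than alpha.
    sigma_alpha = pm.HalfNormal("sigma_alpha", 0.1)

    mu = pm.GaussianRandomWalk(
        "mu",
        sigma=sigma_mu,
        init_dist=pm.Normal.dist(0.0, 1.0),
        steps=n_years - 1,
    )
    alpha = pm.GaussianRandomWalk(
        "alpha",
        sigma=sigma_alpha,
        init_dist=pm.Normal.dist(0.0, 0.5),
        shape=(n_counties, n_years),
    )
```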
