Modelling time-varying coefficients for forecasting

I am trying to model a regression problem with time-varying coefficients, where I want to forecast these coefficients over a horizon.

Consider, e.g., the model y_t = c_{1,t} x_{1,t} + c_{2,t} x_{2,t}, where c_{1,t} and c_{2,t} are time-varying coefficients.
We fit the model to historical data.
Now I want to extrapolate/forecast these time-varying coefficients over a horizon, so that I can construct a stochastic optimization problem with the goal of maximizing y over the horizon by choosing the x's.

Using e.g. Gaussian random walks as distributions for the time-varying coefficients would not cut it, as I need time-varying coefficients that can extrapolate reliably… Are there any examples you could point me to that tackle this problem, or do I have to use e.g. spline regression and place the knots cleverly in this scenario?
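
For concreteness, here is a minimal sketch (placeholder data and made-up variable names) of the kind of GRW model I mean:

```python
# Rough sketch of the GRW formulation I have in mind (placeholder data).
import numpy as np
import pymc3 as pm

T = 200
x1, x2 = np.random.randn(T), np.random.randn(T)   # regressors (placeholders)
y = np.random.randn(T)                            # observed target (placeholder)

with pm.Model() as grw_model:
    sigma_c = pm.HalfNormal("sigma_c", sigma=0.1)
    # time-varying coefficients as Gaussian random walks
    c1 = pm.GaussianRandomWalk("c1", sigma=sigma_c, shape=T)
    c2 = pm.GaussianRandomWalk("c2", sigma=sigma_c, shape=T)
    sigma_y = pm.HalfNormal("sigma_y", sigma=1.0)
    pm.Normal("y_obs", mu=c1 * x1 + c2 * x2, sigma=sigma_y, observed=y)
    trace = pm.sample()

# The issue: forecasting c1/c2 beyond T just keeps the last value with growing
# uncertainty, which is not informative enough for the optimization step.
```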

Hi!

What does “extrapolate reliably” mean to you in this case? Do you have specific time series dynamics (seasonality, trend, ARMA dynamics, etc.) that you want to build into the coefficients?

In general, if you can write down an equation for the time series process describing the evolution of the coefficients, you can implement it directly in your model using latent variables.
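
For instance, if you decided a coefficient should follow an AR(1) process, a minimal sketch (placeholder data, assumed parameter values) might look like:

```python
# Sketch: an AR(1) latent process as a time-varying regression coefficient.
import numpy as np
import pymc3 as pm

T = 100
x1 = np.random.randn(T)
y = np.random.randn(T)            # placeholder data

with pm.Model() as ar_coef_model:
    rho = pm.Uniform("rho", -1, 1)                   # persistence of the coefficient
    c1 = pm.AR1("c1", k=rho, tau_e=10.0, shape=T)    # latent coefficient path
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    pm.Normal("y_obs", mu=c1 * x1, sigma=sigma, observed=y)
    trace = pm.sample()

# Forecasting the coefficient is then just iterating c1[t+1] = rho * c1[t] + noise.
```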


I aim to capture the seasonality in the regression coefficients.

To give you a bit of context for the problem:

Consider the resource allocation problem where we want to allocate a portion of the total resource B to two investment options every day over a finite horizon H (30 days in our example).

Now consider the following algorithm, which aims to solve this (a rough code sketch follows the list):

  1. Fit our reward function to the historical data using Bayes' rule.
  2. Sample a set of parameter values.
  3. Use those parameter values as ground truth and maximize the reward function over the whole horizon (30 days), subject to the budget constraints.
  4. Set the allocation for the current timestep.
  5. Observe the reward for the current timestep.
  6. Add the allocation and reward to our dataset.
  7. Repeat from step 1.
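
To make the loop concrete, here is a rough, self-contained sketch using a deliberately simplified conjugate-Gaussian reward model and made-up numbers (the variable names, the per-day cap, and the linear-programming step are illustrative assumptions, not the real model):

```python
# Hypothetical sketch of the allocation loop (simplified conjugate reward model).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
H, B = 30, 100.0                                     # horizon (days), total budget
true_c = np.array([1.5, 0.8])                        # "market" used only to simulate
X_hist = rng.uniform(0, 5, size=(60, 2))             # past allocations
y_hist = X_hist @ true_c + rng.normal(0, 0.5, 60)    # past rewards

for t in range(H):
    # 1. Bayesian fit of a linear reward y = c1*x1 + c2*x2 (Gaussian prior N(0, 10 I),
    #    known noise sd 0.5 -- a stand-in for the real PyMC3 model).
    noise_var, prior_var = 0.5 ** 2, 10.0
    precision = np.eye(2) / prior_var + X_hist.T @ X_hist / noise_var
    cov = np.linalg.inv(precision)
    mean = cov @ (X_hist.T @ y_hist) / noise_var

    # 2. Sample one plausible coefficient vector (Thompson-sampling style).
    c_draw = rng.multivariate_normal(mean, cov)

    # 3. Maximize total reward over the remaining horizon s.t. the remaining budget
    #    and a per-day cap; with a linear reward this is an LP (linprog minimizes).
    n = 2 * (H - t)
    c_obj = -np.tile(c_draw, H - t)
    plan = linprog(c_obj, A_ub=np.ones((1, n)), b_ub=[B], bounds=[(0, 10.0)] * n).x

    # 4.-6. Implement only today's allocation, observe the reward, grow the dataset.
    x_today = plan[:2]
    reward = x_today @ true_c + rng.normal(0, 0.5)
    X_hist = np.vstack([X_hist, x_today])
    y_hist = np.append(y_hist, reward)
    B -= x_today.sum()                               # 7. repeat with updated data/budget
```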

The reward function is the focus of this discussion.
It could be thought of as Y_t = \text{intercept} + c_1 x_{1,t} + c_2 x_{2,t} + c_3 \cdot \text{trend}_t,
where the trend aims to capture the underlying market trend, and x_{1,t} and x_{2,t} are our decision variables.
Notice that we optimize over a horizon of 30 days, i.e. over an entire month at a time (although we only implement the decision at the current timestep). Therefore I would really like to capture seasonality effects w.r.t. day of month and day of week in the coefficients of our decision variables (c_1, c_2).
So what I am trying to get at is finding examples in PyMC3 where people used time-varying coefficients to capture seasonality effects such as day of month and day of week.
Another approach would be to use spline regression w.r.t. day of month, for example. Such an example would also be interesting to look at.
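
For the spline approach, something along these lines is what I imagine (a hypothetical sketch with placeholder data, using a patsy B-spline basis over day of month):

```python
# Hypothetical sketch: day-of-month B-spline basis for the coefficient on x1.
import numpy as np
import pymc3 as pm
from patsy import dmatrix

T = 120
day_of_month = (np.arange(T) % 30) + 1               # placeholder calendar feature
x1 = np.random.randn(T)
y = np.random.randn(T)                               # placeholder observations

# B-spline design matrix over day of month (df is a guess to be tuned)
B_dom = np.asarray(dmatrix("bs(d, df=6, include_intercept=True) - 1",
                           {"d": day_of_month}))

with pm.Model() as spline_model:
    w = pm.Normal("w", 0.0, 1.0, shape=B_dom.shape[1])    # spline weights
    c1 = pm.Deterministic("c1", pm.math.dot(B_dom, w))    # day-of-month varying coefficient
    intercept = pm.Normal("intercept", 0.0, 1.0)
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    pm.Normal("y_obs", mu=intercept + c1 * x1, sigma=sigma, observed=y)
    trace = pm.sample()

# Because c1 depends only on day of month, it can be evaluated for any future date.
```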

Interesting problem!

In the example gallery, there is an example of spline regression. As for time-varying regression, I’m not aware of an example that goes above and beyond the GRW. @twiecki gives an example of using a GRW to model a latent variable in this video, as well as some additional extensions to the time-varying process. It’s not precisely what you want, but I think what you end up doing will be very close to what he does there.

For example, you could model the latent coefficient as a parameterized function of Fourier features, like those used in the Prophet-inspired TimeSeers package, replacing the constant coefficient c_i with c_{i,t} = \sum_{k=1}^K \left[ a_{i,k} \cos \left( \frac{2 \pi k t}{P} \right) + b_{i,k} \sin \left( \frac{2 \pi k t}{P} \right) \right], which would inject a seasonal fluctuation into the parameter values.
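
As a rough sketch (hypothetical data and variable names, assuming a daily series with a weekly period P = 7):

```python
# Sketch: seasonal coefficient c1 built from Fourier features (weekly period P = 7).
import numpy as np
import pymc3 as pm

T, P, K = 120, 7.0, 3                  # series length, period, number of harmonics
t = np.arange(T)
x1 = np.random.randn(T)
y = np.random.randn(T)                 # placeholder data

# Fourier feature matrix of shape (T, 2K): cos/sin pairs for k = 1..K
F = np.column_stack([f(2 * np.pi * k * t / P)
                     for k in range(1, K + 1)
                     for f in (np.cos, np.sin)])

with pm.Model() as fourier_model:
    ab = pm.Normal("ab", 0.0, 1.0, shape=2 * K)            # the a_{1,k} and b_{1,k}
    c1 = pm.Deterministic("c1", pm.math.dot(F, ab))        # seasonal coefficient path
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    pm.Normal("y_obs", mu=c1 * x1, sigma=sigma, observed=y)
    trace = pm.sample()

# Forecasting is deterministic in t: build F for future dates and reuse the ab draws.
# A day-of-month effect can be added the same way with a second feature set, e.g. P = 30.
```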

In general, my strategy would be to figure out what kind of time series dynamics you want for the coefficients, then write those dynamics for c_{i,t} directly into your equation.