# Coherent / dependent Normal distributions

Hello,
I have implemented a model where I have a parameter alpha, implemented as \alpha \sim N(0,2). I want an alpha for every feature of model k and for every possible time step t. Because of that alpha has the shape (k \times t).
Just implemented like this the posterior values of \alpha[k_0][t_0] and \alpha[k_0][t_1] are independent of another and the value can “jump”. This behavior can be seen in the picture, two big jumps in the beginning and one in the end.

The idea to stop that is to model alpha as \alpha_0 \sim N(0, 2) and \alpha_t \sim N(\alpha_{t-1}, 2). I hope that this would remove the jumps and lead to a smoother curve. However, I don't know how to actually implement this in PyMC. Can you help me on this point?

If you have another idea for how to eliminate the jumps, I would happily take that too.

If there are any unclear things, please just ask :).

And finally, if you have read this far, thanks for your time and your help!

What value is jumping? It might help to include a small example model illustrating the behavior you are describing.

Thanks for your reply. I have a feature k whose importance differs depending on the time step t. I want to model this varying importance with an \alpha \sim N for every time step. An example of this can be seen in the picture I provided: the x values are the time steps from 0 to 99, and the y values are the fitted posterior values for each alpha. What I don't like about this picture are the jumps, i.e. the really large changes in the alpha value. From t=0 to t=1 the importance of this feature goes from -200 to 150, which just doesn't make any sense to me. If the feature is not important at t=0, it can't / shouldn't be that important at t=1. Therefore, I would like to make the different alphas dependent on each other. My first idea was to model alpha as \alpha_t \sim N(\alpha_{t-1}, 2), but I am not sure how, or whether, that is possible.

I hope that makes my problem more clear.
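To see why this dependence removes the jumps: \alpha_t \sim N(\alpha_{t-1}, 2) is a Gaussian random walk, i.e. a cumulative sum of independent N(0, 2) innovations on top of the starting value. A minimal numpy sketch of that construction (the sizes `K` and `T` are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
K, T = 3, 100  # illustrative: number of features and time steps

# starting values alpha_0 ~ N(0, 2) and innovations alpha_t - alpha_{t-1} ~ N(0, 2)
alpha0 = rng.normal(0.0, 2.0, size=(K, 1))
innovations = rng.normal(0.0, 2.0, size=(K, T - 1))

# the walk itself: alpha_t = alpha_0 + sum of the first t innovations
alpha = np.concatenate([alpha0, alpha0 + np.cumsum(innovations, axis=1)], axis=1)

# consecutive values differ exactly by one innovation, so jumps of size ~350
# (like -200 to 150) are essentially impossible under this prior
diffs = np.diff(alpha, axis=1)
```

This cumulative-sum form is also the non-centered parameterization of the random walk, which often samples better than defining each \alpha_t conditionally.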