"Local level - Nile" State Space Model (Kalman Filter) in PyMC3

I have just finished reading Time Series Analysis by State Space Methods: Second Edition by James Durbin and Siem Jan Koopman and would like to implement some of the examples in PyMC3.

I’ve just done that for the local level model and compared it against the example given by Chad Fulton in Estimating time series models by state space methods in Python: Statsmodels. Please look at my example notebook for details of the implementation.

First of all, I wanted to provide a code example as a starting point in case somebody else would like to do something similar.

Second, I’d like your feedback on whether this is the way to go or whether there are other/better ways to do it. I noticed that on several runs with more tuning steps the r_hat of the estimated parameters got very high, sometimes above 1.5. That is sadly not the case in the one run I saved, but I ran the notebook several times and often the r_hat values would be high, plus there were many divergences. Therefore I suspect there’s something not quite right with my implementation.

As I’d like to implement several other models from Time Series Analysis by State Space Methods: Second Edition it would be good to have a solid starting point.

P.S.: I deliberately did not use something like pm.GaussianRandomWalk(), because this implementation is more “transparent” (you see in detail what’s going on) and you can change it to something non-Gaussian if needed. I’d like to keep that level of transparency if possible.
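For readers unfamiliar with the model, here is a minimal NumPy sketch (not the notebook’s code; the variances and series length are made-up illustration values) of the local level model, with the Gaussian random walk state equation written out explicitly rather than hidden behind pm.GaussianRandomWalk():

```python
import numpy as np

# Local level model (Durbin & Koopman, Ch. 2):
#   y_t      = mu_t + eps_t,  eps_t ~ N(0, sigma_eps^2)   (observation)
#   mu_{t+1} = mu_t + eta_t,  eta_t ~ N(0, sigma_eta^2)   (state)
# The state equation is exactly a Gaussian random walk, spelled out here.
rng = np.random.default_rng(0)
n, sigma_eps, sigma_eta = 100, 1.0, 0.5  # illustration values only

mu = np.empty(n)
mu[0] = 0.0
for t in range(n - 1):
    mu[t + 1] = mu[t] + rng.normal(0.0, sigma_eta)  # state transition
y = mu + rng.normal(0.0, sigma_eps, size=n)         # noisy observations
```

Writing the transition out like this is what makes it easy to swap in a non-Gaussian innovation later.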


I work on SSMs a lot, mostly using https://www.tensorflow.org/probability/api_docs/python/tfp/sts (which in my opinion also has the best, most transparent design).
From a quick look at your notebook, I think using theano.scan is definitely the way to go. However, the Kalman filtering step doesn’t seem right, as there are no filtering and predict steps (Kalman gain, etc.).
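For reference, the filtering (update) and predict steps being referred to might look like the following NumPy sketch of a scalar Kalman filter for the local level model (the function name and diffuse-initialization defaults are my own, loosely following Durbin & Koopman’s notation):

```python
import numpy as np

def local_level_kalman(y, sigma_eps2, sigma_eta2, a1=0.0, p1=1e7):
    """Scalar Kalman filter for the local level model.

    a, p are the one-step-ahead state mean/variance; returns the
    filtered state means. p1=1e7 approximates a diffuse prior.
    """
    a, p = a1, p1
    filtered = []
    for yt in y:
        # --- update (measurement) step ---
        v = yt - a              # prediction error
        f = p + sigma_eps2      # prediction error variance
        k = p / f               # Kalman gain
        a_filt = a + k * v      # filtered state mean
        p_filt = p * (1.0 - k)  # filtered state variance
        filtered.append(a_filt)
        # --- predict (time) step: random walk, mu_{t+1} = mu_t + eta_t ---
        a = a_filt
        p = p_filt + sigma_eta2
    return np.array(filtered)
```

With the diffuse initial variance, the first update essentially adopts the first observation, after which the gain settles to its steady-state value.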


Oh, also, @brandonwillard has this extremely nice example: https://brandonwillard.github.io/dynamic-linear-models-in-theano.html


I just started collecting documentation on how to get started with TensorFlow Probability:

Would you have any other examples I could use to start learning?

Regarding correctness: I reproduced the results from Chad Fulton/statsmodels, and the results seem to match quite closely?

Thank you very much for that collection of examples! I’ll work through them.

The results do match; I am trying to point out that if you want to implement a Kalman filter that takes advantage of Gaussian conjugacy for updating the parameters, there are a few more steps you need to implement in the update (see @brandonwillard’s post for more detail).
Also note that @brandonwillard’s post uses symbolic-pymc, which has the huge advantage that it can isolate the state update and the state observation into separate theano.scan ops and flexibly reuse the output tensors for different filtering. In regular pymc3 you will need to merge these steps into one and do a one-pass update (similar to what TFP currently does: https://github.com/tensorflow/probability/blob/621678fff3624f7c347efb5b1d78f2f553eee806/tensorflow_probability/python/distributions/linear_gaussian_ssm.py#L1529-L1613)
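A one-pass update of the kind described above can be sketched in plain NumPy (the function name and diffuse-initialization defaults are my own assumptions): each loop iteration does the update and predict steps together and accumulates the Gaussian log-likelihood from the prediction errors (the prediction error decomposition), which is the shape a single theano.scan step would take in regular pymc3:

```python
import numpy as np

def local_level_loglike(y, sigma_eps2, sigma_eta2, a1=0.0, p1=1e7):
    """One-pass Kalman filter for the local level model: each iteration
    merges the update and predict steps and accumulates the Gaussian
    log-likelihood via the prediction error decomposition."""
    a, p, ll = a1, p1, 0.0
    for yt in y:
        v = yt - a                        # prediction error
        f = p + sigma_eps2                # prediction error variance
        ll += -0.5 * (np.log(2.0 * np.pi * f) + v * v / f)
        k = p / f                         # Kalman gain
        a = a + k * v                     # update, then predict in one pass
        p = p * (1.0 - k) + sigma_eta2
    return ll
```

Maximizing this log-likelihood over (sigma_eps2, sigma_eta2), or exposing it to a sampler as a custom likelihood, is the usual way this one-pass recursion gets used.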