Hi!
The blog post you linked is really great, but it seems to mix up several ideas. There's no Kalman filtering at all in the final model. There are Kalman filter implementations in the pymc-experimental statespace package here, along with an API for SARIMAX models. But you can't do stochastic volatility models with a KF, because the innovations aren't normally distributed (as I think you know).
It also talks about SIR, but doesn't ultimately do that either. PyMC does have support for importance sampling via pm.sample_smc, and some users have been doing work on online sequential importance sampling, but there's nothing official for that. If your model implies a logp graph that is differentiable, it's recommended to use NUTS instead (which is what the notebook ultimately does).
So the final PyMC model that is implemented just computes the one-step prediction errors under the model in the logp function, which is a great way to do it. Another approach, though, would be to use a scan-based model together with pm.CustomDist. This is how I'd recommend you proceed. You can use this ARMA notebook and this GARCH notebook* as references for the implementation. Basically, you just need to write down the one-step generative process in a function, write another function that executes the scan, then give that to pm.CustomDist.
*The GARCH notebook is slightly out of date, so follow the ARMA one more closely to get the API right, and use the GARCH one just as a reference for how to set up the inner function and scan.