Scaling AR(1), conditional inference?

I’d like to do inference on a fairly long AR(1) process. This means there are lots of latent variables (the innovations of the process), and because they are correlated, sampling starts to slow down.

Is it possible to do inference on the other parameters and effectively ignore the posterior of the AR(1) innovations? Is that what is being discussed under “Conditional Inference” here in the Edward docs?

Would this or any other technique make inference on a larger AR(1) possible, e.g. with thousands of innovations, while only looking at the posterior of the parameters?
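For context, part of what I’m hoping to exploit is that for a Gaussian AR(1), the innovations can be integrated out analytically: conditional on the previous value, each observation is just Normal(rho * y[t-1], sigma), so the likelihood depends only on the parameters. A rough NumPy sketch of that marginal likelihood (conditioning on the first observation, and hypothetical variable names):

```python
import numpy as np

def ar1_loglik(y, rho, sigma):
    """Log-likelihood of a Gaussian AR(1), innovations integrated out.

    Conditional on y[t-1], y[t] ~ Normal(rho * y[t-1], sigma), so no
    latent innovation variables appear; the first observation is
    conditioned on rather than modeled.
    """
    resid = y[1:] - rho * y[:-1]          # one-step prediction errors
    n = resid.size
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum(resid**2) / sigma**2)

# simulate a long AR(1) series
rng = np.random.default_rng(0)
rho_true, sigma_true, T = 0.8, 1.0, 5000
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = rho_true * y[t - 1] + sigma_true * rng.normal()

# the likelihood now depends only on (rho, sigma), not on 5000 innovations
print(ar1_loglik(y, 0.8, 1.0) > ar1_loglik(y, 0.0, 1.0))  # True
```

So in principle the parameter posterior shouldn’t need the innovations at all, and I’m wondering whether the libraries can take advantage of that.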

Hi @davidia, what do you mean by conditional inference? Like Gibbs sampling?
I don't think full conditional inference is easy to do in PyMC3 (and in Edward it does not always work well either; see the relevant docstring in Edward for more information).

There are many ways to speed up your model. You can start with a small model with few innovations, profile your model at each step as you increase the model complexity, and try to speed up the slow parts via reparameterization or approximation. It would be helpful if you could share some code.
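To illustrate the reparameterization idea: for an AR(1), a common trick is the non-centered parameterization, where the latent variables are i.i.d. standard normals and the correlated series is built as a deterministic transform of them, which is often much friendlier to the sampler. A plain-NumPy sketch of the transform (function name is just for illustration):

```python
import numpy as np

def ar1_noncentered(eps, rho, sigma):
    """Build an AR(1) series from i.i.d. standard-normal draws.

    Non-centered parameterization: the latent variables `eps` are
    a-priori independent N(0, 1); the correlated series x is a
    deterministic function of them, so the sampler only ever sees
    uncorrelated latents.
    """
    x = np.empty_like(eps)
    x[0] = sigma * eps[0]
    for t in range(1, eps.size):
        x[t] = rho * x[t - 1] + sigma * eps[t]
    return x

rng = np.random.default_rng(1)
eps = rng.normal(size=2000)
x = ar1_noncentered(eps, rho=0.9, sigma=0.5)

# x carries the autocorrelation, eps does not
print(np.corrcoef(x[:-1], x[1:])[0, 1])
```

In PyMC3 you would express the same transform with theano ops (or use the built-in timeseries distributions) so that the gradients flow through it, but the structure is the same.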