Online update as new data comes in

Hello @Chandan_Gupta, I am working on a Bayesian network that utilizes new observations as they come in. The problem @cluhmann describes is that you initialize the model with priors, and as they wrote:

The model context never “knows” about the posterior samples.

Let’s say that your system has 2 measurements available in a series.

First measurement

You condition the model on the measurement via the observed= keyword in the definition of the probability distribution, e.g.
pm.Normal(..., observed=np.array([your_observation_data])). You get the posterior from pm.sample with the return_inferencedata=True flag, but now you need to feed that posterior into your next prior.

Second measurement

To utilize the previous measurement, you need to use the posterior as a new prior, but the with model context only contains the original priors. So to use the previous measurement you can re-initialize the model with new parameters. I solve this problem by placing a pm.Dirichlet() distribution over the weights of mixture distributions, e.g. pm.NormalMixture(), which takes a weight vector as a parameter. The Dirichlet distribution takes an a= keyword, the concentration vector alpha, which forms the probabilities on the K-1 simplex (a triangle generalized to higher dimensions). The alpha vector can be roughly initialized directly with the probabilities, or it can be fit with a hierarchical model for better precision. I have shown this here.

Although this reply is somewhat specific, I hope the general idea of re-initializing the distributions came through. Many seemingly difficult problems can be solved with hierarchical modelling, where you model the hyperpriors of the distributions instead of the distributions directly.