Assume z = \{z_t \in \mathbb{R} : t = 1, 2, \ldots\} and x = \{x_t \in \mathbb{R}^p : t = 1, 2, \ldots\}. Let D_t = \{(z_1, x_1), (z_2, x_2), \ldots, (z_t, x_t)\}.
I want to model z_t given x_t. The likelihood is normal: z_t | w, \beta, x_t \sim N(x_t^T w, \beta^{-1}). The conjugate prior is normal-gamma: w, \beta \sim N(w | m_0, \beta^{-1} S_0) \, Gam(\beta | a_0, b_0). The initial values are m_0 = 0, S_0 = I (the identity matrix), a_0 = 1, and b_0 = 100.
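Written out, the log-density of this prior is just the sum of the two factors' log-densities. A minimal numpy/scipy sketch (assuming the shape/rate parameterization of the Gamma; for a PyMC DensityDist these calls would have to be swapped for their theano.tensor equivalents):

import numpy as np
from scipy.special import gammaln

def normal_gamma_logp(w, beta, m0, S0, a0, b0):
    # log N(w | m0, S0 / beta) + log Gam(beta | a0, b0), with b0 a rate
    p = len(m0)
    S0_inv = np.linalg.inv(S0)
    diff = w - m0
    log_normal = (-0.5 * p * np.log(2 * np.pi) + 0.5 * p * np.log(beta)
                  - 0.5 * np.linalg.slogdet(S0)[1]
                  - 0.5 * beta * diff @ S0_inv @ diff)
    log_gamma = (a0 * np.log(b0) - gammaln(a0)
                 + (a0 - 1) * np.log(beta) - b0 * beta)
    return log_normal + log_gamma

m0, S0, a0, b0 = np.zeros(2), np.eye(2), 1.0, 100.0   # the initial values above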
Now suppose I get a new point (z_{t+1}, x_{t+1}).
Because we chose a conjugate prior, the posterior is also normal-gamma: w, \beta | D_t, z_{t+1}, x_{t+1} \sim N(w | m_{t+1}, \beta^{-1} S_{t+1}) \, Gam(\beta | a_{t+1}, b_{t+1}), where m_{t+1}, S_{t+1}, a_{t+1}, and b_{t+1} are defined recursively (the exact update equations are not essential to my question, but I sketch them below for reference).
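My understanding is that these are just the standard conjugate updates for Bayesian linear regression with unknown precision; a sketch of one update step (assuming I have them right):

import numpy as np

def update_posterior(m, S, a, b, x, z):
    # fold one new observation (z, x) into the normal-gamma posterior
    S_inv = np.linalg.inv(S)
    S_new_inv = S_inv + np.outer(x, x)
    S_new = np.linalg.inv(S_new_inv)
    m_new = S_new @ (S_inv @ m + z * x)
    a_new = a + 0.5
    b_new = b + 0.5 * (z ** 2 + m @ S_inv @ m - m_new @ S_new_inv @ m_new)
    return m_new, S_new, a_new, b_new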
How would I go about coding up the likelihood and prior? I know this Bayesian updating process must repeat for every t. Suppose that at this particular t, x_t = [5, 2] (so p = 2) and z_t = 7, so for the observations, zts[-1] = 7. My attempt so far:
import numpy as np
import pymc3 as pm

xt = np.array([5.0, 2.0])   # x_t for this step (p = 2)

with pm.Model() as model:
    # conjugate prior: w, beta ~ NormalGamma(m_0, S_0, a_0, b_0)
    wbeta = pm.DensityDist('wbeta', ?some function for Normal Gamma?, shape=2, testval=[1, 1])
    w = wbeta[0]
    beta = wbeta[1]
    # likelihood: beta is a precision, so the standard deviation is 1/sqrt(beta)
    obs = pm.Normal("obs", mu=pm.math.dot(xt, w), sd=1 / pm.math.sqrt(beta), observed=zts)
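One way I imagine this could be written instead is to split the normal-gamma prior into its two factors, beta ~ Gam(a_0, b_0) and w | beta ~ N(m_0, \beta^{-1} S_0), rather than using a single DensityDist. A sketch of that idea in PyMC3 (not sure it is the canonical way):

import numpy as np
import pymc3 as pm

p = 2
m0, S0 = np.zeros(p), np.eye(p)     # prior mean and scale matrix for w
a0, b0 = 1.0, 100.0                 # Gamma shape and rate for the precision beta
xt = np.array([5.0, 2.0])           # x_t for this step
zts = np.array([7.0])               # the observed z_t for this step

with pm.Model() as model:
    # normal-gamma prior, written as its two factors
    beta = pm.Gamma('beta', alpha=a0, beta=b0)
    w = pm.MvNormal('w', mu=m0, tau=beta * np.linalg.inv(S0), shape=p)
    # likelihood: z_t | w, beta ~ N(x_t^T w, 1/beta)
    obs = pm.Normal('obs', mu=pm.math.dot(xt, w),
                    sd=1.0 / pm.math.sqrt(beta), observed=zts)

Sampling this with pm.sample() should (approximately) agree with the analytic conjugate updates, which would give a sanity check. But is this the right way to set up the prior, and how should the recursive updating over t fit in?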