Multiple observations sharing priors and likelihood model


#1

Hi,

I would like to create a model in PyMC3 to sample the posterior using multiple observations which share the same priors and likelihood model. What would be the best way of doing this?

To start, I should provide some background for this work. The idea is that we have made measurements with two detectors (say Det1 and Det2), each of which provides a spectrum, which is basically a vector of 90 numbers. The linear model that relates the measured spectrum (M) to the response matrices (R) and the incident spectra (P, on which we place priors) is as follows:

M_{det1} = R_{det1,P1} * P1 + R_{det1,P2} * P2
M_{det2} = R_{det2,P1} * P1 + R_{det2,P2} * P2

I have the measurements M_{det1} and M_{det2}, as well as the response matrices R_{det1,P1}, R_{det1,P2}, R_{det2,P1}, and R_{det2,P2}.

What would be the best way to specify this model? Something like Model 1 in this post (Gaussian mixture of regressions)?

Thanks for your help.


#2

You should start with a linear regression with two sets of response matrices, e.g.:
P1 ~ some prior
P2 ~ some prior
M_{det1} ~ Normal(R_{det1,P1} * P1 + R_{det1,P2} * P2, error_1)
M_{det2} ~ Normal(R_{det2,P1} * P1 + R_{det2,P2} * P2, error_2)

Then you can introduce more complex structure, such as the correlation between M_{det1} and M_{det2}:
cov ~ LKJ
mu1 = R_{det1,P1} * P1 + R_{det1,P2} * P2
mu2 = R_{det2,P1} * P1 + R_{det2,P2} * P2
[M_{det1}, M_{det2}] ~ MvNormal([mu1, mu2], cov)


#3

As a follow-up, I’ve managed to successfully implement the idea using the following code:

# Define the prior probability densities
self.P1 = pm.Uniform('P1', lower=lbP1, upper=ubP1, shape=shapeP1)
self.P2 = pm.Uniform('P2', lower=lbP2, upper=ubP2, shape=shapeP2)

# Define the linear models (each term uses its own response matrix,
# matching M_det = R_{det,P1} * P1 + R_{det,P2} * P2)
self.M_det1 = theano.tensor.dot(theano.shared(self.R_det1_P1.T), self.P1) \
            + theano.tensor.dot(theano.shared(self.R_det1_P2.T), self.P2)

self.M_det2 = theano.tensor.dot(theano.shared(self.R_det2_P1.T), self.P1) \
            + theano.tensor.dot(theano.shared(self.R_det2_P2.T), self.P2)

# Define the likelihoods
self.L_det1 = pm.Poisson('Likelihood_Det1', mu=self.M_det1,
                         observed=theano.shared(self.Data_Det1, borrow=False),
                         shape=(self.Data_Det1.size, 1))

self.L_det2 = pm.Poisson('Likelihood_Det2', mu=self.M_det2,
                         observed=theano.shared(self.Data_Det2, borrow=False),
                         shape=(self.Data_Det2.size, 1))

This seems to work pretty well so far. Comments?


#4

:+1:

My comment would be: also try modeling P1 and P2 as normal distributions, and use tt.exp(M_det1) and tt.exp(M_det2) as the mu of the Poisson. This is usually the more standard approach (i.e., a generalized linear model with a log link).