Joint priors when implementing conditional means prior on a log-odds model (Binomial)

I have the following basic model, where theta (\theta) is a measured value that may predict whether someone reports tinnitus, and has_tinnitus is a binary array indicating whether each subject has tinnitus:

import pymc3 as pm

with pm.Model() as tinnitus_model:
    d0 = pm.Normal('delta0', 0, 10)
    d1 = pm.Normal('delta1', 0, 10)
    pm.Bernoulli('T', logit_p=d0 + d1 * theta, observed=has_tinnitus)
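
Here logit_p means the probability of tinnitus at a measured value \theta_i is the inverse logit of the linear predictor:

F(\theta_i, \delta) = \mathrm{logit}^{-1}(\delta_0 + \delta_1 \theta_i) = \frac{\exp(\delta_0 + \delta_1 \theta_i)}{1 + \exp(\delta_0 + \delta_1 \theta_i)}

This F(\theta, \delta) is the same function that appears in the induced prior below.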

We have some pretty good prior knowledge about the prevalence of tinnitus in the general population. However, it is challenging to translate this information into priors on \delta_0 and \delta_1 directly. What we can easily do is specify a Beta(\alpha_i, \beta_i) prior on the probability of tinnitus at a chosen value \theta_i. According to Equation 2 in Bedrick et al. 1996 (A new perspective on priors for generalized linear models; http://www.jstor.org/stable/2291571), we can elicit such priors at two different values of \theta and use them to place an induced prior on \delta. For example, if the tinnitus probability at \theta_1 is given a Beta(\alpha_1, \beta_1) prior and the probability at \theta_2 a Beta(\alpha_2, \beta_2) prior, then the joint prior for \delta (i.e., d0 and d1 in the model) is

\pi(\delta) \propto F(\theta_1, \delta)^{\alpha_1 - 1}\,[1 - F(\theta_1, \delta)]^{\beta_1 - 1}\,F(\theta_1, \delta)\,[1 - F(\theta_1, \delta)] \times F(\theta_2, \delta)^{\alpha_2 - 1}\,[1 - F(\theta_2, \delta)]^{\beta_2 - 1}\,F(\theta_2, \delta)\,[1 - F(\theta_2, \delta)]
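
Since the two trailing F and 1 - F factors just raise each exponent by one, this simplifies to

\pi(\delta) \propto F(\theta_1, \delta)^{\alpha_1}\,[1 - F(\theta_1, \delta)]^{\beta_1} \times F(\theta_2, \delta)^{\alpha_2}\,[1 - F(\theta_2, \delta)]^{\beta_2},

i.e., a Beta(\alpha_i, \beta_i) kernel evaluated at the induced tinnitus probability for each elicitation point.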

I'm a bit stumped about how to implement this. I tried searching for "joint prior" and "conditional means prior", but could not find any information about either on the PyMC3 forums. Any advice?

Update with solution
I spoke with my statistician, who is well-versed in SAS, and he sent me some code for the O-ring example in the paper I linked above.

*** empty dataset b/c proc mcmc needs a dataset to run;
data x;
run;
 
proc mcmc nmc=100000 nbi=0 outpost=ox data=x;
	  x1 = 55; a1=1; b1=0.577;
	  x2 = 75; a2=0.577; b2=1;
	  xbar=69.6; *** average temperature from the 23 O-ring tests;

	  array betacoeffs{2} beta0 beta1;
 
	  **** equation 2 for x1, a1, and b1;
	  pi_beta1 = (   logistic(beta0 + beta1*(x1-xbar)) **(a1-1) ) *
					 ((1-logistic(beta0 + beta1*(x1-xbar)))**(b1-1) ) *
						logistic(beta0+beta1*(x1-xbar))*(1-logistic(beta0+beta1*(x1-xbar)));

	  **** equation 2 for x2, a2, and b2;
	  pi_beta2 = (   logistic(beta0 + beta1*(x2-xbar))**(a2-1) ) *
					 ((1-logistic(beta0 + beta1*(x2-xbar)))**(b2-1) ) *
						logistic(beta0+beta1*(x2-xbar))*(1-logistic(beta0+beta1*(x2-xbar)));
 
	  log_prior = log(pi_beta1) + log(pi_beta2);
	  prior betaCoeffs ~ general(log_prior);
	  parms betaCoeffs {-0.1 0.1};
	  model general(0);
run;

I managed to convert this to PyMC3 code. Note that model general(0); gives PROC MCMC a constant log-likelihood, so it samples from the induced prior alone; the PyMC3 version does the same thing with flat placeholder priors plus a Potential and no observed data:

import pymc3 as pm


def logistic(x):
    # Inverse logit; use pm.math so the expression stays a Theano tensor
    return pm.math.invlogit(x)


with pm.Model():
    # Flat placeholders; the real joint prior is added via the Potential below
    beta0 = pm.Flat('beta0')
    beta1 = pm.Flat('beta1')
    # Elicited Beta(a, b) priors on the failure probability at temperatures x1 and x2
    x1, a1, b1 = 55, 1, 0.577
    x2, a2, b2 = 75, 0.577, 1
    xbar = 69.6  # average temperature from the 23 O-ring tests

    f1 = logistic(beta0 + beta1 * (x1 - xbar))
    f2 = logistic(beta0 + beta1 * (x2 - xbar))

    # Equation 2 kernels; the exponents combine to f**a * (1 - f)**b
    pi_beta1 = (f1**(a1 - 1)) * ((1 - f1)**(b1 - 1)) * f1 * (1 - f1)
    pi_beta2 = (f2**(a2 - 1)) * ((1 - f2)**(b2 - 1)) * f2 * (1 - f2)

    # pm.math.log rather than np.log, since these are Theano tensors
    ll = pm.math.log(pi_beta1) + pm.math.log(pi_beta2)
    pm.Potential('joint_beta', ll)

    trace = pm.sample()

Plotting the joint distribution of \beta_0 and \beta_1 gives me:

[image: joint distribution of beta0 and beta1]
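
A plot like this can be made directly from the trace; a minimal matplotlib sketch:

import matplotlib.pyplot as plt

# Scatter the sampled (beta0, beta1) pairs to visualize the induced joint prior
plt.scatter(trace['beta0'], trace['beta1'], s=2, alpha=0.3)
plt.xlabel('beta0')
plt.ylabel('beta1')
plt.show()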

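For completeness, here is how the induced prior could be dropped into the original tinnitus model. This is an untested sketch: theta1, theta2, and the Beta parameters are hypothetical elicitation values (not numbers from the post or the paper), and theta and has_tinnitus are the data arrays from the question.

import pymc3 as pm

# Hypothetical elicitation: P(tinnitus | theta = theta_i) ~ Beta(a_i, b_i)
theta1, a1, b1 = 0.2, 2.0, 8.0
theta2, a2, b2 = 0.8, 8.0, 2.0

with pm.Model() as tinnitus_model:
    d0 = pm.Flat('delta0')
    d1 = pm.Flat('delta1')

    # Induced tinnitus probabilities at the two elicitation points
    p1 = pm.math.invlogit(d0 + d1 * theta1)
    p2 = pm.math.invlogit(d0 + d1 * theta2)

    # Log of Equation 2, using the simplified f**a * (1 - f)**b form
    cmp_logp = (a1 * pm.math.log(p1) + b1 * pm.math.log(1 - p1) +
                a2 * pm.math.log(p2) + b2 * pm.math.log(1 - p2))
    pm.Potential('cmp_prior', cmp_logp)

    # Original likelihood; theta and has_tinnitus are the observed data
    pm.Bernoulli('T', logit_p=d0 + d1 * theta, observed=has_tinnitus)

    trace = pm.sample()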