Joint priors when implementing conditional means prior on a log-odds model (Binomial)

Update with solution
I spoke with a statistician who is well-versed in SAS, and he sent me code for the example in the paper linked above.

*** empty dataset b/c proc mcmc needs a dataset to run;
data x;
run;
 
proc mcmc nmc=100000 nbi=0 outpost=ox data=x;
	  x1 = 55; a1=1; b1=0.577;
	  x2 = 75; a2=0.577; b2=1;
	  xbar=69.6; *** average temperature from 23 O-ring tests;

	  array betacoeffs{2} beta0 beta1;
 
	  **** equation 2 for x1, a1, and b1;
	  pi_beta1 = (   logistic(beta0 + beta1*(x1-xbar)) **(a1-1) ) *
					 ((1-logistic(beta0 + beta1*(x1-xbar)))**(b1-1) ) *
						logistic(beta0+beta1*(x1-xbar))*(1-logistic(beta0+beta1*(x1-xbar)));

	  **** equation 2 for x2, a2, and b2;
	  pi_beta2 = (   logistic(beta0 + beta1*(x2-xbar))**(a2-1) ) *
					 ((1-logistic(beta0 + beta1*(x2-xbar)))**(b2-1) ) *
						logistic(beta0+beta1*(x2-xbar))*(1-logistic(beta0+beta1*(x2-xbar)));
 
	  log_prior = log(pi_beta1) + log(pi_beta2);
	  prior betaCoeffs ~ general(log_prior);
	  parms betaCoeffs {-0.1 0.1};
	  model general(0);
run;

I managed to convert this to PyMC3:

import numpy as np
import pymc3 as pm


def logistic(x):
    # pm.math.sigmoid works on the Theano tensors used inside the model
    return pm.math.sigmoid(x)


with pm.Model():
    beta0 = pm.Flat('beta0')
    beta1 = pm.Flat('beta1')
    x1, a1, b1 = 55, 1, 0.577
    x2, a2, b2 = 75, 0.577, 1
    xbar = 69.6
    
    f1 = logistic(beta0 + beta1 * (x1 - xbar))
    f2 = logistic(beta0 + beta1 * (x2 - xbar))
   
    pi_beta1 = (f1**(a1-1)) * ((1-f1)**(b1-1)) * f1 * (1-f1)
    pi_beta2 = (f2**(a2-1)) * ((1-f2)**(b2-1)) * f2 * (1-f2)
    ll = pm.math.log(pi_beta1) + pm.math.log(pi_beta2)
    log_prior = pm.Potential('joint_beta', ll)
    
    trace = pm.sample()
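As an aside, the density in equation 2 simplifies: since p**(a-1) * (1-p)**(b-1) * p * (1-p) = p**a * (1-p)**b, the log-prior can be accumulated directly in log space, which avoids underflow when the betas wander into extreme regions. A minimal NumPy sketch of this (the helper names `log_sigmoid` and `log_cmp_prior` are my own, not from the paper):

```python
import numpy as np

def log_sigmoid(x):
    # numerically stable log(logistic(x)) = -log(1 + exp(-x))
    return -np.logaddexp(0.0, -x)

def log_cmp_prior(beta0, beta1, elicitations, xbar=69.6):
    """Conditional-means log-prior: for each elicited point (x, a, b),
    add a*log(p) + b*log(1-p), where p = logistic(beta0 + beta1*(x - xbar)).
    Algebraically this equals the log of
    p**(a-1) * (1-p)**(b-1) * p * (1-p) summed over the points."""
    lp = 0.0
    for x, a, b in elicitations:
        eta = beta0 + beta1 * (x - xbar)
        lp += a * log_sigmoid(eta) + b * log_sigmoid(-eta)
    return lp

# same elicited points as above
points = [(55, 1.0, 0.577), (75, 0.577, 1.0)]
print(log_cmp_prior(-0.1, 0.1, points))
```

The same simplification could replace the explicit products in the `pm.Potential` term above, at the cost of the model no longer mirroring the SAS code line by line.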

Plotting the joint distribution of beta0 and beta1 gives me:

[image: scatter plot of the sampled joint distribution of beta0 and beta1]
