Estimating an ideal point model


#1

Hello everyone,

I am new to Bayesian estimation and PyMC3. I am working on a typical ideal point model, which has a massive number of parameters to estimate. My sampling does not converge. Do you have suggestions on how I can improve it?

I have d=2, n=500, m=42, and an observed 500x42 matrix y whose entries only take the values 1 or -1.

import numpy as np
import pymc3 as pm
import theano.tensor as tt

def likelihood(x1, x2, x, a, b, y):
    # Stack the two anchored rows on top of the free ideal points: (n, d)
    x_pend = tt.concatenate([x1, x2, x], axis=0)
    # Broadcast the item intercepts across all n subjects: (n, m)
    atile = tt.tile(a.T, (n, 1))
    linear_comb = pm.math.dot(x_pend, b.T) + atile
    # Standard normal density evaluated at the linear predictor
    norm_prob = (1 / pm.math.sqrt(2 * np.pi)) * pm.math.exp(-0.5 * pm.math.sqr(linear_comb))
    # Bernoulli-style log-likelihood: selects log(norm_prob) where y == 1
    # and log(1 - norm_prob) where y == -1
    u = (pm.math.abs_(y + 0.5) - 0.5) * pm.math.log(norm_prob) \
        + (pm.math.abs_(y - 0.5) - 0.5) * pm.math.log(1 - norm_prob)
    return pm.math.sum(u)

with pm.Model() as irm_model:
    # Two anchor points pinned near (1, 1) and (-1, -1) for identification
    x1 = pm.Normal('x1', mu=1, sd=0.001, shape=(1, d))
    x2 = pm.Normal('x2', mu=-1, sd=0.001, shape=(1, d))
    x = pm.Normal('x', mu=0, sd=1, shape=(n - 2, d))
    b = pm.Normal('b', mu=0, sd=5, shape=(m, d))
    a = pm.Normal('a', mu=0, sd=5, shape=(m, 1))

    like = pm.Potential('like', likelihood(x1, x2, x, a, b, y))

    trace = pm.sample()
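For reference, the shapes flowing through `likelihood` can be checked with a small NumPy sketch (random placeholder values; d, n, m as given above — this is only a shape check, not the actual model):

```python
import numpy as np

# Dimensions as stated in the post
d, n, m = 2, 500, 42
rng = np.random.default_rng(0)

# Placeholder draws matching the priors' shapes
x1 = rng.normal(1, 0.001, size=(1, d))
x2 = rng.normal(-1, 0.001, size=(1, d))
x = rng.normal(0, 1, size=(n - 2, d))
b = rng.normal(0, 5, size=(m, d))
a = rng.normal(0, 5, size=(m, 1))

# Mirrors the Theano ops in `likelihood`
x_pend = np.concatenate([x1, x2, x], axis=0)  # (n, d)
atile = np.tile(a.T, (n, 1))                  # (n, m)
linear_comb = x_pend @ b.T + atile            # (n, m), matches y's shape

print(linear_comb.shape)  # (500, 42)
```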

I got the following message:

NUTS: [a, b, x]
Sampling 2 chains: 100%|██████████| 2000/2000 [00:25<00:00, 77.70draws/s]
There were 455 divergences after tuning. Increase `target_accept` or reparameterize.
There were 432 divergences after tuning. Increase `target_accept` or reparameterize.
The gelman-rubin statistic is larger than 1.05 for some parameters. This indicates slight problems during sampling.
The estimated number of effective samples is smaller than 200 for some parameters.

#2

I would try playing with the scale of x1 and x2; sd=0.001 seems like a very small scale to me.
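To make that point concrete, here is a small NumPy sketch (my own illustration, not part of the original model) comparing draws from the posted sd=0.001 prior with a looser one:

```python
import numpy as np

rng = np.random.default_rng(0)

tight = rng.normal(loc=1.0, scale=0.001, size=10_000)  # prior as posted
wider = rng.normal(loc=1.0, scale=0.5, size=10_000)    # a looser anchor

# Essentially all mass of the tight prior sits within a few thousandths
# of 1.0, so x1 is effectively a hard-coded constant that the sampler
# must hit almost exactly, which can trigger divergences.
print(tight.min(), tight.max())  # both very close to 1.0
print(wider.std())               # roughly 0.5, leaves room to move
```

A looser scale on the anchors (or fixing them as true constants via `pm.Deterministic`/plain data instead of near-degenerate priors) may give the sampler an easier geometry.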