I have a dyadic model with random intercepts for the sender and receiver in each dyad. As you might expect, these intercepts are not identified: you could always add a constant c to all sender effects and subtract it from all receiver effects without changing the likelihood. This makes the model difficult to estimate and gives me concerns about convergence overall.

If I were fitting a fixed effects model, I would just exclude the dummy for one of the categories, i.e. fix that category's effect at 0. I am not sure how to fix one value at 0 in PyMC3, or whether there's another option that would work well. Does anyone have any ideas?
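To be concrete, in the fixed-effects setting the reference-category trick I mean is just dropping one dummy column, so that category's effect is absorbed into the intercept (toy data here, not from my model):

```
import pandas as pd

# Toy sender labels; 'a' becomes the reference category.
df = pd.DataFrame({'sender': ['a', 'b', 'c', 'a']})

# drop_first=True excludes the first category's dummy,
# which is equivalent to fixing its effect at 0.
dummies = pd.get_dummies(df['sender'], drop_first=True)
print(list(dummies.columns))  # ['b', 'c']
```

I'd like the random-effects analogue of this in PyMC3.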

Here is a toy model with data that exemplifies the problem.

Simulating the data:

```
import numpy as np
import pandas as pd

sn = 10                                  # number of senders
rn = 30                                  # number of receivers
n = sn * rn                              # one observation per dyad
senders = np.random.normal(5, 5, sn)     # sender random intercepts
receivers = np.random.normal(10, 7, rn)  # receiver random intercepts
sx = np.random.binomial(1, .3, sn)       # sender-level covariate
rx = np.random.binomial(1, .5, rn)       # receiver-level covariate
x = np.random.normal(20, 10, n)          # dyad-level covariate
ri = list(range(rn)) * sn                # receiver index for each dyad
si = []                                  # sender index for each dyad
for i in range(sn):
    si.extend([i] * rn)
x = pd.DataFrame({'x': x, 'ri': ri, 'si': si})
x['er'] = np.random.normal(0, 2, n)      # dyad-level noise
y = []
for i, row in x.iterrows():
    y.append(row.x * 5 + row.er +
             sx[int(row.si)] * 7 + senders[int(row.si)] +
             rx[int(row.ri)] * (-2) + receivers[int(row.ri)])
x['y'] = y
```
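To make the non-identifiability concrete: shifting every sender effect by a constant c and every receiver effect by -c leaves every dyadic mean, and hence the likelihood, unchanged. A standalone numpy check with toy sizes:

```
import numpy as np

rng = np.random.default_rng(0)
senders = rng.normal(5, 5, 3)      # toy sender intercepts
receivers = rng.normal(10, 7, 4)   # toy receiver intercepts

# Dyadic means: one entry per (sender, receiver) pair.
mu1 = senders[:, None] + receivers[None, :]

# Shift sender effects by c and receiver effects by -c.
c = 17.3
mu2 = (senders + c)[:, None] + (receivers - c)[None, :]

# The data cannot distinguish the two parameterizations.
print(np.allclose(mu1, mu2))  # True
```

This is exactly the degree of freedom I'd like to pin down in the model below.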

The model:

```
import pymc3 as pm

with pm.Model() as test1:
    # Sender effects, non-centered parameterization
    mu_s = pm.Flat('sender_fe')
    sig_s = pm.HalfCauchy('sig_s', beta=2.5)
    b_sx = pm.Flat('sx')
    raw_s = pm.Normal('raw_s', mu=0, sd=1, shape=sn)
    s = pm.Deterministic('s', mu_s + (b_sx * sx) + (sig_s * raw_s))

    # Receiver effects, same structure
    mu_r = pm.Flat('receiver_fe')
    sig_r = pm.HalfCauchy('sig_r', beta=2.5)
    b_rx = pm.Flat('rx')
    raw_r = pm.Normal('raw_r', mu=0, sd=1, shape=rn)
    r = pm.Deterministic('r', mu_r + (b_rx * rx) + (sig_r * raw_r))

    # Linear predictor and likelihood
    b_x = pm.Normal('b_x', mu=0, sd=100**2)
    hat = pm.Deterministic('hat',
                           (b_x * x.x) +
                           s[x.si.values] +
                           r[x.ri.values])
    sig = pm.HalfCauchy('sig', beta=2.5)
    y = pm.Normal('y', mu=hat, sd=sig, observed=x.y)
```