Labeled coords and dims in hierarchical group setting with different sizes

Hi all,

I am facing a similar issue. I have very similar data: 77 subjects, 6 conditions per subject, and 24 trials per condition, with Bernoulli responses (0s and 1s). My model is very simple: basically a linear equation plus a softmax to convert the equation's output into a Bernoulli probability. Eventually I want to build a hierarchical prior, because the conditions are within-subject and therefore sit one level lower in the hierarchy. But right now I'm just writing all parameters as one matrix with dimensions (ncond x nsubj), i.e. (6 x 77). However, I get a dimension-mismatch error. Can anyone help me with this, please? I think I followed the template exactly.

import numpy as np
import pandas as pd
import pymc as pm

subj_id_idxs, subj_id_unique = pd.factorize(data.id, sort=True)
condition_idxs, condition_unique = pd.factorize(data.conditions, sort=True)
coords = {
    "subj_id_unique": subj_id_unique,
    "obs_id": np.arange(data.shape[0]),
    "conditions": condition_unique,
}
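(For context on the index arrays: `pd.factorize` returns one integer code per row plus the array of unique labels, so `subj_id_idxs` and `condition_idxs` each have one entry per observation. A tiny sketch with made-up data:)

```python
import pandas as pd

# Made-up subject IDs, one per observation
ids = pd.Series(["s2", "s1", "s2", "s3"])

# sort=True sorts the unique labels before assigning codes
codes, uniques = pd.factorize(ids, sort=True)

print(codes.tolist())    # [1, 0, 1, 2]
print(uniques.tolist())  # ['s1', 's2', 's3']
```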
with pm.Model(coords=coords) as m:
    subj_id_idxs = pm.ConstantData("subj_id_idxs", subj_id_idxs, dims="obs_id")
    condition_idxs = pm.ConstantData("condition_idxs", condition_idxs, dims="obs_id")
    block_trial_num = pm.ConstantData("block_trial_num", data.block_trial_num.values, dims="obs_id")

    R0 = pm.ConstantData('mean_prior_reward_0', data.mean_prior_reward_0.values, dims="obs_id")
    R1 = pm.ConstantData('mean_prior_reward_1', data.mean_prior_reward_1.values, dims="obs_id")
    I0 = pm.ConstantData('information_0', data.information_0.values, dims="obs_id")
    I1 = pm.ConstantData('information_1', data.information_1.values, dims="obs_id")
    s0 = pm.ConstantData('location_0', data.location_0.values, dims="obs_id")
    s1 = pm.ConstantData('location_1', data.location_1.values, dims="obs_id")

    choice = pm.ConstantData('choice', data.first_free_choice.values, dims="obs_id")
    
    inv_sig = pm.HalfNormal(name="beta", sigma=10, dims=('conditions', 'subj_id_unique'))  # typical softmax temperature prior
    alpha = pm.Normal('alpha', mu=0, sigma=10, dims=('conditions', 'subj_id_unique'))  # typical linear regression weight prior
    B = pm.Normal('B', mu=0, sigma=10, dims=('conditions', 'subj_id_unique'))

    v1 = pm.Deterministic('v1', R0 - R1 + alpha*(I0 - I1) + B*(s0 - s1))
    # print(pm.draw(I0 - I1).shape)
    p1 = pm.Deterministic('p1', pm.math.sigmoid(-v1 * inv_sig))

    obs = pm.Bernoulli("obs", p=p1, observed=choice, dims="obs_id")
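From my own debugging so far, I suspect the mismatch comes from multiplying the (6 x 77)-shaped parameter matrices (`alpha`, `B`, `inv_sig`) with the (n_obs,)-length data vectors (`I0 - I1`, etc.) inside `v1` — those shapes don't broadcast. If that's right, indexing the parameters with the per-observation index arrays, e.g. `alpha[condition_idxs, subj_id_idxs]`, should give every term in `v1` the same (n_obs,) length. A NumPy sketch of that indexing pattern (made-up data, names mirror my model):

```python
import numpy as np

rng = np.random.default_rng(0)
n_cond, n_subj = 6, 77
n_obs = n_cond * n_subj * 24

# Per-observation integer indices, as produced by pd.factorize
condition_idxs = rng.integers(0, n_cond, size=n_obs)
subj_id_idxs = rng.integers(0, n_subj, size=n_obs)

# One parameter value per (condition, subject) cell
alpha = rng.normal(size=(n_cond, n_subj))

# A per-observation regressor, like I0 - I1
x = rng.normal(size=n_obs)

# Broken: (6, 77) * (n_obs,) raises a broadcasting error
# alpha * x  # -> ValueError

# Fix: fancy indexing selects the right cell for each observation,
# yielding a (n_obs,)-length vector that multiplies x elementwise
v = alpha[condition_idxs, subj_id_idxs] * x
print(v.shape)  # (11088,)
```

In the model above that would presumably mean writing `alpha[condition_idxs, subj_id_idxs]` (and likewise for `B` and `inv_sig`) inside the `pm.Deterministic` calls, so `v1` and `p1` can carry `dims="obs_id"` — but I'd appreciate confirmation that this is the intended pattern.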