Bernoulli observations of variable length with a shape parameter

I am attempting to estimate the conversion rate for each of a large number of groups (in the thousands).

My data is stored as an array of arrays, where each inner array is of variable length, as each group has a different number of observations.

In this issue, the observations were all of the same length and could therefore be stacked. In this issue, a new variable was created for each observation.

My current solution has been to do this:

import pymc3 as pm

traces = []

# Fit a separate model, one group at a time
for i in range(len(obs_data)):
    with pm.Model() as model:
        p = pm.Uniform('p', 0, 1)
        obs = pm.Bernoulli('obs', p, observed=obs_data[i])

        trace = pm.sample()

    traces.append(trace['p'])
which does work but is inefficient. Am I using the wrong tool for this job?

At the very least, you should compile the model once and use pm.Data or theano.shared to update the observed data, so you don't need to recompile the model for every group.
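For example, a minimal sketch of that pattern, assuming PyMC3 with the Theano backend and the obs_data array of arrays from the question:

import numpy as np
import pymc3 as pm
import theano

# Shared container for the current group's observations; its value
# (including its length) can be swapped without rebuilding the model
group_obs = theano.shared(np.asarray(obs_data[0]))

with pm.Model() as model:
    p = pm.Uniform('p', 0, 1)
    obs = pm.Bernoulli('obs', p, observed=group_obs)

traces = []
for data in obs_data:
    # Point the shared variable at the next group's data and re-sample
    group_obs.set_value(np.asarray(data))
    with model:
        traces.append(pm.sample()['p'])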
On a higher level, you can use a Binomial instead of a Bernoulli, which translates your model to:

import numpy as np
import pymc3 as pm

# Number of trials and number of successes per group
n = np.asarray([len(group) for group in obs_data])
k = np.asarray([sum(group) for group in obs_data])

with pm.Model() as model:
    # One conversion rate per group
    p = pm.Uniform('p', 0, 1, shape=len(obs_data))
    obs = pm.Binomial('obs', n, p, observed=k)

    trace = pm.sample()
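With this parameterization, trace['p'] has one column per group, so, for example, trace['p'].mean(axis=0) gives a posterior-mean conversion rate for every group from a single sampling run.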