Hello,

I am new to pymc and would like to use the package to infer parameters. I want to construct the model dynamically, i.e. the number of RVs depends on the number of particles in the given data set. But for datasets with many particles, the for loop results in the following error:

`fatal error: bracket nesting level exceeded maximum of 256`

A minimal example to reproduce the problem would be the following.

Note that for the sake of simplicity, all particles are assumed to be completely independent of each other, so the code may look a bit strange. However, in my actual case of interest, the initial value (and therefore mu) of each particle depends on the final value of another particle.

```
import numpy as np
import pymc as pm

# assume that all particles are observed from x=0 to x=1
x = np.linspace(0, 1)

# number of particles
# Note: N=100 is large enough to trigger the fatal error, whereas e.g.
# N=50 still works fine for me
N = 100
keys = [f'{i}' for i in range(N)]

# generate observed data
y_observed = {}
for key in keys:
    # for simplicity, assume that all particles follow the same exponential
    # growth with normal noise and share the same true growth rate mu = 0.8
    # and sigma = 0.1
    y_observed[key] = np.random.normal(loc=np.exp(0.8 * x), scale=0.1)

with pm.Model() as model:
    mu, y = {}, {}
    # assume that sigma is shared among all particles
    sigma = pm.Normal('sigma')
    # add an RV (mu) and the likelihood of the observation (y) for each particle
    for key in keys:
        mu[key] = pm.Normal(f'mu_{key}', mu=1)
        y[key] = pm.Normal(f'y_{key}', mu=np.exp(mu[key] * x),
                           sigma=sigma, observed=y_observed[key])
    pm.sample()
```

I am grateful for any suggestions on how I could create such a large dynamic model.

PS: So far, using `PYTENSOR_FLAGS='optimizer=fast_compile'`, I have not been able to reproduce the bug; it has worked fine.

PPS: Could I use `scan` for this? In case the solution is already given in Declaring Priors using Loops for State Space Models, I am sorry for not being able to transfer it.