So if I understand you correctly, you want c and d to be sources of random noise rather than priors that get updated, and you don't want to use SMC-ABC. In that case, there is only one way (that I am aware of) to achieve this: define your own sampling step for c and d so that they are not updated like normal priors. This was discussed recently in

See also the links there. As a starter, you can try the following:

```
import numpy as np
import pymc as pm
from pymc.step_methods.compound import BlockedStep


class NormalNoiseStep(BlockedStep):
    """A step that redraws a variable as pure Normal noise on every iteration."""

    def __init__(self, var, mu, sd, size):
        model = pm.modelcontext(None)
        value_var = model.rvs_to_values[var]
        self.vars = [value_var]
        self.name = value_var.name
        self.mu = mu
        self.sd = sd
        self.size = size

    def step(self, point: dict):
        # Ignore the current value entirely and draw fresh noise.
        draw = np.random.normal(self.mu, self.sd, size=self.size)
        point[self.name] = draw
        return point, []
```
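If it helps to see what the `step` method does in isolation, here is a plain-NumPy sketch of the same logic (no PyMC involved; the names are illustrative): each call ignores the current value of the variable and simply overwrites it with a fresh i.i.d. draw.

```
import numpy as np

rng = np.random.default_rng(0)

def noise_step(point, name, mu, sd, size):
    # Mimics NormalNoiseStep.step: discard the current value and
    # replace it with fresh Normal noise.
    point = dict(point)  # avoid mutating the caller's dict
    point[name] = rng.normal(mu, sd, size=size)
    return point, []  # (updated point, empty stats list)

point = {"c": np.zeros(5)}
new_point, stats = noise_step(point, "c", 0.0, 1.0, 5)
print(new_point["c"])  # five fresh draws, unrelated to the zeros we started with
```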

In your model you would then use this as

```
with pm.Model() as model:
    a = pm.Uniform("a", lower=0.1 * 16, upper=3 * 16)
    b = pm.Uniform("b", lower=0.1 * 14, upper=3 * 14)
    c = pm.Normal("c", mu, sd, size=N)  # mu, sd, N as defined in your setup
    d = pm.Normal("d", mu, sd, size=N)
    steps = [NormalNoiseStep(c, mu, sd, N), NormalNoiseStep(d, mu, sd, N)]
    # you can now use c and d as any tensor in your operations
    trace = pm.sample(**sample_parameters, step=steps)
```

This is outside the boundaries of regular Bayesian modelling, though, so things like convergence statistics (r_hat, for instance) may or may not be meaningful in this context. Perhaps a good sanity check is indeed to cross-compare results with SMC-ABC. I have never used that method in detail, so I don't have much intuition about it.
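For some intuition about what ABC-style methods do (SMC-ABC is a smarter, sequential refinement of the same idea), here is a toy rejection-ABC sketch with made-up numbers, not PyMC's implementation: keep only the parameter draws whose simulated data lands within a tolerance `epsilon` of the observed summary statistic.

```
import numpy as np

rng = np.random.default_rng(42)

# Pretend "observed" data generated with a true mean of 2.
observed = rng.normal(2.0, 1.0, size=100)
obs_mean = observed.mean()

def simulate(mu, size=100):
    # Forward simulator: data given a candidate parameter value.
    return rng.normal(mu, 1.0, size=size)

epsilon = 0.1
proposals = rng.uniform(-5, 5, size=5000)  # draws from a flat prior on mu

# Accept proposals whose simulated summary statistic is within epsilon
# of the observed one.
accepted = np.array(
    [m for m in proposals if abs(simulate(m).mean() - obs_mean) < epsilon]
)

print(accepted.mean())  # should land near the true value of 2
```

The accepted draws approximate the posterior of `mu` without ever evaluating a likelihood, which is the same trick SMC-ABC exploits, just with a fixed tolerance instead of a sequence of shrinking ones.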