The problem turned out to be that my computer was having trouble whenever the number of cores was not set to 1. With the help of the code you provided, I managed to obtain the results I needed. Apart from the cores issue, I was wrong to use Y_obs = Normal("Y_obs", mu=mu, sigma=0.1, observed=data). What I should have used was Y_obs = MvNormal("Y_obs", mu=mu, cov=cov, observed=data), where cov is built by propagating the uncertainty of "c" into the observed data (and adding an observational uncertainty of 0.1 for each item in data). In the end, this gave posterior results that are exactly the same as for SMC-ABC. (Note that the true values of a, b and c for this example are 43, 36 and 12.) Here is the final code for NUTS (it also works for Metropolis):

```
import numpy as np
import pymc as pm
import arviz as az
from pymc.step_methods.arraystep import BlockedStep

rng = np.random.default_rng()

class NormalNoiseStep(BlockedStep):
    """Custom step that redraws 'c' from N(mu, sd) at every iteration."""
    def __init__(self, var, mu, sd, size):
        self.model = pm.modelcontext(None)
        self.value_var = self.model.rvs_to_values[var]
        self.vars = [self.value_var]
        self.name = self.value_var.name
        self.mu = mu
        self.sd = sd
        self.size = size

    def step(self, point: dict):
        draw = rng.normal(self.mu, self.sd, size=self.size)
        point[self.name] = draw
        return point, []

data = [151.2, 181.9, 219.7]
cov = np.array([[92.18, 61.41, 30.72],
                [61.41, 40.92, 20.46],
                [30.72, 20.46, 10.25]])

with pm.Model() as basic_model:
    # Priors for unknown model parameters
    a = pm.Normal("a", mu=50, sigma=5)
    b = pm.Normal("b", mu=40, sigma=7)
    # Initialize 'c' with a placeholder value. It will be updated by the custom step method.
    c = pm.Normal("c", mu=0, sigma=1, shape=1)  # Placeholder distribution
    mu = [(a + 2*b + 3*c[0]), (2*a + 2*b + 2*c[0]), (4*a + b + c[0])]
    Y_obs = pm.MvNormal("Y_obs", mu=mu, cov=cov, observed=data)
    steps = [NormalNoiseStep(c, 15, 3, size=1), pm.NUTS(vars=[a, b])]
    trace = pm.sample(tune=1000, draws=15000, step=steps, cores=1, chains=4)
    print(az.summary(trace, kind="stats"))
    az.plot_trace(trace);
```
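As an illustration of the error-propagation idea behind cov (my own sketch, assuming sigma_c = 3 to match the custom step; the exact values hard-coded above may have been computed differently, e.g. numerically): the three model outputs depend on c through the Jacobian J = [3, 2, 1], so linear propagation gives cov = sigma_c² · J·Jᵀ + 0.1² · I.

```
import numpy as np

# Sketch of linear error propagation from c to the three observables.
# Assumptions (not from the original code): sigma_c = 3 as in the custom step,
# sigma_obs = 0.1 per data point.
J = np.array([3.0, 2.0, 1.0])   # d(mu_i)/dc for the three model equations
sigma_c = 3.0                   # assumed spread of c
sigma_obs = 0.1                 # measurement uncertainty per data point

cov = sigma_c**2 * np.outer(J, J) + sigma_obs**2 * np.eye(3)
print(cov)
```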

For SMC-ABC, the following code should produce the same posterior results (I didn't check whether the posterior covariances between a and b also match, but I would guess they should):

```
import arviz as az
import matplotlib.pyplot as plt
import numpy as np
import pymc as pm

az.style.use("arviz-darkgrid")

data = [151.2, 181.9, 219.7]
rng = np.random.default_rng()

def normal_sim(rng, a, b, size=1):
    # Forward model: draw c, then compute the three observables
    c = rng.normal(15, 3, size=1)
    x1 = a + 2*b + 3*c
    x2 = 2*a + 2*b + 2*c
    x3 = 4*a + b + c
    return [x1, x2, x3]

with pm.Model() as example:
    a = pm.Normal("a", mu=50, sigma=5)
    b = pm.Normal("b", mu=40, sigma=7)
    s = pm.Simulator("s", normal_sim, params=(a, b), sum_stat="sort", epsilon=0.1, observed=data)
    idata = pm.sample_smc(40000, cores=12, chains=1)
    idata.extend(pm.sample_posterior_predictive(idata))

az.plot_trace(idata, kind="rank_vlines");
print(az.summary(idata, kind="stats"))
```
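To actually check the posterior covariance between a and b, one can stack the flattened posterior draws and use np.cov (a minimal sketch; the placeholder arrays stand in for real draws such as `trace.posterior["a"].values.ravel()`):

```
import numpy as np

# Hypothetical placeholder draws; in practice use the flattened posterior
# samples, e.g. a_draws = trace.posterior["a"].values.ravel()
a_draws = np.array([42.8, 43.1, 43.4, 42.9])
b_draws = np.array([36.2, 35.9, 35.7, 36.1])

# np.cov on the stacked samples returns the 2x2 posterior covariance matrix;
# the off-diagonal entry is Cov(a, b).
cov_ab = np.cov(np.vstack([a_draws, b_draws]))
print(cov_ab[0, 1])
```

Comparing this off-diagonal entry between the NUTS and SMC-ABC runs would confirm whether the full joint posteriors agree, not just the marginals.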