Here’s a little simulation to illustrate @daniel-saunders-phil’s point.
The Python program below generates a deterministic Markov chain x by evaluating a sinusoid over just a few cycles, so its ESS is very low. It then generates a second chain y centered at the values of x with normal noise of scales 0.1, 1, and 10. You'll see that as the noise goes up, the linear correlation goes down and the ESS values decouple, just as @daniel-saunders-phil said they would.
This is just a simulated version of what happens when fitting a (centered) hierarchical model. The means of the two chains are essentially identical because the second chain is just a noisy version of the first, but the correlation varies depending on how much noise there is in the second. When the variance of y is large compared to the variance of x, the ESS estimates diverge and the correlation drops, even though the two chains are linked in exactly the same way, just at different noise scales.
```python
import numpy as np
import arviz as az
import xarray as xr


def generate_series(n_points, cycles, noise_sd, seed=123):
    """Return a sinusoidal 'chain' x and a noisy copy y."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_points) / n_points
    x = np.sin(2.0 * np.pi * cycles * t)
    y = x + rng.normal(loc=0.0, scale=noise_sd, size=n_points)
    return x, y


def build_idata(x, y):
    """Wrap the two series as a single-chain ArviZ InferenceData."""
    ds = xr.Dataset(
        data_vars={"x": (("chain", "draw"), x[None, :]),
                   "y": (("chain", "draw"), y[None, :])},
        coords={"chain": [0], "draw": np.arange(x.shape[0])},
    )
    return az.InferenceData(posterior=ds)


def main():
    n_points = 10_000
    cycles = 1.5
    for noise_sd in [0.1, 1, 10]:  # a list, so iteration order is guaranteed
        x, y = generate_series(n_points=n_points, cycles=cycles,
                               noise_sd=noise_sd, seed=2025)
        print(f"NOISE SD: {noise_sd}\n")
        print(f"CORRELATION: {np.corrcoef(x, y)[0, 1]:0.2f}\n")
        print(az.summary(build_idata(x, y)))


main()
```
And here’s the output:
```
$ python sin-corr.py
NOISE SD: 0.1

CORRELATION: 0.99

arviz - WARNING - Shape validation failed: input_shape: (1, 10000), minimum_shape: (chains=2, draws=4)
    mean      sd  hdi_3%  hdi_97%  mcse_mean  mcse_sd  ess_bulk  ess_tail  r_hat
x  0.212   0.675  -0.960     1.00       0.27    0.092       7.0      40.0    NaN
y  0.213   0.681  -1.013     1.08       0.27    0.092       7.0      45.0    NaN

NOISE SD: 1

CORRELATION: 0.55

arviz - WARNING - Shape validation failed: input_shape: (1, 10000), minimum_shape: (chains=2, draws=4)
    mean      sd  hdi_3%  hdi_97%  mcse_mean  mcse_sd  ess_bulk  ess_tail  r_hat
x  0.212   0.675   -0.96    1.000      0.270    0.092       7.0      40.0    NaN
y  0.216   1.207   -2.09    2.455      0.262    0.040      22.0     119.0    NaN

NOISE SD: 10

CORRELATION: 0.06

arviz - WARNING - Shape validation failed: input_shape: (1, 10000), minimum_shape: (chains=2, draws=4)
    mean      sd   hdi_3%  hdi_97%  mcse_mean  mcse_sd  ess_bulk  ess_tail  r_hat
x  0.212   0.675   -0.960    1.000      0.270    0.092       7.0      40.0    NaN
y  0.252  10.072  -18.596   18.947      0.106    0.073    9097.0    9021.0    NaN
```
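The mechanism behind the decoupling is easy to check directly. For y = x + independent noise of scale sd, the standard signal-plus-noise result says every autocorrelation of y is the corresponding autocorrelation of x shrunk by the factor var(x) / (var(x) + sd²); since ESS grows as autocorrelations shrink, y's ESS climbs toward the raw draw count as the noise dominates. Here's a quick sketch checking the lag-1 autocorrelation against that predicted attenuation (this is my addition, not part of the script above):

```python
import numpy as np

def lag1_autocorr(z):
    # sample lag-1 autocorrelation of a series
    z = z - z.mean()
    return np.dot(z[:-1], z[1:]) / np.dot(z, z)

rng = np.random.default_rng(42)
t = np.arange(10_000) / 10_000
x = np.sin(2.0 * np.pi * 1.5 * t)  # same sinusoidal "chain" as above

for noise_sd in [0.1, 1.0, 10.0]:
    y = x + rng.normal(scale=noise_sd, size=x.shape[0])
    shrink = x.var() / (x.var() + noise_sd**2)  # predicted attenuation factor
    print(f"sd={noise_sd:5.1f}  rho1(x)={lag1_autocorr(x):.3f}  "
          f"rho1(y)={lag1_autocorr(y):.3f}  predicted={shrink * lag1_autocorr(x):.3f}")
```

At sd = 0.1 the autocorrelation of y is barely dented, while at sd = 10 it is shrunk by roughly a factor of 200, which is why the summary above reports ess_bulk near the full 10,000 draws for y.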
P.S. Thanks to GPT-5 for help with ArviZ; it's a game changer for me, as I can never remember the incantations in that build_idata code. I did review the final code, tightened it up a bit, and added the correlation output.