Thanks for the example using xarray-einstats! Unfortunately, I'm trying to calculate the log_likelihood on a dataset consisting of X1, X2, Y with 591,591 rows. After sampling a single chain of 1,000 draws, and even after casting the constant_data and posterior xarrays from float64 to float32, the step mu = post["α"] + post["β_1"] * const["X_1"] + post["β_2"] * const["X_2"] tries to allocate an array of shape (1, 1000, 591591, 591591), about 1.24 PiB of memory. Is there another way to break this computation into chunks?
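For what it's worth, here is the kind of chunking I have in mind, as a plain-NumPy sketch with tiny made-up stand-in arrays (alpha, beta1, beta2, x1, x2, y, and sigma are hypothetical placeholders for the real posterior/constant data, and a Gaussian likelihood is assumed purely for illustration). The idea is to evaluate mu and the pointwise log-likelihood one block of observations at a time, so only (draw, block)-sized arrays are ever materialized:

```python
import numpy as np

# Tiny hypothetical stand-ins (the real problem has draw=1000, obs=591591).
rng = np.random.default_rng(0)
n_draws, n_obs = 50, 200
alpha = rng.normal(size=(n_draws, 1))  # posterior draws, shape (draw, 1)
beta1 = rng.normal(size=(n_draws, 1))
beta2 = rng.normal(size=(n_draws, 1))
x1 = rng.normal(size=n_obs).astype(np.float32)
x2 = rng.normal(size=n_obs).astype(np.float32)
y = rng.normal(size=n_obs).astype(np.float32)
sigma = 1.0  # assumed known scale, for illustration

chunk = 64  # observations per block; tune to available memory
loglik = np.empty((n_draws, n_obs), dtype=np.float32)
for start in range(0, n_obs, chunk):
    stop = min(start + chunk, n_obs)
    # mu for this block only: shape (draw, block), never (draw, obs, obs)
    mu = alpha + beta1 * x1[start:stop] + beta2 * x2[start:stop]
    loglik[:, start:stop] = (
        -0.5 * np.log(2 * np.pi * sigma**2)
        - 0.5 * ((y[start:stop] - mu) / sigma) ** 2
    )
```

With the real sizes, each block would cost roughly 1000 × chunk × 4 bytes, so the peak memory is controlled by the chunk size rather than by the full dataset.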