Hi @maxsch95 ,
there are some scenarios where caching could speed things up, but as soon as there's blocked sampling (CompoundStep) involved, caching would violate detailed balance.
In your example you pass `blocked=True`, so in every iteration the CompoundStep step method iterates over all blocks (the 2 Metropolis steppers assigned to a1 and a2) and steps them separately, before concluding the compound/blocked step.
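Conceptually it works like the toy sketch below. Note this is a hypothetical pure-Python illustration of the iterate-over-blocks idea, not the actual PyMC `CompoundStep` implementation, and `ToyStepper`/`ToyCompoundStep` are made-up names:

```python
class ToyStepper:
    """Stand-in for a single Metropolis stepper owning one block."""

    def __init__(self, name):
        self.name = name

    def step(self, point):
        # A real Metropolis stepper would propose and accept/reject here;
        # this toy just records that its variable was visited.
        point = dict(point)
        point[self.name] += 1
        return point


class ToyCompoundStep:
    """Stand-in for a compound step wrapping several sub-steppers."""

    def __init__(self, steppers):
        self.steppers = steppers

    def step(self, point):
        # Every sub-stepper updates its own block before the
        # compound step is considered complete.
        for stepper in self.steppers:
            point = stepper.step(point)
        return point


compound = ToyCompoundStep([ToyStepper("a1"), ToyStepper("a2")])
point = {"a1": 0, "a2": 0}
for _ in range(3):
    point = compound.step(point)
print(point)  # both blocks were stepped in every iteration
```

The point is that each block's update depends on the current values of the others, which is why caching evaluations across iterations would break detailed balance.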
> I also noticed that the model is evaluated once if I load an existing trace using pm.load_trace. Why is that needed?
I'm not familiar with that one, and there's a high chance that we'll break or remove it at some point. If you have good reasons against `return_inferencedata=True`, please let us know so we can try to iron out the blockers.
cheers