State of the Art samplers? Computational Lebesgue Integration techniques?

From a cursory look around, I saw that the parallel tempering algorithm was once implemented under the name Adaptive Transitional Markov Chain Monte Carlo and was later merged into the SMC sampler.

(Parallel Tempering in ``pymc3`` · Issue #821 · pymc-devs/pymc · GitHub)

But in that same issue, osvaldo points out that SMC and PT are different.

Is there any intention to add a PT sampler? From some work I did yesterday, I found that parallel tempering implemented in JAX is even faster than the NumPyro sampler.

I’m new to this measure-theoretic way of sampling, so is this a case of efficient linear algebra via JAX, or is PT actually quite good for the more complex problems? I can run a 50-variable, 10,000-sample regression in 11 seconds, which is not too bad!
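To make the idea concrete, here is a minimal parallel-tempering sketch in JAX (a toy illustration, not the benchmark code above and not any PyMC/NumPyro API): random-walk Metropolis on a toy bimodal target at a ladder of inverse temperatures, with Metropolis swaps between adjacent replicas. The target, the temperature ladder, and the step size are all illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def log_prob(x):
    # Toy bimodal target: mixture of two Gaussians centred at -3 and +3.
    return jnp.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

betas = jnp.array([1.0, 0.5, 0.25, 0.1])  # inverse temperatures; beta=1 is the target
n_temps = betas.shape[0]

def rw_step(key, x, beta, step=1.0):
    # One random-walk Metropolis step on the tempered target beta * log_prob.
    key_prop, key_acc = jax.random.split(key)
    prop = x + step * jax.random.normal(key_prop, x.shape)
    log_alpha = beta * (log_prob(prop) - log_prob(x))
    accept = jnp.log(jax.random.uniform(key_acc)) < log_alpha
    return jnp.where(accept, prop, x)

def swap_step(key, xs):
    # Propose one swap between a randomly chosen pair of adjacent temperatures.
    key_pair, key_acc = jax.random.split(key)
    i = jax.random.randint(key_pair, (), 0, n_temps - 1)
    lp_i, lp_j = log_prob(xs[i]), log_prob(xs[i + 1])
    log_alpha = (betas[i] - betas[i + 1]) * (lp_j - lp_i)
    accept = jnp.log(jax.random.uniform(key_acc)) < log_alpha
    swapped = xs.at[i].set(xs[i + 1]).at[i + 1].set(xs[i])
    return jnp.where(accept, swapped, xs)

@jax.jit
def pt_step(key, xs):
    # Advance every replica in parallel, then attempt one replica exchange.
    key_rw, key_swap = jax.random.split(key)
    keys = jax.random.split(key_rw, n_temps)
    xs = jax.vmap(rw_step)(keys, xs, betas)
    return swap_step(key_swap, xs)

def run(key, n_steps=5000):
    xs = jnp.zeros(n_temps)
    def body(xs, key):
        xs = pt_step(key, xs)
        return xs, xs[0]  # keep only the beta=1 (target) chain
    keys = jax.random.split(key, n_steps)
    _, samples = jax.lax.scan(body, xs, keys)
    return samples

samples = run(jax.random.PRNGKey(0))
print(samples.mean(), samples.std())
```

The per-temperature moves are vmapped and the whole loop is a `lax.scan`, so everything JIT-compiles to one fused program, which is where most of the speed comes from rather than from anything clever about the sampler itself.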

Some more ideas come from here: Computational Lebesgue Integration

Has anyone had any luck with these different sampling techniques? Are they more efficient?

I am looking to find the fastest, most robust sampler. If it exists in a paper and just needs to be implemented, that's fine.

This has come up a few times (e.g., here and here). My understanding of parallel tempering/replica exchange is that it is particularly useful for multimodal posteriors rather than as a generic approach to sampling (but I could be very wrong). Performance of samplers is often difficult to compare, but there is obvious interest in any performance gains that may be available.
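For reference, the standard replica-exchange swap rule (nothing PyMC-specific) accepts an exchange between states $x_i$ and $x_j$ at inverse temperatures $\beta_i > \beta_j$ with probability

$$
\alpha = \min\Big(1,\ \exp\big[(\beta_i - \beta_j)\,(\log \pi(x_j) - \log \pi(x_i))\big]\Big).
$$

The hotter chains sample a flattened version of the posterior and hop between modes easily, and accepted swaps let those mode jumps propagate down to the $\beta = 1$ chain, which is why the method tends to pay off mainly on multimodal targets.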