I am working on a roughly weekly series on modern implementations of gradient-based samplers. Three parts are out so far, and I am happy to discuss them here or in the issues of the GitHub repo that accompanies the articles (minimc).
This series is strongly influenced by PyMC3’s implementation, and I am using it as a testbed for ideas for PyMC4 and for improvements to PyMC3.
Please let me know your thoughts!
- Part I: Exercises in automatic differentiation using autograd and jax, which gives background on automatic differentiation
- Part II: Hamiltonian Monte Carlo from scratch, which gives a basic implementation of HMC in around 20 lines of Python
- Part III: Step Size Adaptation in Hamiltonian Monte Carlo, which presents improvements to the basic algorithm, giving a roughly 10x speedup
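For readers who have not used jax before, the kind of automatic differentiation Part I covers can be illustrated in a few lines. This is just a generic sketch, not code from the articles: `jax.grad` turns a scalar-valued Python function into a function that returns its gradient, which is exactly what a gradient-based sampler needs from a log density.

```python
import jax
import jax.numpy as jnp

def neg_log_density(x):
    # Negative log density of a standard normal, up to a constant: 0.5 * sum(x^2)
    return 0.5 * jnp.sum(x ** 2)

# jax.grad builds the gradient function via automatic differentiation
grad_fn = jax.grad(neg_log_density)

# For this density the gradient is just x itself
print(grad_fn(jnp.array([1.0, -2.0])))  # → [ 1. -2.]
```

The same pattern works with autograd by swapping `jax` for `autograd` and `jax.numpy` for `autograd.numpy`.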
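To give a flavor of what "HMC in around 20 lines" looks like, here is a minimal sketch of the standard algorithm (leapfrog integration plus a Metropolis accept/reject step). The actual implementation lives in the minimc repo; the function names and defaults below are my own choices for illustration.

```python
import numpy as np

def leapfrog(q, p, dVdq, step_size, n_steps):
    """Integrate Hamiltonian dynamics with the leapfrog scheme."""
    p = p - step_size * dVdq(q) / 2        # half step in momentum
    for _ in range(n_steps - 1):
        q = q + step_size * p              # full step in position
        p = p - step_size * dVdq(q)        # full step in momentum
    q = q + step_size * p
    p = p - step_size * dVdq(q) / 2        # final half step in momentum
    return q, -p                           # negate momentum for reversibility

def hmc(n_samples, neg_log_prob, dVdq, initial_q, step_size=0.1, n_steps=20):
    """Draw samples from exp(-neg_log_prob) using Hamiltonian Monte Carlo."""
    q = np.asarray(initial_q, dtype=float)
    samples = []
    for _ in range(n_samples):
        p0 = np.random.standard_normal(q.shape)   # resample momentum
        q_new, p_new = leapfrog(q, p0, dVdq, step_size, n_steps)
        # Metropolis accept/reject on the total energy H = V(q) + |p|^2 / 2
        start_h = neg_log_prob(q) + 0.5 * p0 @ p0
        new_h = neg_log_prob(q_new) + 0.5 * p_new @ p_new
        if np.log(np.random.rand()) < start_h - new_h:
            q = q_new
        samples.append(q)
    return np.array(samples)
```

For example, sampling a standard normal amounts to `hmc(2000, lambda q: 0.5 * q @ q, lambda q: q, np.array([0.0]))`. Part III's step size adaptation tunes `step_size` automatically rather than leaving it as a fixed argument.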