Series of posts on implementing Hamiltonian Monte Carlo

Ah yeah, sorry: so MCMC is a family of algorithms for sampling from a distribution when all you have is its (possibly unnormalized) log probability. We of course have efficient ways of generating samples from a normal or Poisson or whatever distribution, but MCMC can still be used to sample from these.
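
To make that concrete, here is a minimal sketch of random-walk Metropolis (the simplest MCMC algorithm, not HMC) targeting a standard normal through nothing but its unnormalized log density. The names metropolis and log_prob are just illustrative, not from the posts:

import numpy as np

# unnormalized log density of N(0, 1); the -0.5 * log(2 * pi)
# constant can be dropped since MCMC never needs it
def log_prob(x):
  return -0.5 * x ** 2

def metropolis(log_prob, n_samples=5000, step_size=1.0, init=0.0):
  samples = np.empty(n_samples)
  x, lp = init, log_prob(init)
  for i in range(n_samples):
    # propose a symmetric random-walk move
    proposal = x + step_size * np.random.randn()
    lp_proposal = log_prob(proposal)
    # accept with probability min(1, p(proposal) / p(x))
    if np.log(np.random.rand()) < lp_proposal - lp:
      x, lp = proposal, lp_proposal
    samples[i] = x
  return samples

samples = metropolis(log_prob)
print(samples.mean(), samples.std())  # roughly 0 and 1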

More commonly, as you point out, MCMC is used for distributions where we do not have efficient samplers, especially in Bayesian inference, where the posterior p(theta | data) can typically only be evaluated up to a normalizing constant. You could do that here if you wanted! For example, here is 1-dimensional linear regression:

import numpy as np
import jax.numpy as jnp
import jax.scipy.stats as jst

# something like y = 2x + noise
x = np.arange(10)
y = 2 * x + np.random.randn(10)

# fit a model y = ax + N(0, 1), with a ~ N(0, 1)
def neg_log_prob(a):
  log_prior = jst.norm.logpdf(a, 0., 1.)
  log_likelihood = jnp.sum(jst.norm.logpdf(y, a * x, 1.))
  return -(log_prior + log_likelihood)
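
A sampler then only ever sees neg_log_prob (HMC additionally wants its gradient, which jax.grad gives for free). As a hedged sketch of the wiring, reusing the illustrative metropolis function from above (slow without jit, but fine for a demo):

import jax

# the sampler expects a log prob, so flip the sign back
samples = metropolis(lambda a: -neg_log_prob(a), step_size=0.1)
print(samples.mean())  # posterior mean of the slope, close to 2

# for HMC you would also need the gradient of the target
grad_fn = jax.grad(neg_log_prob)
print(grad_fn(2.0))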