Reading through Radford Neal’s 2012 review of HMC, “MCMC Using Hamiltonian Dynamics,” I got stumped by the following:
In Bayesian statistics, the posterior distribution for the model parameters is the usual focus of interest, and hence these parameters will take the role of the position, q. We can express the posterior distribution as a canonical distribution (with T = 1) using a potential energy function defined as follows:
U(q) = -\log[\pi(q) L(q|D)]

where \pi(q) is the prior density, and L(q|D) is the likelihood function given data D.
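For context, if I follow the setup correctly, the canonical distribution Neal refers to has the form P(x) = \frac{1}{Z} \exp(-E(x)/T), so with T = 1 this choice of potential energy means the position variables are sampled from

P(q) \propto \exp(-U(q)) = \pi(q) L(q|D).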
Am I wrong in thinking that the expression inside \log[\cdot] should be the posterior distribution P(q|D), and thus \pi(q) L(D|q)? Note that I am using the preprint available on arXiv, so it may differ from the actual book, the Handbook of Markov Chain Monte Carlo.
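To spell out my reasoning: by Bayes' theorem,

P(q|D) = \frac{\pi(q) L(D|q)}{P(D)} \propto \pi(q) L(D|q),

so I expected the second factor inside the log to have its arguments the other way around.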
Thank you for any clarification!