Hamiltonian MC with different variable dimensions

Hello,
I have been trying for some time now to use pymc3 and the HMC sampler to create a posterior sample of some model parameters.
My model takes a bunch of input parameters, all of them initialized via pm.Uniform(…).
The problem is that they live on very different scales (an expected sigma of 0.01 for parameter A vs. 5.0 for parameter B).
I can provide a guess of the parameter covariance matrix.
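For reference, a stripped-down sketch of the kind of setup I mean (the names, ranges, and numbers are just placeholders):

import numpy as np
import pymc3 as pm

with pm.Model() as model:
    # two parameters on very different scales (made-up ranges)
    A = pm.Uniform("A", lower=0.0, upper=0.1)      # expected to vary on the order of ~0.01
    B = pm.Uniform("B", lower=-50.0, upper=50.0)   # expected to vary on the order of ~5
    # my rough guess of the posterior covariance of (A, B)
    Cov = np.diag([0.01**2, 5.0**2])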

What is the proper way to initialize the sampler in this case? How do I make the sampler move in each dimension according to its scale?

At the moment I try something like this:
step = pm.HamiltonianMC(scaling=Cov[::-1, ::-1], is_cov=False, ...)
(I discovered that the variable order gets reversed during initialization, hence the [::-1, ::-1].)

I would really appreciate it if someone could help me out and tell me what the proper way to do something like this is.

Cheers
N40

Not sure what your intentions are, but in general we discourage users from using HMC directly, as it needs extra care in choosing hyperparameters; and since internally we use a slightly different parameterization, finding the right parameters and their mapping to the input arguments is not trivial - it just ends up being really frustrating.

So I suggest you use pm.sample with the default options; hyperparameters like the mass matrix and step size will then be tuned automatically.

If for some reason you really need to use HMC, you can run the default sampler and get the hyperparameters from the trace.
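For example (a sketch; the exact names of the recorded sampler statistics can differ a bit between pymc3 versions):

import pymc3 as pm

with pm.Model():
    pm.Normal("x", mu=0.0, sd=1.0)
    # default: NUTS, with step size and (diagonal) mass matrix tuned automatically
    trace = pm.sample(1000, tune=1000)

# the tuned hyperparameters end up in the sampler statistics, e.g. the step size
step_sizes = trace.get_sampler_stats("step_size")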

My intention is to use HMC as an alternative to Metropolis-Hastings.
I already tried to dig into the HMC code.

Does that mean that the mass matrix is adapted during the tuning phase?

Do I need to run HMC, NUTS, or Metropolis in order to get an estimate of the mass matrix from the trace?

If my covariance matrix (scaling matrix) is diagonal, does that simplify the problem?
I tried to understand the documentation and the code, but I don't understand this sentence:

The inverse mass, or precision matrix. One dimensional arrays are interpreted as diagonal matrices. If is_cov is set to True, this will be interpreted as the mass or covariance matrix.

Could you help me by clarifying it?

In that case, dynamic HMC like NUTS is your friend; just call pm.sample(1000, tune=1000) and let pymc3 take care of everything :wink:

Yes

Well, you don't need to, as the mass matrix adaptation that pymc3 and Stan use is a somewhat complex windowed update that is a pain to implement yourself (plus we have already implemented and tested it).

Yes, in fact pymc3 currently uses a diagonal mass matrix by default.
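You can also request the diagonal adaptation explicitly (a sketch; the init names are the ones used in recent pymc3 versions):

import pymc3 as pm

with pm.Model():
    pm.Normal("x", mu=0.0, sd=1.0)
    # "jitter+adapt_diag" (the default) and "adapt_diag" both adapt a diagonal mass matrix
    trace = pm.sample(1000, tune=1000, init="adapt_diag")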

As for the arguments used to initialize the potential, I think this code snippet should clarify things a bit:
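Roughly, in terms of usage (a sketch based on the docstring quoted above; the model and its parameters are placeholders, and the exact internal handling differs between pymc3 versions):

import numpy as np
import pymc3 as pm

# scaling is read according to is_cov:
#   is_cov=False (default): scaling is the precision / inverse mass matrix
#   is_cov=True:            scaling is the covariance / mass matrix
# a 1-D array is taken as the diagonal of that matrix

guess_var = np.array([0.01**2, 5.0**2])   # guessed posterior variances, one per (transformed) free variable

with pm.Model():
    pm.Uniform("A", lower=0.0, upper=0.1)
    pm.Uniform("B", lower=-50.0, upper=50.0)
    # NOTE: the entries of scaling have to line up with the order in which pymc3
    # stores the (transformed) free variables, which is why you saw the reversed
    # ordering in your experiment
    step = pm.HamiltonianMC(scaling=guess_var, is_cov=True)
    trace = pm.sample(1000, step=step)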

But again, DO NOT use this option as it is suboptimal!
