How to define priors in terms of only combinations of random variables?

Hello, I’m trying to define a model which uses something like a Boltzmann distribution for priors. For example, suppose I have two 2D particles with random coordinates r1 = (x1, y1) and r2 = (x2, y2), where x1, y1, x2, and y2 are random variables.

I want to specify a prior on these coordinates that looks something like:

exp(-H(x1, x2, y1, y2) / T), where H() is an arbitrary function of the coordinates. For example, it could compute the distance between particles 1 and 2 scaled by some factor.

Now here’s the important part: I have no priors on x1, x2, y1, y2 directly, only on functions of them. I’m a bit confused about how to declare these random variables and implement priors like the one above, and I haven’t been able to dig up any examples.
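To make it concrete, here is a rough sketch of the kind of model I have in mind. I’m assuming pm.Potential is the right way to inject the exp(-H/T) term, and I’m using a squared-distance H and T = 1 just as stand-ins:

```python
import pymc as pm
import pytensor.tensor as pt

T = 1.0  # placeholder temperature

with pm.Model() as model:
    # Deliberately uninformative "priors" on the raw coordinates, so the
    # Potential term below supplies essentially all of the prior density.
    r1 = pm.Flat("r1", shape=2)  # (x1, y1)
    r2 = pm.Flat("r2", shape=2)  # (x2, y2)

    # Stand-in energy: squared distance between the two particles.
    H = pt.sum((r1 - r2) ** 2)

    # pm.Potential adds an arbitrary term to the joint log-density,
    # i.e. the prior is proportional to exp(-H / T).
    pm.Potential("boltzmann", -H / T)
```

Though I suspect that with only a relative-distance term the prior is improper (nothing pins down the absolute positions), which is part of my confusion.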

I don’t have an answer, but here are some thoughts:

This seems like a situation where prior predictive sampling is going to be important. If you have some prior information on what the quantity z = exp(-H(x1, x2, y1, y2) / T) should be, you can set some “reasonable” priors on the positions, generate prior predictive samples of z, check whether they land in a plausible range, and iterate on that process until you converge on something.
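A minimal sketch of that loop, assuming PyMC v5, Normal starting-point priors on the coordinates, a squared-distance H, and T = 1 (all placeholders to iterate on):

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt

T = 1.0  # placeholder temperature

with pm.Model() as model:
    # Starting-point priors, to be revised after inspecting the draws.
    r1 = pm.Normal("r1", mu=0.0, sigma=1.0, shape=2)
    r2 = pm.Normal("r2", mu=0.0, sigma=1.0, shape=2)

    H = pt.sum((r1 - r2) ** 2)                # stand-in energy
    z = pm.Deterministic("z", pt.exp(-H / T))

    # Draw from the prior and inspect the implied distribution of z.
    idata = pm.sample_prior_predictive(1000)

z_draws = idata.prior["z"].values.ravel()
print(np.percentile(z_draws, [5, 50, 95]))
```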

I’d challenge you to think more deeply about having no priors on the position coordinates, though. Could you re-parameterize, so that (x2, y2) is expressed relative to (x1, y1) and you only need a prior on the distance between the particles? Are positions strictly positive? Do the particles all live on the unit square?
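For example, here is a sketch of one such re-parameterization, assuming a HalfNormal prior on the inter-particle distance and a uniform direction (the specific distributions are placeholders):

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt

with pm.Model() as model:
    # Particle 1 still needs *some* prior to anchor the absolute positions.
    r1 = pm.Normal("r1", mu=0.0, sigma=10.0, shape=2)

    # Put the prior where the information actually is: on the
    # inter-particle distance, with a uniform direction.
    dist = pm.HalfNormal("dist", sigma=1.0)
    angle = pm.Uniform("angle", lower=0.0, upper=2 * np.pi)

    # Particle 2 expressed relative to particle 1.
    r2 = pm.Deterministic(
        "r2", r1 + dist * pt.stack([pt.cos(angle), pt.sin(angle)])
    )
```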

There is also a new package by the ArviZ developers called preliz, which might be useful in your case. It is a toolbox for exploring priors and their implications for your model in an interactive way. Full disclaimer: I haven’t personally tried it out, but it looks cool. There is also an automatic prior-suggestion tool in PyMC, called pm.find_constrained_prior – there’s a tutorial video here.

pm.find_constrained_prior finds a prior for a single variable given constraints, so it doesn’t 100% cover your case: it can’t find a group of priors that jointly satisfy some constraint. But you could easily take inspiration from it and adapt the idea to your multivariate situation. Write a function that takes a vector of parameters (one set for each of x1, x2, y1, y2), samples from the corresponding priors, computes z, and then computes whatever moments of z you want to match. Pass that whole thing to scipy.optimize to get some reasonable priors, as in the sketch below.
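Roughly something like this (a sketch only; the shared Normal(mu, sigma) family for all four coordinates, the target moments, the squared-distance H, and T = 1 are all placeholder assumptions):

```python
import numpy as np
from scipy import optimize

T = 1.0                              # placeholder temperature
target_mean, target_sd = 0.5, 0.2    # whatever you believe about z a priori


def H(x1, x2, y1, y2):
    # Stand-in energy: squared distance between the particles.
    return (x1 - x2) ** 2 + (y1 - y2) ** 2


def loss(params, n=5000):
    # Shared Normal(mu, sigma) prior for all four coordinates; each
    # coordinate could get its own parameters instead.
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    # Fixed seed so the objective is deterministic in params; otherwise
    # the optimizer ends up chasing Monte Carlo noise.
    rng = np.random.default_rng(0)
    x1, x2, y1, y2 = rng.normal(mu, sigma, size=(4, n))
    z = np.exp(-H(x1, x2, y1, y2) / T)
    return (z.mean() - target_mean) ** 2 + (z.std() - target_sd) ** 2


result = optimize.minimize(loss, x0=[0.0, 0.0], method="Nelder-Mead")
mu_opt, sigma_opt = result.x[0], np.exp(result.x[1])
print(mu_opt, sigma_opt)
```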
