Glad to see I wasn’t absurdly far off, then.
Regarding your last question: halving whatever prior is set on the remaining points (half-normal, half-Cauchy, …) is appropriate to impose symmetry.
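The halving can be sketched in plain NumPy: a half-normal is just the distribution of |y| for y ~ Normal(0, σ), so folding the symmetric prior keeps its scale but confines the coordinate to one side of the axis (variable names here are mine, purely for illustration).

```python
import numpy as np

# Folding a Normal(0, sigma) prior into a Half-Normal(sigma) keeps the
# same scale but restricts the coordinate to one side of the axis,
# removing one mirror symmetry from the solution.
rng = np.random.default_rng(0)
sigma = 2.0
y_full = rng.normal(0.0, sigma, size=10_000)  # symmetric prior: both signs occur
y_half = np.abs(y_full)                       # folded: one side only

print((y_full < 0).any(), (y_half >= 0).all())
```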
However! It has since occurred to me (demonstrating just how alien it feels not to care about the center) that you’ve only solved half of the reflective-symmetry problem if you’ve set one point at the origin and another to (0, y): the solution is then still subject to mirroring across the axis through those two points. Hence I came to the following solution, now fully embracing the discomfort, which I find more elegant overall.
You could set two (almost) random points at arbitrary locations, say (1, 0) and (0, 1), make the scale of the solution an RV with a non-informative prior, and nudge the center of mass firmly below the line defined by y = 1 − x with some clever, otherwise very uninformative prior. In fact, I would specifically implement a Potential that is −inf if the center of mass has y > 1 − x, and 0 otherwise. You then only have to worry that the center’s area of likelihood does not overlap with y = 1 − x, i.e. that it’s sufficiently triangular with respect to the two points (hence the “almost” random). Extra caveat: if the distance between these two very specific points happens to be very uncertain, your traces will make waves as the scale RV goes up and down.
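The Potential’s logic is simple enough to sketch in plain NumPy (in a PPL like PyMC you’d express the same test with the backend’s math ops and wrap it in `pm.Potential`; the function name here is mine):

```python
import numpy as np

def center_below_line_logp(points):
    """Log-potential: 0.0 if the center of mass lies strictly below the
    line y = 1 - x, -inf otherwise.

    `points` is an (n, 2) array of solution coordinates. Adding this to
    the model's log-probability rules out the mirrored solution entirely.
    """
    cx, cy = points.mean(axis=0)
    return 0.0 if cy < 1.0 - cx else -np.inf
```

Note that a hard −inf potential will stall gradient-based samplers whenever the constraint is active, which is exactly why, as above, the center’s likelihood mass must stay well clear of the line.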
But the solution is otherwise truly unique.
However 2! If you intend to use batches, this doesn’t work at all, as you keep repeating the same two points in every batch, which can significantly bias the model. In that case the second approach you suggest seems eminently sensible. I only wonder how noisy something like the angle of the vector sum would be within a sample, or from batch to batch. You’re entering experimental territory as far as I’m concerned, but I like it.
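For what the vector-sum idea might look like per batch: compute the angle of the summed point vectors and rotate the whole configuration so that sum lies along +x, which removes the rotational degree of freedom (a hypothetical NumPy sketch; `vector_sum_angle` and `align` are my names, not anything from your setup):

```python
import numpy as np

def vector_sum_angle(points):
    """Angle (radians) of the sum of all point vectors in an (n, 2) array."""
    sx, sy = points.sum(axis=0)
    return np.arctan2(sy, sx)

def align(points):
    """Rotate the configuration so its vector sum points along +x.

    Applies a rotation by -angle; distances between points are preserved,
    only the arbitrary orientation of the batch is removed.
    """
    a = vector_sum_angle(points)
    c, s = np.cos(a), np.sin(a)
    R = np.array([[c, s], [-s, c]])  # rotation matrix for angle -a
    return points @ R.T
```

The within-sample noise you worry about would show up directly as jitter in `vector_sum_angle` from batch to batch.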
Try both?