Using LKJCorr together with MvNormal

I believe the extra parameters are all orthogonal. They are essentially the diagonal values that determine the sd vector. Anecdotally, I’m currently sampling huge models that contain it with n=20, and everything runs just fine with decent convergence stats. So it should be fine.
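To make the point above concrete, here is a small NumPy sketch (not PyMC code, just an illustration of the underlying algebra) of how the extra diagonal parameters act as a standard-deviation vector alongside an LKJ-style correlation matrix: the covariance is `diag(sd) @ R @ diag(sd)`, so the scales and the correlations parameterize separate pieces.

```python
import numpy as np

# Hypothetical sd vector (the "extra" diagonal parameters) and a
# valid correlation matrix R, e.g. one drawn from an LKJ prior.
sd = np.array([0.5, 1.0, 2.0])
R = np.array([[1.0, 0.3, 0.1],
              [0.3, 1.0, 0.2],
              [0.1, 0.2, 1.0]])

# The implied covariance combines the two sets of parameters.
cov = np.diag(sd) @ R @ np.diag(sd)

# The diagonal of the covariance recovers the variances, so the extra
# parameters really are just scales, separate from the correlations.
assert np.allclose(np.sqrt(np.diag(cov)), sd)

# Dividing by the outer product of sd recovers R exactly.
assert np.allclose(cov / np.outer(sd, sd), R)
```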

I do agree it’s not great, though, so if @jessegrabowski can crib it from TFP, that would definitely be preferable. I’ll write up a proper issue report in the meantime.


Issue created:

I tagged both of you there, and linked back to this thread as well.


How does it behave if the dummy prior you give is a HalfFlat?

I haven’t tested it, but I’m a bit skeptical of it being a good idea. The main reason is that we divide by sd, so both values close to zero and very large values can cause numerical instability. Having a prior that keeps them in a Goldilocks zone therefore makes some theoretical sense.
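As a hedged illustration of that numerical concern: recovering correlations means dividing the covariance by the sd values, so an sd pushed far toward zero (which a HalfFlat prior does nothing to discourage) underflows the denominator and produces NaNs. A contrived NumPy example:

```python
import numpy as np

# An extreme sd value a flat prior would not rule out.
sd = np.array([1e-200, 1.0])
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])

# sd[0]**2 = 1e-400 underflows to 0.0, so cov[0, 0] is exactly 0.
cov = np.diag(sd) @ R @ np.diag(sd)

# Dividing back out to recover correlations hits 0/0 -> nan.
with np.errstate(divide="ignore", invalid="ignore"):
    corr = cov / np.outer(sd, sd)

print(corr[0, 0])  # nan: the division is numerically unstable
```

The off-diagonal entries here survive, but the diagonal is already lost; a prior that keeps sd away from the extremes avoids this regime entirely.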

Interesting question, though. I’m guessing you ask because the logp is essentially a constant in that case, so there is slightly less computation?

Yes, but more importantly I was interested in checking how “orthogonal” the two sets of parameters really are. For example, I expect you would start to get divergences if they are “orthogonal”, but not otherwise.

I do not suggest using it!