Shared standard deviation in pymc.LKJCholeskyCov

Hello,

I’m interested in using pymc.LKJCholeskyCov to estimate the covariance matrix for the varying intercepts in a model I am building. To use this function, you provide the dimensionality of the covariance matrix n as well as a prior for the standard deviations (via the sd_dist parameter).
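For concreteness, a typical call looks roughly like this (the number of intercepts and the Exponential prior are just placeholders):

```python
import pymc as pm

n = 4  # placeholder: number of varying intercepts

with pm.Model() as model:
    # sd_dist assigns each dimension its own standard deviation prior,
    # so the standard deviations are free to differ across the intercepts
    sd_dist = pm.Exponential.dist(1.0, shape=n)
    chol, corr, sds = pm.LKJCholeskyCov(
        "chol", n=n, eta=2.0, sd_dist=sd_dist, compute_corr=True
    )
```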

What I would like to do in my model is assume that the varying intercepts all share the same standard deviation, so that the only source of covariance is the correlation amongst the intercepts. Is it possible to enforce such an assumption when using pymc.LKJCholeskyCov? I am aware that I could use the pymc.LKJCorr prior instead, which would let me define the covariance matrix in the way described above, but I am having trouble getting pymc.LKJCorr to work with more than a few varying intercepts. In addition, I would like to use a non-centered parameterization for the intercepts. A sketch of what I have been attempting is below.
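For reference, this is roughly the manual construction I have been trying with pymc.LKJCorr. The dimensions, the priors, and the triangular-index bookkeeping are my own guesses at the right pattern, so please treat it as a sketch rather than a working implementation:

```python
import numpy as np
import pymc as pm
import pytensor.tensor as pt
from pytensor.tensor.slinalg import cholesky

n = 4          # placeholder: number of varying intercepts
n_groups = 20  # placeholder: number of groups

with pm.Model() as model:
    # A single standard deviation shared by all intercepts
    sigma = pm.HalfNormal("sigma", 1.0)

    # LKJCorr returns the packed off-diagonal correlations as a flat vector
    corr_packed = pm.LKJCorr("corr_packed", n=n, eta=2.0)

    # Rebuild the full correlation matrix from the packed entries
    triu = np.triu_indices(n, k=1)
    corr = pt.set_subtensor(pt.zeros((n, n))[triu], corr_packed)
    corr = pm.Deterministic("corr", corr + corr.T + pt.eye(n))

    # With equal standard deviations, the Cholesky factor of the covariance
    # is just sigma times the Cholesky factor of the correlation matrix
    chol = sigma * cholesky(corr)

    # Non-centered parameterization: standard normals scaled by the Cholesky factor
    z = pm.Normal("z", 0.0, 1.0, shape=(n_groups, n))
    intercepts = pm.Deterministic("intercepts", z @ chol.T)
```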

Do you have any suggestions on how I can use pymc.LKJCholeskyCov this way, or will I need to do some manual work to have the intercepts share the same standard deviation?

Thank you for your help!

I think that might be a degenerate case given how LKJCholeskyCov is defined, but maybe not. Perhaps @aseyboldt can give clearer advice.