I am new to pymc but have been playing around with some examples and was wondering if there is a neat way to combine these two concepts.
I have a model with an intercept and a number of predictors, and have been able to implement the regularised horseshoe for variable ‘selection’ following this article.
What I would like to do now is combine this with the approach outlined in this article, where the author discusses variable interaction/correlation/covariance (sorry, I couldn’t link to the exact spot in the article) and introduces the Cholesky decomposition to model it.
I can see how I could model this correlation structure without the variable selection, but I am struggling to see how I could do both in the same model.
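To make the combination I have in mind concrete, here is a minimal NumPy sketch (not PyMC code; the matrix, scales, and names are all illustrative): draw raw coefficients from a standard normal, correlate them via a Cholesky factor, then apply elementwise horseshoe-style shrinkage scales. In the actual model the Cholesky factor and the shrinkage scales would of course have their own priors rather than fixed values.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4  # number of predictors (illustrative)

# Correlation structure: a fixed Cholesky factor L. In a PyMC model this
# would come from a prior over correlation matrices instead.
corr = np.array([[1.0, 0.5, 0.0, 0.0],
                 [0.5, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 1.0, -0.3],
                 [0.0, 0.0, -0.3, 1.0]])
L = np.linalg.cholesky(corr)

# Horseshoe-style shrinkage scales (illustrative fixed values; in the
# model these would be tau * lambda_tilde with their own priors).
shrink = np.array([1.0, 0.8, 0.01, 0.01])

# Non-centred construction: z ~ N(0, I), correlate with L, shrink elementwise.
z = rng.standard_normal(p)
beta = shrink * (L @ z)

print(beta.shape)  # (4,)
```

The idea is that `L @ z` carries the correlation structure while `shrink` does the selection, so the two concepts act on the same coefficient vector. My question is whether this is a sensible way to put them together in one PyMC model, or whether the two priors interfere with each other.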
At this stage my model is not hierarchical (just modelling one group), but I would like to work towards that at a later date.
Any pointers would be very much appreciated.