Error implementing GLM submodule to full code syntax

I apologize if the following is repetitive or already obvious to you, but I would like to state precisely why you are getting these results.

The reason you're not getting divergences anymore is that you're now using a stronger prior with sd=10, rather than your first choice of tau=1e-6, which (since tau is a precision, sd = 1/sqrt(tau)) is equivalent to a very weak prior with sd=1000. I should clarify that the reason this model is a little pathological is that your data allow you to predict the type of movement perfectly, with zero error (this is known as complete separation). It has nothing to do with transformations of your variables: every single outcome with value 1 is associated with log_cantidad > 5. When this happens, the coefficients can zoom off to infinity without any ill effect on the model fit, and numerical instability becomes much more likely. In this scenario, only the strength of the prior prevents divergences.

TL;DR: you have large weights because your data are weird (perfectly separable), and it has nothing to do with transformations. See the sketch below.
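To illustrate what I mean, here is a minimal sketch (assuming PyMC; the data are simulated to reproduce the separation you describe, and the variable names such as `log_cantidad` are just placeholders borrowed from your post, not your actual model):

```python
import numpy as np
import pymc as pm

# Simulated data with complete separation:
# the outcome is 1 exactly when log_cantidad > 5.
rng = np.random.default_rng(42)
log_cantidad = rng.uniform(0, 10, size=200)
y = (log_cantidad > 5).astype(int)

with pm.Model():
    # Stronger prior (sd=10): keeps the coefficient finite and the
    # sampler stable.  Swapping in sigma=1000 (equivalent to the
    # precision tau=1e-6, since sd = 1/sqrt(tau)) lets the coefficient
    # drift toward infinity and typically produces divergences.
    intercept = pm.Normal("intercept", mu=0, sigma=10)
    beta = pm.Normal("beta", mu=0, sigma=10)

    pm.Bernoulli("obs", logit_p=intercept + beta * log_cantidad, observed=y)
    idata = pm.sample()
```

With the weak prior the likelihood alone never penalizes larger and larger coefficients, so only the prior keeps the posterior well behaved.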
