Okay, so I think I may have figured out how to include this value as `pm.Deterministic('l', rand_stm.normal(avg=mean, std=scale))`, but now I'm getting this error:
<__main__.LogLikeWithGrad object at 0x7f3e9c2b0950> returned the wrong number of gradient terms.
I assume the number of gradient terms should correspond to the dimension of the parameter space I'm working with. I've checked that this is the case, so I'm a little confused about what's wrong here. A similar problem is mentioned in this post, but I'm not sure whether it was ever resolved. I'll keep trying to figure out what I'm missing, but if anyone has any perspective on how adding this deterministic changes the implementation of the gradient, that would be really helpful.
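For reference, the shape contract I'm assuming is that the gradient returns exactly one partial derivative per parameter. Here's a toy finite-difference sketch of that idea (this is not my actual `LogLikeWithGrad` Op code; the likelihood, data, and function names here are made up purely to illustrate the shape check):

```python
import numpy as np

def log_likelihood(theta):
    # Hypothetical Gaussian log-likelihood over some fixed toy data
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    data = np.array([0.1, -0.3, 0.2])
    return -0.5 * np.sum(((data - mu) / sigma) ** 2) - data.size * np.log(sigma)

def grad_log_likelihood(theta, eps=1e-6):
    # Central finite differences: one partial derivative per parameter,
    # so the returned array always has len(theta) entries
    theta = np.asarray(theta, dtype=float)
    grads = np.empty_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = eps
        grads[i] = (log_likelihood(theta + step)
                    - log_likelihood(theta - step)) / (2 * eps)
    return grads

theta = np.array([0.0, 0.0])
g = grad_log_likelihood(theta)
assert g.shape == theta.shape  # one gradient term per parameter
```

My understanding is that the "wrong number of gradient terms" error fires when the list returned by the Op's `grad` method doesn't have one entry per input, which is why I'm wondering whether wrapping the value in a deterministic effectively adds an input I'm not accounting for.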