Multi-Nonlinear Robust Regression

I was able to reproduce the problem locally thanks to your code. I don’t think there is anything logically wrong with your model per se. It’s just a hard inference problem: the functional form you are trying to estimate, with multiple nested exponents, is tricky, and your priors are working against finding the correct parameters.

The Cauchy prior on amp0 is pretty tight around 0, while the true parameter is fairly far away, at -14 or 14. The tails of the Cauchy are also quite wide, so once we get far from 0 there is a huge range of parameter values with nearly equal prior probability. If the chain doesn’t locate -14 quickly, it starts looking for weird combinations of parameters that can work. It turns out that if you crank down the nu parameter on the StudentT, there are a bunch of weird combinations of parameters that do work. I found that when my chains failed to find -14, they would also anchor on nu values near 1. That makes the likelihood nearly flat: getting the mean close to the data doesn’t improve the likelihood very much when the StudentT is that flat.
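Here is a quick illustration (standalone, not your model) of what I mean by the likelihood going flat. It just compares StudentT log-densities at a few residual sizes for different nu values using scipy; with nu near 1 a residual of 20 costs almost nothing compared to a residual of 0, whereas with larger nu it is punished heavily.

```python
import numpy as np
from scipy import stats

# Log-density of a StudentT at increasingly bad residuals, for several nu.
# With nu ~ 1 the penalty for a large residual is tiny, so the sampler can
# "explain" bad mean estimates by driving nu down instead of fixing amp0.
residuals = np.array([0.0, 1.0, 5.0, 20.0])
for nu in [1.0, 4.0, 30.0]:
    logp = stats.t.logpdf(residuals, df=nu, loc=0.0, scale=1.0)
    print(f"nu={nu:>4}: logp at residuals {residuals} -> {np.round(logp, 2)}")
```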

Why does it show up in the negative case but not the positive case? Well, sometimes it does show up in the positive case too, so I think bad luck or bad random seeds were leading to this odd behaviour.

There are a bunch of ways you might fix this. The priors could give amp0 a better clue of where to look, or you could switch away from a StudentT outcome distribution. Your data isn’t very noisy, so the extra flexibility of the StudentT hurts you rather than helps you. A sketch of both changes is below.
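Something along these lines, in PyMC syntax. This is only a sketch: the variable names (x, y, amp0, rate) and the single-exponential mean function are placeholders, so swap in your actual data and functional form. The two substantive changes are a wider prior on amp0 that easily reaches values around ±14, and a Normal likelihood instead of the StudentT.

```python
import pymc as pm

with pm.Model() as model:
    # Wider prior so the sampler can comfortably reach values around +/-14.
    amp0 = pm.Normal("amp0", mu=0.0, sigma=20.0)
    rate = pm.HalfNormal("rate", sigma=1.0)

    # Placeholder mean function - substitute your nested-exponent form here.
    mu = amp0 * pm.math.exp(-rate * x)

    # Data isn't very noisy, so a Normal likelihood removes the escape hatch
    # that a low-nu StudentT provides.
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=y)

    idata = pm.sample()
```

If you want to keep the StudentT for robustness, a softer version of the same idea is to put a prior on nu that keeps it away from 1 (e.g. a shifted Gamma or Exponential plus an offset), so the likelihood can’t flatten out completely.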
