Hi Troy,
Thanks for participating! From a quick glance, I think I’d have two comments:
- I’d use more informative priors than uniform distributions, especially for GPs. This will help guard against overfitting and generally help the sampler – here are good recommendations for priors. For these reasons, uniforms are very rarely a good choice, and they are actually very informative at the boundaries – they put virtually infinite weight there; see this for details.
- Any reason you’re not using PyMC3’s GP module? It is optimized for performance and takes care of a lot of technical details for you – the sketch below shows both suggestions together.
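
To make that concrete, here’s a minimal sketch of what I mean – weakly informative priors on the GP hyperparameters combined with `pm.gp.Marginal`. The data `X`, `y` and the specific prior parameters are just placeholders; you’d swap in your own:

```python
import numpy as np
import pymc3 as pm

# Placeholder 1-D data – replace with your own X (n, 1) and y (n,)
X = np.linspace(0, 10, 50)[:, None]
y = np.sin(X).ravel() + 0.3 * np.random.randn(50)

with pm.Model() as model:
    # Weakly informative priors instead of Uniform:
    # InverseGamma keeps the lengthscale away from 0 and from values
    # much larger than the span of the data.
    ls = pm.InverseGamma("ls", alpha=3.0, beta=3.0)
    # HalfNormal softly regularizes the amplitude and the noise.
    eta = pm.HalfNormal("eta", sigma=2.0)
    sigma = pm.HalfNormal("sigma", sigma=1.0)

    # Covariance function and marginal GP from PyMC3's GP module
    cov = eta ** 2 * pm.gp.cov.ExpQuad(input_dim=1, ls=ls)
    gp = pm.gp.Marginal(cov_func=cov)

    # The latent GP is integrated out analytically here
    y_ = gp.marginal_likelihood("y", X=X, y=y, noise=sigma)

    trace = pm.sample(1000, tune=1000, target_accept=0.9)
```

After sampling, `gp.conditional` plus `pm.sample_posterior_predictive` will give you predictions at new inputs, so you don’t have to code the predictive equations yourself.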
Hope this helps!