How to Set Parameters Distribution For Bayesian Neural Network With PyMC3

Not simon_o here, but some thoughts:

  1. As mentioned previously, using a prior over the weights is equivalent to regularizing the cost function. In particular, a Gaussian prior over the weights leads to L2 regularization. Section 7.5 of “Machine Learning: A Probabilistic Perspective” by Kevin Murphy gives a good derivation of it. Here’s a good StackExchange explanation too.

  2. That’s right, HalfNormal is intentionally a Normal truncated at zero, so its support is non-negative. If you want to shift that boundary, you can just add a scalar to your random variable. For instance,
    `mu = 5 + pm.HalfNormal('mu', sd=1)` so that the boundary is now at 5 instead of 0.

  3. First, you may want to use sample_posterior_predictive() since sample_ppc() is deprecated. Second, you may find the hpd() function helpful, which computes the highest posterior density (credible) interval for you from the samples.

  4. I don’t totally understand your question, but this example may be helpful. To my understanding (someone should correct me if I’m wrong), the function runs the generative model forwards: it takes the parameter draws already in your trace (i.e., samples from the posterior), plugs each one into the likelihood you defined, and carries the computation through the model to output simulated data — a sample from the posterior predictive distribution.
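
Point 1 can be checked numerically. Here is a small NumPy sketch (the data, `sigma`, and `tau` values are mine, purely illustrative): the MAP estimate under a Gaussian prior with std `tau` is exactly the ridge (L2-regularized) solution with penalty `lam = sigma**2 / tau**2`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data (illustrative values, not from the thread).
n, d = 50, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
sigma = 0.5  # likelihood noise std
tau = 1.0    # Gaussian prior std on the weights
y = X @ w_true + rng.normal(scale=sigma, size=n)

def neg_log_posterior(w):
    # -log p(w | y) up to an additive constant:
    # Gaussian likelihood term + Gaussian prior term.
    return (0.5 / sigma**2) * np.sum((y - X @ w) ** 2) + (0.5 / tau**2) * np.sum(w**2)

# Ridge (L2-regularized least squares) closed form, with lam = sigma^2 / tau^2.
lam = sigma**2 / tau**2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# The ridge solution should also minimize the negative log posterior:
for _ in range(100):
    w_perturbed = w_ridge + rng.normal(scale=0.1, size=d)
    assert neg_log_posterior(w_ridge) <= neg_log_posterior(w_perturbed)
```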
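
The shifting trick in point 2 is easy to sanity-check without PyMC3, since a HalfNormal(sd=1) draw is distributed like the absolute value of a Normal(0, 1) draw:

```python
import numpy as np

rng = np.random.default_rng(1)

# HalfNormal(sd=1) draws, simulated as |Normal(0, 1)|; adding 5 moves the
# lower boundary from 0 to 5 — the same effect as writing
# mu = 5 + pm.HalfNormal('mu', sd=1) inside a PyMC3 model.
half_normal_draws = np.abs(rng.normal(loc=0.0, scale=1.0, size=100_000))
shifted_draws = 5.0 + half_normal_draws
```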
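
To make point 3 concrete, here is a rough sketch of what hpd() computes for a unimodal 1-D posterior: the *narrowest* interval containing a given fraction of the samples (the function name and the 94% mass are my choices, not PyMC3's API):

```python
import numpy as np

def hpd_interval(samples, cred_mass=0.94):
    """Narrowest interval containing `cred_mass` of the samples —
    the highest posterior density interval for a unimodal 1-D posterior."""
    sorted_s = np.sort(samples)
    n = len(sorted_s)
    k = int(np.floor(cred_mass * n))        # samples inside the interval
    widths = sorted_s[k:] - sorted_s[: n - k]
    i = np.argmin(widths)                   # narrowest window of k samples
    return sorted_s[i], sorted_s[i + k]

rng = np.random.default_rng(2)
# Stand-in for posterior samples of a parameter: Normal(10, 2) draws.
samples = rng.normal(loc=10.0, scale=2.0, size=50_000)
lo, hi = hpd_interval(samples)
```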
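
And a minimal sketch of the forward-sampling idea in point 4, assuming a toy model y ~ Normal(mu, sigma) with known sigma (the posterior draws here are faked with a Normal, standing in for an MCMC trace):

```python
import numpy as np

rng = np.random.default_rng(3)

# Pretend these are draws of mu from an MCMC trace (posterior samples).
mu_posterior_draws = rng.normal(loc=2.0, scale=0.1, size=4_000)
sigma = 1.0  # known observation noise

# Posterior predictive: for each posterior draw of the parameters,
# run the likelihood forwards to simulate a new observation.
y_ppc = rng.normal(loc=mu_posterior_draws, scale=sigma)

# The predictive spread combines posterior uncertainty in mu
# with the observation noise sigma.
```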

Finally, not to be too pedantic, but you’re referring to these as “confidence intervals” when in fact these are credible intervals. It’s worth pointing out because the way these intervals are constructed and their resulting interpretations are very different.
