Regularizing the posterior of a parameter with the KL divergence from a different posterior

What you want is called a hierarchical model—I’d call it the bread-and-butter of Bayesian modeling. It doesn’t need KL divergence. Here’s a simple PyMC example:

and here’s my longer Stan example along the same lines, using the same classic Efron and Morris data:

The usual way to control the posterior is through the prior, unless you want to change the likelihood.
