I am new to PyMC3 and am interested in implementing Bayesian sparse logistic regression, likely with an L1 regularizer. Has anyone seen a Jupyter notebook somewhere that demonstrates this? If so, I’d really appreciate a pointer.
We already talked about that in this post.
In a nutshell, you won’t find a one-to-one equivalent of the L1 penalty in the Bayesian framework: putting sensible priors on your parameters is already a form of regularization. (The closest analogue is a Laplace prior on the coefficients, whose MAP estimate coincides with L1-penalized maximum likelihood.) So regularization isn’t a binary choice here, where you either regularize or you don’t. It’s more like a knob you adjust: through your priors you penalize the parameters more or less, depending on your use case.
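To make the Laplace/L1 connection concrete, here is a small sketch (not from the linked post; all names are illustrative) verifying numerically that the negative log-density of independent Laplace(0, b) priors equals an L1 penalty with weight 1/b plus a constant, which is exactly why MAP estimation under Laplace priors behaves like L1-regularized fitting:

```python
import numpy as np
from scipy import stats

b = 0.5  # Laplace scale; smaller b means stronger shrinkage
p = 5    # number of coefficients
rng = np.random.default_rng(0)

for _ in range(3):
    beta = rng.normal(size=p)
    # Negative log prior under independent Laplace(0, b) priors on beta.
    neg_log_prior = -stats.laplace.logpdf(beta, loc=0.0, scale=b).sum()
    # L1 penalty with weight 1/b, plus the normalizing constant p * log(2b).
    l1_penalty = np.abs(beta).sum() / b + p * np.log(2 * b)
    print(np.isclose(neg_log_prior, l1_penalty))  # prints True each time
```

In PyMC3 itself this corresponds to using `pm.Laplace` (instead of, say, `pm.Normal`) as the prior on the coefficients of a logistic model with a `pm.Bernoulli` likelihood; for stronger sparsity people often reach for hierarchical shrinkage priors such as the horseshoe instead. Note that while the MAP point estimate matches L1, the full posterior under a Laplace prior does not concentrate exact zeros the way the lasso does.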
Hope this helps