Beta or sigmoid regression that can handle 0 and 1

Sorry if I got a bit carried away there with terminology. PPL refers to probabilistic programming languages like PyMC3, but really I'm referring to Bayesian approaches, which shine in cases of finite sampling.

Average precision is a scoring metric that approximates the area under the precision-recall curve (see https://scikit-learn.org/stable/modules/generated/sklearn.metrics.average_precision_score.html for a primer). It ranges from 0 to 1, inclusive.
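As a quick illustration (the labels and scores here are made up), computing it with scikit-learn looks like:

```python
from sklearn.metrics import average_precision_score

# Hypothetical binary labels and predicted scores
y_true = [0, 0, 1, 1]
y_scores = [0.1, 0.4, 0.35, 0.8]

ap = average_precision_score(y_true, y_scores)
print(ap)  # a value in [0, 1]; higher is better
```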

If you're interested in why we use a fudge factor when doing regression on the interval (0, 1), see the paper 'A Better Lemon Squeezer? Maximum-Likelihood Regression With Beta-Distributed Dependent Variables' (Smithson & Verkuilen, 2006). Basically, it avoids cases where the link function is undefined, such as the logit of exactly 0 or 1.
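The transformation that paper proposes squeezes the endpoints slightly into the open interval: y' = (y * (n - 1) + 0.5) / n, where n is the sample size. A minimal sketch:

```python
import numpy as np

def squeeze(y, n):
    """Smithson & Verkuilen (2006) transformation: maps values in
    [0, 1] into the open interval (0, 1) so that link functions
    like the logit are defined everywhere."""
    y = np.asarray(y, dtype=float)
    return (y * (n - 1) + 0.5) / n

y = np.array([0.0, 0.25, 1.0])
y_sq = squeeze(y, len(y))
# 0 maps to 0.5/3 and 1 maps to 2.5/3, both strictly inside (0, 1)
```

The shift shrinks toward 0.5 and vanishes as n grows, so for large samples the transformed values are nearly indistinguishable from the originals.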

Logistic regression isn't what I'm after here, since there are no binary outputs: the dependent variable ranges continuously between 0 and 1, inclusive.