Contributing Zero-Inflated Beta Distribution

Hello PyMC community,

I am planning to contribute a Zero-Inflated Beta distribution to PyMC. Such a contribution would be useful for modeling proportions/rates on [0,1) with structural zeros, common in healthcare and ecological applications.

Background:

Proposed Implementation:

  • Location: pymc/distributions/continuous.py
  • Pattern: Follow ZeroInflatedPoisson and HurdleGamma (direct logp, NOT pm.Mixture)
  • Parameterization: mu/kappa (mean/precision) like brms
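To make the proposed direct-logp pattern concrete, here is a minimal numpy/scipy sketch of the zero-inflated Beta log-density under the mu/kappa parameterization. This is only an illustration of the math: the actual contribution would express this with PyTensor ops inside a distribution's `logp`, and the function and parameter names (`zib_logpdf`, `psi`) are assumptions, not PyMC API.

```python
import numpy as np
from scipy import stats

def zib_logpdf(x, psi, mu, kappa):
    """Log-density of a zero-inflated Beta on [0, 1).

    psi is the structural-zero probability; conditional on x > 0,
    x ~ Beta(alpha, beta) with mean mu and precision kappa.
    """
    # mean/precision -> canonical Beta shape parameters
    alpha = mu * kappa
    beta = (1.0 - mu) * kappa
    if x == 0.0:
        return np.log(psi)
    # continuous component, weighted by the non-zero probability
    return np.log1p(-psi) + stats.beta.logpdf(x, alpha, beta)
```

In the PyMC version the branch on `x == 0` would become a symbolic `switch`, following the pattern in the existing zero-inflated discrete distributions.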

Questions for the community:

  1. Do you prefer alpha/beta or mu/kappa parameterization?
  2. Should I implement just zero-inflated or also zero-one-inflated beta (ZOIB)?
  3. Preferred name: ZeroInflatedBeta or BetaZI?
  4. Any specific design patterns I should follow?

I have studied ZeroInflatedPoisson and HurdleGamma implementations.
Would greatly appreciate a mentor/review for this contribution!

Thanks,
Noah Shin


@ricardoV94 @twiecki @tcapretto Hey, all. @nshin0911 is an intern at my company and a very talented teenage mathematician. He wanted to apply his mathematical skills to code, so I gave him the project of adding some additional distributions to PyMC (and hopefully to Bambi) that I think would be helpful for the community. I would appreciate any help you could provide him. I think he would be a very productive member of the PyMC community.


Both. Parameterizations are usually converted to a canonical representation internally, so supporting both is only a little extra work.
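For example, converting the mu/kappa (mean/precision) parameterization to the canonical alpha/beta shapes is a one-line mapping (hypothetical helper name; this is the same mapping brms uses, alpha = mu * kappa and beta = (1 - mu) * kappa):

```python
def mu_kappa_to_alpha_beta(mu, kappa):
    """Convert a Beta mean/precision pair to canonical shape parameters.

    mu in (0, 1) is the mean, kappa > 0 the precision;
    alpha / (alpha + beta) == mu and alpha + beta == kappa.
    """
    return mu * kappa, (1.0 - mu) * kappa
```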

I suggest going straight to the zero-one-inflated version but allowing zero_p / one_p to be zero, so that a zero-inflated or one-inflated beta falls out implicitly.

We tend to prefer verbose, more readable names, so ZeroOneInflatedBeta?

Try to do what the Hurdle distributions are doing, and reuse as much of the existing code as possible to reduce the maintenance burden. If there's code there that you need but that isn't accessible from where your new distribution will live, refactor it out.

Thanks @zweli. Personally vouching for someone is really helpful these days: there's been an uptick in LLM-generated work from people with little interest in the project, and we have to be more selective about who we devote mentoring effort to.

Will keep @nshin0911 on the radar. If some work goes unnoticed for too long, don't hesitate to tag me personally, either of you.


Thank you. I appreciate it.