Input variable transforms with .dist()

I would like to scale the output of a Beta distribution and include it as a component of a mixture model with PyMC v4.1.7. I tried passing a transform via pm.Beta.dist(a, b, transform=...), but this is not supported. I also tried just scaling the dist via d = 10 * pm.Beta.dist(a, b), but that leads to "Component dist must be a distribution created via the .dist() API, got <class 'aesara.tensor.var.TensorVariable'>".
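For reference, the two attempts looked roughly like this (a minimal sketch; a and b are the Beta parameters defined in the full example below):

# attempt 1: pass a transform to .dist() -- not supported
d = pm.Beta.dist(a, b, transform=...)

# attempt 2: scale the dist -- this yields a TensorVariable, which Mixture rejects
d = 10 * pm.Beta.dist(a, b)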

Full example below that mocks up what I would like to do:

import numpy as np
import pymc as pm

with pm.Model() as model:
    mu_r = pm.TruncatedNormal("mu_r", mu=2.16, sigma=0.5, lower=0)
    ln_std_r = pm.TruncatedNormal("ln_std_r", -6, 5, lower=-10, upper=-1)
    std_r = pm.Deterministic("std_r", pm.math.exp(ln_std_r))

    a = pm.TruncatedNormal("a", mu=2., sigma=2, lower=0, initval=2.)
    b = pm.TruncatedNormal("b", mu=5., sigma=2, lower=0, initval=5.)
    
    w = pm.Dirichlet('w', a=np.array([1, 1]))
    dist1 = pm.TruncatedNormal.dist(mu_r, std_r, lower=0)
    dist2 = 10 * pm.Beta.dist(a, b)

    r = pm.Mixture(
        'r',
        w=w,
        comp_dists=[dist1, dist2],
        shape=len(data)  # data is the observed array (not shown here)
    )

Thanks for reading!


The reason Mixture requires pure distributions is that it needs to be able to resize them to match the Mixture size. You can, however, create a custom DensityDist.dist that works as a scaled Beta (if you install the latest version of PyMC from the GitHub repository, not yet released):

import numpy as np
import pymc as pm

def scaled_beta_logp(value, alpha, beta):
    # PyMC will add the jacobian for the 10* scaling
    return pm.logp(10 * pm.Beta.dist(alpha, beta), value)

def scaled_beta_random(alpha, beta, *, size, rng):
    return 10 * rng.beta(alpha, beta, size)

with pm.Model() as model:
    mu_r = pm.TruncatedNormal("mu_r", mu=2.16, sigma=0.5, lower=0)
    ln_std_r = pm.TruncatedNormal("ln_std_r", -6, 5, lower=-10, upper=-1)
    std_r = pm.Deterministic("std_r", pm.math.exp(ln_std_r))

    a = pm.TruncatedNormal("a", mu=2., sigma=2, lower=0, initval=2.)
    b = pm.TruncatedNormal("b", mu=5., sigma=2, lower=0, initval=5.)

    w = pm.Dirichlet('w', a=np.array([1, 1]))
    dist1 = pm.TruncatedNormal.dist(mu_r, std_r, lower=0)
    dist2 = pm.DensityDist.dist(
        a, b, 
        logp=scaled_beta_logp, 
        random=scaled_beta_random, 
        class_name="scaled_beta",
    )

    r = pm.Mixture(
        'r',
        w=w,
        comp_dists=[dist1, dist2],
        shape=(10,)
    )
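
If it helps, a quick sanity check (a minimal sketch, assuming the model above compiles; the idata name is just illustrative) is to draw prior predictive samples and confirm the mixture can produce values above 1, i.e. that the scaled-Beta component really lives on (0, 10):

with model:
    # unobserved variables, including the mixture r, end up in idata.prior
    idata = pm.sample_prior_predictive(samples=500)

print(idata.prior["r"].min().item(), idata.prior["r"].max().item())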

Thanks!