I have defined a “clipping” distribution transformation like this:
```python
from pymc3.distributions.transforms import ElemwiseTransform
import aesara.tensor as at
import numpy as np


class MvClippingTransform(ElemwiseTransform):
    name = "MvClippingTransform"

    def __init__(self, lower=None, upper=None):
        if lower is None:
            lower = float("-inf")
        if upper is None:
            upper = float("inf")
        self.lower = lower
        self.upper = upper

    def backward(self, x):
        return x

    def forward(self, x):
        return at.clip(x, self.lower, self.upper)

    def forward_val(self, x, point=None):
        return np.clip(x, self.lower, self.upper)

    def jacobian_det(self, x):
        # The backward transformation of clipping as I've defined it is the
        # identity function (perhaps that will change). The Jacobian
        # determinant of the identity is 1, so log(abs(1)) -> 0.
        return at.zeros(x.shape)
```
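As a standalone sanity check (outside of PyMC, with made-up example values), the numpy clipping used in `forward_val` does behave as intended, so the transform's value-space logic itself seems fine:

```python
import numpy as np

# Hypothetical example values, not taken from the model above:
lower = float("-inf")
upper = np.array([1.0, 2.0, 3.0])  # per-dimension upper bounds

x = np.array([0.5, 5.0, -4.0])
clipped = np.clip(x, lower, upper)
# clipped == [0.5, 2.0, -4.0]: only the second element exceeds its bound
```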
And I have applied it to a MvNormal with an LKJ Cholesky prior like this:
```python
import importlib, clipping
importlib.reload(clipping)

with pm.Model() as m:
    # Taken from https://docs.pymc.io/pymc-examples/examples/case_studies/LKJ.html
    # compute_corr=True also unpacks the Cholesky matrix in the returns
    # (otherwise we'd have to unpack it ourselves)
    chol, corr, stds = pm.LKJCholeskyCov(
        "chol", n=3, eta=2.0, sd_dist=pm.Exponential.dist(1.0), compute_corr=True
    )
    cov = pm.Deterministic("cov", chol.dot(chol.T))

    μ = pm.Uniform("μ", -10, 10, shape=3, testval=samples.mean(axis=0))

    # Renamed from `clipping` to avoid shadowing the imported module
    clip_transform = clipping.MvClippingTransform(lower=None, upper=upper_truncation)

    mv = pm.MvNormal(
        "mv", mu=μ, chol=chol, shape=3, transform=clip_transform, observed=samples
    )

    trace = pm.sample(
        random_seed=44, init="adapt_diag", return_inferencedata=True, target_accept=0.9
    )
    ppc = pm.sample_posterior_predictive(trace, var_names=["mv"], random_seed=42)
```
(`upper_truncation` is a numpy array.)
Now, I have generated simulated data by defining a covariance matrix for a multivariate normal distribution, drawing samples, and applying clipping to the draws, to get this:

[plot of the clipped simulated data]
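For reference, the simulation step looks roughly like this (the exact means, covariance, and bounds below are hypothetical stand-ins, since the actual values aren't shown in the post):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ground-truth parameters for illustration only
true_mu = np.array([0.0, 1.0, -1.0])
true_cov = np.array([
    [1.0, 0.5, 0.2],
    [0.5, 2.0, 0.3],
    [0.2, 0.3, 1.5],
])
upper_truncation = np.array([1.0, 1.5, 0.5])

# Draw from the multivariate normal, then clip each dimension at its upper bound
raw = rng.multivariate_normal(true_mu, true_cov, size=1000)
samples = np.clip(raw, None, upper_truncation)
```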
But when I sample from the PPC, there is no clipping. (I'd embed a second image showing this, but new users aren't allowed to.)
Even if I set the clipping bounds to [0, 0, 0], it still doesn't work.
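An upper bound of [0, 0, 0] makes the missing clipping easy to detect numerically: if the transform were applied, every posterior-predictive draw would be elementwise non-positive. A quick check along these lines (using random draws as a stand-in for `ppc["mv"]`) shows what I mean:

```python
import numpy as np

# Stand-in for ppc["mv"]; in practice this would be the actual PPC draws
ppc_draws = np.random.default_rng(0).normal(size=(500, 3))

# With an upper bound of [0, 0, 0] applied, this fraction should be exactly 1.0;
# anything less means the clipping never happened
fully_clipped_fraction = (ppc_draws <= 0).all(axis=1).mean()
```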
Why isn’t the PPC (or the parameter sampling, for that matter) reflecting the clipping transformation?