Multivariate Laplace Prior

You could build an MvLaplace by analogy with MvNormal. This is probably not equivalent to the "textbook" multivariate Laplace (there is likely more than one definition), but it could be useful.

You do this by defining a vector of independent univariate Laplace variables and applying a matrix transformation to introduce correlations. Something like:

import pytensor.tensor as pt
import pymc as pm

from pymc.distributions.shape_utils import rv_size_is_none

def mv_laplace_dist(mu, cov, size):
    # Append the event dimension (length m) to the requested batch size
    if rv_size_is_none(size):
        size = mu.shape[-1]
    else:
        size = pt.concatenate([size, [mu.shape[-1]]])

    # Correlate iid standard Laplace draws via the Cholesky factor of cov
    chol = pt.linalg.cholesky(cov)
    x = pm.Laplace.dist(mu=0, b=1, size=size)
    return mu + (chol @ x[..., None]).squeeze(-1)

mu = [1, 2]
cov = [[1, 0.5], [0.5, 2]]

mv_laplace = pm.CustomDist.dist(
    mu,
    cov,
    dist=mv_laplace_dist, 
    shape=(2,),
    signature="(m),(m,m)->(m)",
)

Colab Notebook: Google Colab
Docs: CustomDist — PyMC dev documentation

Note: for logp-based (MCMC) sampling this needs the next release version of PyMC, which for now you can install directly from GitHub.

For actual use, you probably want to parametrize directly with the Cholesky factor, in which case you can skip the pt.linalg.cholesky call in the dist function.
