Sampling from a model where the observed data are numerically transformed from an RV

I’m trying to sample from a little model where we want to learn a ‘true’ four-digit code given some data which are noisy observations of it:

import numpy as np
import pymc as pm
import aesara.tensor as at

true = np.array([1, 2, 3, 4])

# 10 noisy readings of each of the 4 digits
observations = true[:, np.newaxis] + np.random.poisson(2, size=(4, 10))

with pm.Model() as model:
    code = pm.DiscreteUniform("true", 1, 9, size=(4,))
    shuffle = pm.Normal("shuffle", mu=code[:, None], sigma=4, size=(4, 10)).round()
    # this line raises: pm.Deterministic does not take an observed argument
    observed = pm.Deterministic("observed", at.mod(code[:, None] + shuffle, 9), observed=observations)
    idata = pm.sample()

but pm.Deterministic doesn’t accept observed values, and I can’t attach the observations directly to the Normal distribution because of the round and mod transformations.

Is it possible to learn this sort of model with PyMC?

You would need to either

  1. Figure out a likelihood for your data. I don’t see an obvious way to do that for a modulo operation, since infinitely many combinations could yield the same observation (although they become exponentially less likely). Perhaps you can reason in circular space and use something like a von Mises likelihood?

  2. Use a likelihood-free method like ABC (see the “Approximate Bayesian Computation” notebook in the PyMC example gallery).
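On option 1: for this particular generative story the infinite sum is actually tractable, because the rounded noise only takes integer values and the normal tails decay fast. Below is a sketch in plain NumPy/SciPy of the exact per-digit likelihood; `wrapped_likelihood` and the truncation window `n_wrap` are names and choices of mine, not PyMC API. Something like this could in principle be fed to the sampler via `pm.Potential`.

```python
import numpy as np
from scipy.stats import norm

def wrapped_likelihood(obs, code, sigma=4.0, n_wrap=20):
    """P(obs | code) for the model obs = (code + round(Normal(code, sigma))) % 9.

    Sums P(round(noise) == s) over every integer s in a truncated window
    with (code + s) % 9 == obs; the normal tails make the truncation
    error negligible.
    """
    total = 0.0
    for s in range(-9 * n_wrap, 9 * n_wrap + 1):
        if (code + s) % 9 == obs:
            # P(round(N(code, sigma)) == s) as a CDF difference
            total += norm.cdf(s + 0.5, loc=code, scale=sigma) - norm.cdf(s - 0.5, loc=code, scale=sigma)
    return total

# the nine residues exhaust the sample space, so these probabilities sum to ~1
print(sum(wrapped_likelihood(d, code=3) for d in range(9)))
```

With sigma as large as 4 relative to the period 9, the wrapped distribution is nearly uniform over the residues, which already hints that the posterior over the code will be weakly identified.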
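On option 2, the idea behind ABC: draw candidate codes from the prior, push them through the forward simulator, and keep the draws whose simulated data land close to the observations. PyMC packages this as `pm.Simulator` sampled with `pm.sample_smc`; the toy rejection version below shows the mechanics in plain NumPy. The distance function and the tolerance are arbitrary choices of mine for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
true_code = np.array([1, 2, 3, 4])

def simulate(code, n=10):
    # forward model from the question: rounded normal noise, then mod 9
    shuffle = np.round(rng.normal(loc=code[:, None], scale=4.0, size=(len(code), n)))
    return np.mod(code[:, None] + shuffle, 9)

observed = simulate(true_code)

def distance(a, b):
    # sort each row so the summary statistic ignores the order of the 10 readings
    return np.abs(np.sort(a, axis=1) - np.sort(b, axis=1)).mean()

accepted = [
    candidate
    for candidate in rng.integers(1, 10, size=(5000, 4))  # prior draws: digits 1..9
    if distance(simulate(candidate), observed) < 1.0      # tolerance (epsilon)
]
print(len(accepted))  # the accepted draws approximate the posterior over the code
```

Expect the accepted set to stay spread out: the mod-9 wraparound with sigma 4 destroys most of the information about the code, so even exact inference would leave a broad posterior.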