Unable to resolve inputs when sampling from posterior predictive

Hello fellow Bayesians,

I would like to get your thoughts on the following problem I’m experiencing.

My model samples from a Weibull distribution, where I select the index of the prior using an external variable X (an integer between 0 and 8). Like this:

X = theano.shared(x)
if NEW_POSTERIOR:
    with pm.Model() as model:

        # Priors for unknown model parameters
        alpha = pm.TruncatedNormal('alpha', mu=0.5, sd=10, lower=0.01, shape=9)
        beta = pm.TruncatedNormal('beta', mu=40, sd=100, lower=0.01, shape=9)
        loc = pm.Uniform('loc', lower=0.01, upper=100)

        # Likelihood (sampling distribution) of observations
        alphat = pm.Deterministic('alphat', alpha[X])
        betat = pm.Deterministic('betat', beta[X])
        Y_obs = pm.Weibull('Y_obs', alpha=alphat, beta=betat, observed=Y+loc)
        step = pm.NUTS()
        trace = pm.sample(10, cores=8, step=step, tune=100)

X.set_value(x)
ppc = pm.sample_posterior_predictive(trace, samples=500, model=model)

The input data have the following types: x = ndarray, X = TensorSharedVariable, Y = ndarray.
Sampling from the model works without any problem. However, when I try to draw samples from the posterior predictive, the following error occurs:

Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\helpers\pydev\pydevd.py", line 1741, in <module>
    main()
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\helpers\pydev\pydevd.py", line 1735, in main
    globals = debugger.run(setup['file'], None, None, is_module)
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\helpers\pydev\pydevd.py", line 1135, in run
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm Community Edition 2018.3.5\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "C:/Users/xxx/run_scripts/xxx/weibull_bayesian_categorical_TESTING.py", line 63, in <module>
    ppc = pm.sample_posterior_predictive(trace, samples=500, model=model)
  File "C:\Users\xxx\AppData\Local\Continuum\anaconda3\envs\xxx\lib\site-packages\pymc3\sampling.py", line 1135, in sample_posterior_predictive
    values = draw_values(vars, point=param, size=size)
  File "C:\Users\xxx\AppData\Local\Continuum\anaconda3\envs\xxx\lib\site-packages\pymc3\distributions\distribution.py", line 388, in draw_values
    raise ValueError('Cannot resolve inputs for {}'.format([str(params[j]) for j in to_eval]))
ValueError: Cannot resolve inputs for ['Y_obs']

I would really appreciate your feedback on my problem.

Thank you very much.

The value passed to the observed kwarg cannot depend on a random variable. You are supplying Y + loc, which ends up being a symbolic tensor because loc is a free random variable.
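If the intent is a location-shifted Weibull, one possible workaround (a sketch under my own assumptions, not the poster's code) is to keep Y as plain observed data and put the shift inside a hand-written log-likelihood, e.g. wrapped in pm.Potential or pm.DensityDist with theano ops. For an additive shift the Jacobian correction is zero, so the shifted log-density is just the Weibull log-density evaluated at Y + loc. The helper names below are hypothetical; here is the arithmetic in plain Python:

```python
import math

def weibull_logp(y, alpha, beta):
    # Two-parameter Weibull log-density in the same parameterization
    # as pm.Weibull (alpha = shape, beta = scale).
    return (math.log(alpha) - math.log(beta)
            + (alpha - 1.0) * (math.log(y) - math.log(beta))
            - (y / beta) ** alpha)

def shifted_weibull_logp(y, alpha, beta, loc):
    # Additive shift: z = y + loc has |dz/dy| = 1, so no Jacobian
    # term is needed -- just evaluate the density at y + loc.
    return weibull_logp(y + loc, alpha, beta)

# With alpha = 1 the Weibull reduces to Exponential(1/beta), whose
# log-density at z is -log(beta) - z/beta; here that is -2.5.
print(shifted_weibull_logp(2.0, alpha=1.0, beta=1.0, loc=0.5))  # -2.5
```

Inside a PyMC3 model the same expression would be written with tt.log and friends so it stays symbolic; the point is only that nothing symbolic needs to be passed to observed.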


Is it reasonable to supply a tensor as an observed value if the model is only used for posterior sampling, and never for posterior predictive sampling? I have a model that uses a tensor as an observed value, and it seemed to work until I applied pm.fast_sample_posterior_predictive() to it.

Yes, that would work. I remember I saw an example by @junpenglao where he passed a symbolic tensor as the observed value of a distribution. He had to manually add the Jacobian, though. Maybe he could share that reference?


Yeah, passing a tensor that is the computed result of some random variable is fine, since it is still represented as a tensor. But you need to account for the Jacobian when you do inference, because a transformation of a random variable is more than just a mapping of its value.

Here is a quick example: All-that-likelihood-with-PyMC3/Regression with a twist.ipynb at master · junpenglao/All-that-likelihood-with-PyMC3 · GitHub
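To illustrate why the Jacobian matters with a transformation where it actually bites (unlike the additive shift in the original question, where the correction is zero): if X ~ Exponential(1) and you observe Z = log(X), the density of Z is f_X(e^z) · e^z. Dropping the e^z Jacobian factor leaves a function that no longer integrates to 1. A quick numerical check in plain Python (the function names are mine):

```python
import math

def density_with_jacobian(z):
    # Z = log(X), X ~ Exponential(1): f_Z(z) = f_X(e^z) * |dx/dz| = e^(-e^z) * e^z
    x = math.exp(z)
    return math.exp(-x) * x

def density_without_jacobian(z):
    # Naive substitution without the Jacobian factor: f_X(e^z) alone.
    return math.exp(-math.exp(z))

def integrate(f, lo=-30.0, hi=10.0, n=200000):
    # Simple trapezoidal rule over a range covering essentially all the mass.
    h = (hi - lo) / n
    total = 0.5 * (f(lo) + f(hi))
    for i in range(1, n):
        total += f(lo + i * h)
    return total * h

print(round(integrate(density_with_jacobian), 4))     # 1.0
print(integrate(density_without_jacobian))            # far from 1 (grows with the range)
```

The first integral is a proper density; the second is not, which is exactly the kind of silent error you get if you pass a transformed tensor as observed without adding the log-Jacobian term to the model's logp.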
