Prior predictive sampling with transformed RV

I see. It seems that prior and posterior predictive sampling are not "first-class citizens" (to borrow a term from programming-language people) in the PyMC3 language.

One thing remains unclear to me: if we have "internal" transformed variables (e.g., hyper-parameters in a hierarchical model), will only those variables' samples come back untransformed, or will the predictive samples of the observed variables be wrong as well?

E.g., if one has a truncated normal RV as a free RV, and it is used to give the mean of a child variable, will that child variable's value also be wrong? Or does the Theano graph ensure that the value passed to the child variable is the transformed one?

Like this:
I = [1, 2, 3]
μ ~ TruncatedNormal(μ=I, σ=1)
o ~ TruncatedNormal(μ=μ, σ=1)

[sorry for such a stupid example]
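To make the toy model concrete, here is what correct forward (ancestral) sampling of it would look like, done by hand with scipy rather than PyMC3. This is only a sketch of my intent: I am assuming both TruncatedNormals are truncated below at 0 (the bounds were omitted above), and the helper `truncnorm_rvs` is just for this illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
I = np.array([1.0, 2.0, 3.0])

def truncnorm_rvs(mu, sigma, lower, size, rng):
    # scipy's truncnorm takes its bounds in standard-deviation units
    a = (lower - mu) / sigma
    return stats.truncnorm.rvs(a, np.inf, loc=mu, scale=sigma,
                               size=size, random_state=rng)

# Correct ancestral sampling: μ is drawn from the *truncated* prior,
# and that constrained value is what parameterizes the child o.
mu = truncnorm_rvs(I, 1.0, 0.0, I.shape, rng)
o = truncnorm_rvs(mu, 1.0, 0.0, mu.shape, rng)

# Every draw respects the truncation at 0
assert (mu >= 0).all() and (o >= 0).all()
```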
Now even if I apply the transform to "o" myself, the samples would still be wrong if μ is not transformed.

I don't know whether the value that Theano transmits from μ to o is the transformed value or the raw value (in this case, presumably a sample from an untruncated Normal). Which is it?

If Theano transforms variables like μ, then this can be worked around relatively easily. But if it does not, the results of sampling could be not just slightly wrong, but arbitrarily wrong.
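To illustrate how badly things could go wrong: if the raw (untransformed) value of μ were forwarded to the child, the child's mean could land in a region the truncated prior assigns zero probability. A pure numpy/scipy sketch of the failure mode, with an assumed lower bound of 0 and an assumed prior mean of 0.1:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lower, prior_mu, prior_sigma = 0.0, 0.1, 1.0

# "Raw" draws ignore the truncation: a plain Normal(0.1, 1)
raw_mu = rng.normal(prior_mu, prior_sigma, size=10_000)

# Correct draws respect the truncation at 0
a = (lower - prior_mu) / prior_sigma
trunc_mu = stats.truncnorm.rvs(a, np.inf, loc=prior_mu, scale=prior_sigma,
                               size=10_000, random_state=rng)

# A large fraction of the raw draws violate the constraint, so any child
# variable parameterized by them would be sampled from the wrong place.
print("invalid raw draws:  ", (raw_mu < lower).mean())
print("invalid trunc draws:", (trunc_mu < lower).mean())  # 0.0 by construction
```

With the prior mean this close to the bound, nearly half of the raw draws fall below it, so the downstream samples of o are not merely biased but drawn from an entirely different distribution.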