Thanks @ricardoV94 ! I tried your `size=1000` `.dist()` approach first, and the model graphviz now successfully shows arrows connecting my RVs to the likelihood variable, but I got the following error – is this PyMC complaining that my logp is stochastic, as you expected?
While trying `logp_fn = model.compile_logp()`:
```
/opt/anaconda3/envs/pymc/lib/python3.11/site-packages/pymc/logprob/joint_logprob.py:190: UserWarning: Found a random variable that was neither among the observations nor the conditioned variables: [multivariate_normal_rv{1, (1, 2), floatX, False}.0, multivariate_normal_rv{1, (1, 2), floatX, False}.out]
  warnings.warn(
/opt/anaconda3/envs/pymc/lib/python3.11/site-packages/pymc/logprob/joint_logprob.py:190: UserWarning: Found a random variable that was neither among the observations nor the conditioned variables: [uniform_rv{0, (0, 0), floatX, False}.0, uniform_rv{0, (0, 0), floatX, False}.out]
  warnings.warn(
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[45], line 1
----> 1 logp_fn = model.compile_logp()

File /opt/anaconda3/envs/pymc/lib/python3.11/site-packages/pymc/model.py:671, in Model.compile_logp(self, vars, jacobian, sum)
    652 def compile_logp(
    653     self,
    654     vars: Optional[Union[Variable, Sequence[Variable]]] = None,
    655     jacobian: bool = True,
    656     sum: bool = True,
    657 ) -> PointFunc:
    658     """Compiled log probability density function.
    659
    660     Parameters
    (...)
    669         Defaults to True.
    670     """
--> 671     return self.model.compile_fn(self.logp(vars=vars, jacobian=jacobian, sum=sum))

File /opt/anaconda3/envs/pymc/lib/python3.11/site-packages/pymc/model.py:759, in Model.logp(self, vars, jacobian, sum)
    757 rv_logps: List[TensorVariable] = []
    758 if rvs:
--> 759     rv_logps = joint_logp(
    760         rvs=rvs,
    761         rvs_to_values=self.rvs_to_values,
    762         rvs_to_transforms=self.rvs_to_transforms,
    763         jacobian=jacobian,
    764     )
    765     assert isinstance(rv_logps, list)
    767 # Replace random variables by their value variables in potential terms

File /opt/anaconda3/envs/pymc/lib/python3.11/site-packages/pymc/logprob/joint_logprob.py:308, in joint_logp(rvs, rvs_to_values, rvs_to_transforms, jacobian, **kwargs)
    305     value_var = rvs_to_values[rv]
    306     logp_terms[value_var] = temp_logp_terms[value_var]
--> 308 _check_no_rvs(list(logp_terms.values()))
    309 return list(logp_terms.values())

File /opt/anaconda3/envs/pymc/lib/python3.11/site-packages/pymc/logprob/joint_logprob.py:265, in _check_no_rvs(logp_terms)
    255 unexpected_rv_nodes = [
    256     node
    257     for node in pytensor.graph.ancestors(logp_terms)
    (...)
    262     )
    263 ]
    264 if unexpected_rv_nodes:
--> 265     raise ValueError(
    266         f"Random variables detected in the logp graph: {unexpected_rv_nodes}.\n"
    267         "This can happen when DensityDist logp or Interval transform functions "
    268         "reference nonlocal variables."
    269     )

ValueError: Random variables detected in the logp graph: [uniform_rv{0, (0, 0), floatX, False}.out, multivariate_normal_rv{1, (1, 2), floatX, False}.out, uniform_rv{0, (0, 0), floatX, False}.out].
This can happen when DensityDist logp or Interval transform functions reference nonlocal variables.
```
I am not familiar with `BlockedStep` – why is that much more complicated machinery necessary? I noticed that wrapping the two `pm.draw` or `.dist(size=1000)` calls inside `pm.Deterministic` didn't work either – is there a way to make `pm.Deterministic` work here? In fact, it would be nice to have these sets of 1000 "nuisance" parameters saved in the final trace (but only if that is not expensive).