Possible bug: `model.debug(fn='logp')` doesn't work with Potentials

I thought I'd give the new(ish) `model.debug(fn='logp')` a try on a particularly fiendish problem I have elsewhere (see the thread "Something changed in `pytensor > 2.12.3` (and thus `pymc > 5.6.1`) that makes my `pytensor.gradient.grad` call get stuck - any ideas?", #31 by jonsedar).

However, the `debug()` call errors out when the model contains a `pm.Potential`:

```
ValueError: Length of {name_of_potential} cannot be determined
```

I don't see this problem mentioned here or in the issue tracker, so I hope you can opine.

Hopefully this MRE helps describe it:

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
n = 100
y_obs = rng.normal(size=n)

with pm.Model() as mdl:
    y = pm.MutableData('y', y_obs)
    mu = pm.Normal('mu', mu=0.0, sigma=1.0)
    sigma = pm.InverseGamma('sigma', alpha=5.0, beta=4.0)
    norm_d = pm.Normal.dist(mu=mu, sigma=sigma, size=n)
    _ = pm.Potential('pot_yhat', pm.logp(norm_d, y))

# this works fine
mdl.debug(fn='random', verbose=True)

# this raises the ValueError
mdl.debug(fn='logp', verbose=True)
```

Also see this gist: 800_issue_potential_debug · GitHub

There seems to be something going on in https://github.com/pymc-devs/pytensor/blob/7bb18f3a3590d47132245b7868b3a4a6587a4667/pytensor/tensor/__init__.py#L59 that fails for Potentials. Possibly because this one has static shape `None`?

```
TensorType(float64, shape=(None,))
```

Feel free to open a GitHub issue, even though `debug` might not help much with Potentials.

Thanks, opened here: BUG: `model.debug(fn='logp')` doesn't work with Potentials · Issue #6966 · pymc-devs/pymc · GitHub