There does not seem to be. I guess pm.Bound can be applied more generally to different distributions. It made me wonder if there is any advantage to using one over the other, such as performance.
The truncated normal has an additional normalization term so that it is a proper pdf and integrates to 1; the bounded distribution does not.
This usually does not matter when the variables are unobserved and have no other unobserved ancestors, but it can matter otherwise. That is why you can use TruncatedNormal as a likelihood but not Bound.
Also for this reason, PyMC (v4) does not allow you to do prior or posterior predictive sampling with bounded variables. In general you should use the truncated distributions, and we will likely deprecate Bound soon.
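For example (a minimal sketch with made-up data; the variable names are purely illustrative), the truncated version works both as an observed likelihood and with prior predictive sampling:

import numpy as np
import pymc as pm

# Made-up positive-valued observations
data = np.array([0.3, 1.2, 0.8, 2.1, 0.5])

with pm.Model():
    mu = pm.Normal("mu", 0, 10)
    sigma = pm.HalfNormal("sigma", 1)
    # TruncatedNormal includes the normalization term, so it is a valid
    # likelihood for data restricted to x > 0
    pm.TruncatedNormal("obs", mu=mu, sigma=sigma, lower=0, observed=data)
    # Prior predictive sampling also works, unlike with Bound
    prior = pm.sample_prior_predictive()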
There is one tiny advantage to Bound: when you don't need the extra normalization term, you don't have to compute it. However, most users are not aware of when that is the case, so we prefer not to advertise it going forward.
Note that you can obtain the same effect as Bound by passing an Interval transform to continuous distributions:
import pymc as pm
from pymc.distributions.transforms import Interval

with pm.Model():
    # These two are equivalent: both restrict the support to x > -1
    pm.Normal("x", transform=Interval(-1, None))
    pm.Bound("y", pm.Normal.dist(), lower=-1)