I don’t see any custom gradients, so everything in your model should be handled by autodiff. It won’t be a gradient issue unless one of the component operations has a bug. If you’re worried about it anyway, there’s a tool, `pytensor.gradient.verify_grad`, that checks the gradient of a computation against a finite-difference approximation.
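Here’s a minimal sketch of how you might use it, with a throwaway composite function standing in for whichever op you suspect. One assumption: recent PyTensor versions take a NumPy `Generator` for the `rng` argument (older releases wanted a `RandomState`), so adjust if you’re on an old version.

```python
import numpy as np
import pytensor.tensor as pt
from pytensor.gradient import verify_grad

# Stand-in for the operation you want to check: any Python function
# mapping PyTensor variables to a PyTensor variable works here.
def suspect_op(x):
    return pt.sum(pt.exp(x) / (1.0 + pt.exp(x)))

# Test points are passed as a list of numpy arrays. verify_grad compares
# the symbolic gradient to a finite-difference estimate and raises an
# AssertionError if they disagree beyond tolerance.
x_val = np.random.default_rng(0).normal(size=(5,))
verify_grad(suspect_op, [x_val], rng=np.random.default_rng(42))
```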
Have you sampled from your priors? Do the results look scientifically plausible? For complex models, using the priors to control where the sampler will wander becomes increasingly important.
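In PyMC that check is essentially a one-liner with `pm.sample_prior_predictive`. A rough sketch, with a toy model and placeholder data standing in for yours:

```python
import numpy as np
import pymc as pm
import arviz as az

y_obs = np.random.default_rng(0).normal(size=50)  # placeholder for your data

with pm.Model():
    mu = pm.Normal("mu", 0.0, 10.0)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y", mu=mu, sigma=sigma, observed=y_obs)
    # Draw simulated datasets using only the priors -- no sampler involved.
    idata = pm.sample_prior_predictive(draws=500, random_seed=42)

# Do the simulated data even land in a scientifically plausible range?
az.plot_ppc(idata, group="prior")
```

If the prior draws put most of their mass somewhere absurd, tightening the priors will usually help the sampler more than fiddling with tuning parameters.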