Custom PyTensor Op for Root-Finding with Multiple Inputs and Outputs

Hi @jessegrabowski, I agree that it would be much better to have an analytical version to estimate the roots and not bother with a custom Op at all.
The parameters are typically: thickness $\sim 10^{-5}$, $D$ between $10^{-4}$ and $1$, and $S_b$ and $S_f$ between $1$ and $10^{4}$. This makes the 'two-regime' estimation a bit more complicated, because the vertical asymptote that separates the two regimes can now move with $D$ as well (as your nice Desmos implementation also shows).

So instead, I updated the partial derivatives to match the 'new' squared version of my function (something I probably should have done earlier), and now `pytensor.gradient.verify_grad` passes, provided the absolute tolerance is set to $10^{-10}$.
The PyTensor documentation of the `grad()` method states that the output must be $\frac{\partial\alpha}{\partial x} = \frac{\partial\alpha}{\partial g(x)} \cdot \frac{\partial g(x)}{\partial x}$, but since we are now at a minimum, $\frac{\partial g(x)}{\partial x}$ should be $0$, so the matrix product should also be $0$. That would be consistent with the `val2` reported by `pytensor.gradient.verify_grad`. Or am I getting something completely wrong here?