Transformed variables causing errors with NUTS but not with SVGD

@junpenglao Thanks for the suggestion, but it did not work; I am still getting the same error. Below is the modified code. Is this the right way to cast alpha and beta as Theano tensors? Also, I didn't quite understand the difference between the `forward` and `forward_val` methods in the transform class, and I couldn't find much documentation about them. Could you please explain, or point me to some docs? Thanks!

import pymc3 as pm
import theano.tensor as tt

class Linear(pm.distributions.transforms.ElemwiseTransform):
    name = "linear"

    def __init__(self, alpha, beta):
        self.alpha = tt.as_tensor_variable(alpha)
        self.beta = tt.as_tensor_variable(beta)

    def forward(self, x):
        # symbolic map from the constrained to the unconstrained space
        return self.alpha * x + self.beta

    def forward_val(self, x, point=None):
        return self.alpha * x + self.beta

    def backward(self, x):
        # inverse of forward
        return (x - self.beta) / self.alpha

    def jacobian_det(self, x):
        # log |d backward(x) / dx| = log(1 / alpha) (assumes alpha > 0)
        return -tt.log(self.alpha)
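As a side note, before chasing the NUTS error it may help to sanity-check the transform numerically. The sketch below (plain NumPy, with arbitrary example values for alpha and beta; not the pymc3 API) verifies that `backward` inverts `forward` and that the log-Jacobian of `backward` really is `-log(alpha)`:

```python
import numpy as np

# example parameters (assumed alpha > 0, matching the jacobian_det above)
alpha, beta = 2.0, 1.0
x = np.linspace(-3.0, 3.0, 7)

forward = lambda v: alpha * v + beta
backward = lambda y: (y - beta) / alpha

# backward must invert forward
assert np.allclose(backward(forward(x)), x)

# finite-difference check: log |d backward / dy| should equal -log(alpha)
eps = 1e-6
y = forward(x)
numeric = np.log(np.abs((backward(y + eps) - backward(y)) / eps))
assert np.allclose(numeric, -np.log(alpha), atol=1e-5)
```

If both assertions pass, the math of the transform is consistent, so the remaining error is more likely about how the transform object is wired into the model (e.g. tensor vs. numeric inputs in `forward_val`) than about the formulas themselves.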