Broadcast dimensions across aesara/theano tensors

Hi,

I’d like to broadcast dimensions across aesara tensors.
As an example, I can do this with numpy inputs as follows:

import aesara
import aesara.tensor as at
import numpy as np

# Symbolic row (1, m) and column (n, 1) tensors.
r, c = at.row(), at.col()
f_subtract = aesara.function([c, r], [r - c])

# Concrete numpy inputs with matching shapes.
c_ = np.array([1, 2]).reshape(2, 1)
r_ = np.array([2, 3, 4]).reshape(1, 3)

f_subtract(c_, r_)

which gives:

[array([[1., 2., 3.],
        [0., 1., 2.]])]

However, my inputs are tensors, not numpy arrays. When I use tensors as inputs I get the error:
“Expected an array-like object, but found a Variable: maybe you are trying to call a function on a (possibly shared) variable instead of a numeric array?”.

Surely this is possible. My only other option is to loop through each tensor component and then stack the outputs.

Any pointers would be greatly appreciated.
Many thanks,
Harry

You never need to compile an aesara function just to do computation with symbolic tensors. Your code already contains the answer to your question: it’s just r - c.
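
Concretely, here is a minimal sketch of that idea, sticking with the at.row()/at.col() variables from your snippet (the .eval() call is only there as a quick numerical check, not something you need in a real graph):

import numpy as np
import aesara.tensor as at

c = at.col("c")          # symbolic (n, 1) column
r = at.row("r")          # symbolic (1, m) row

# The broadcasted difference is itself a symbolic tensor; use it
# directly in the rest of your graph instead of compiling a function.
diff = r - c

# Quick sanity check of the symbolic expression:
print(diff.eval({c: np.array([[1.0], [2.0]]),
                 r: np.array([[2.0, 3.0, 4.0]])}))
# [[1. 2. 3.]
#  [0. 1. 2.]]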


AHH thanks!

I realised I hadn’t reshaped my tensors when working with them, so it was giving me shape errors. You’re right: once I reshape them, everything works as expected!
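
For anyone who hits the same thing, the fix in my case was just adding the broadcastable axes before subtracting. A rough sketch of what that looks like when the inputs start out as 1-D symbolic vectors (the names here are only illustrative):

import numpy as np
import aesara.tensor as at

a = at.vector("a")                 # shape (n,)
b = at.vector("b")                 # shape (m,)

# Insert broadcastable axes so the subtraction broadcasts to (n, m).
# a[:, None] behaves like numpy's newaxis (equivalently a.dimshuffle(0, "x")).
diff = b[None, :] - a[:, None]

print(diff.eval({a: np.array([1.0, 2.0]),
                 b: np.array([2.0, 3.0, 4.0])}))
# [[1. 2. 3.]
#  [0. 1. 2.]]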
