Hi,

I’d like to broadcast dimensions across Aesara tensors.

As an example, I can do this with NumPy inputs as follows:

```
import aesara
import aesara.tensor as at
import numpy as np

r, c = at.row(), at.col()
f_subtract = aesara.function([c, r], [r - c])
c_ = np.array([1, 2]).reshape(2, 1)
r_ = np.array([2, 3, 4]).reshape(1, 3)
f_subtract(c_, r_)
```

which gives:

```
[array([[1., 2., 3.],
        [0., 1., 2.]])]
```

However, my inputs are Aesara tensors (symbolic variables), not NumPy arrays. When I pass tensors to the compiled function, I get the error:

“Expected an array-like object, but found a Variable: maybe you are trying to call a function on a (possibly shared) variable instead of a numeric array?”.

Surely this is possible. My only other option seems to be looping over each tensor component and then stacking the outputs.
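For concreteness, here is the loop-and-stack fallback I’d rather avoid, sketched with plain NumPy arrays (variable names are mine; the real inputs would be symbolic tensors):

```python
import numpy as np

c_ = np.array([1, 2]).reshape(2, 1)    # column, shape (2, 1)
r_ = np.array([2, 3, 4]).reshape(1, 3)  # row, shape (1, 3)

# What broadcasting gives directly: shape (2, 3)
broadcasted = r_ - c_

# The manual fallback: subtract each column entry from the row, then stack
stacked = np.stack([r_.ravel() - ci for ci in c_.ravel()])

# Both produce [[1, 2, 3], [0, 1, 2]]
```

Ideally the broadcasting version would work symbolically so I can skip the loop entirely.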

Any pointers would be greatly appreciated.

Many thanks,

Harry