# Einsum in pytensor

Hi all,

I’ve got a problem that I’m not quite sure how to approach (tl;dr at the bottom). I have a vector of priors `b` that I want to combine with a voltage matrix `Volt` in a deterministic node, so that the first entry of `b` multiplies across the first row of `Volt`, the second entry multiplies across the second row, and so forth. Doing this in numpy with dummy numbers, I was able to get it done via:

```python
res = np.transpose(np.einsum('ij,ki->jki', V, b))
```

where b was an array and V was a matrix.
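For reference, the row-wise product being described can also be written in plain numpy without the extra output axis and the transpose. This is a minimal sketch with made-up dummy shapes (the names `res_einsum` and `res_broadcast` are purely illustrative):

```python
import numpy as np

# Dummy data: b is a length-3 vector of scale factors, V is a 3x4 matrix
# whose i-th row should be multiplied by b[i].
b = np.array([2.0, 3.0, 5.0])
V = np.arange(12.0).reshape(3, 4)

# Row-wise scaling written as an explicit Einstein summation ...
res_einsum = np.einsum('i,ij->ij', b, V)

# ... is the same as broadcasting b against the rows of V.
res_broadcast = b[:, None] * V

assert np.allclose(res_einsum, res_broadcast)
```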

The closest parallel to Einstein summation in pytensor that I could find is batched_tensordot/batched_dot (which the website describes as a subset of Einstein summation), but I’m unsure whether it would work, since I’m still a bit shaky on the dimensionality of pymc distributions and how they combine with pytensor. The other alternative I’ve seen is using pt.scan with pt.subtensor.set_subtensor to set the values of the resulting matrix directly, which I believe is probably more the way to go, but since I’m uncertain I figured I’d ask in case I’m being foolish!

tl;dr: For the problem of multiplying a vector of priors against a matrix so that the i-th entry multiplies across the i-th row, are pt.scan and pt.set_subtensor the way to go, or can it be done more succinctly?

Am happy to provide further context if necessary, thanks in advance!

If I understand what you are asking for correctly then there is a simpler way to achieve it:

```python
import numpy as np
import pymc as pm

with pm.Model():
    b = pm.Normal("b", mu=0, size=5)
    V = np.random.normal(0, 1, size=(5, 5))

    bV = pm.Deterministic("bV", b[:, None] * V)
```

This simply expands b into a matrix with identical columns and then multiplies elementwise. It would also work in numpy, though I’m not sure why you used Einstein summation (which would produce a 5x5x5 array in this case?).
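In plain numpy the broadcasting step looks like this (a small sketch with made-up values; `expanded` is just an illustrative name for the intermediate matrix that broadcasting produces implicitly):

```python
import numpy as np

# Toy values standing in for the model quantities.
b = np.array([1.0, 2.0, 3.0])
V = np.random.default_rng(0).normal(size=(3, 4))

# b[:, None] has shape (3, 1); broadcasting against V's shape (3, 4)
# replicates it into a (3, 4) matrix whose columns are all copies of b.
expanded = np.broadcast_to(b[:, None], V.shape)
assert all(np.array_equal(expanded[:, j], b) for j in range(V.shape[1]))

# The elementwise product therefore scales row i of V by b[i].
bV = b[:, None] * V
assert np.allclose(bV[2], 3.0 * V[2])
```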


Yep, that works perfectly. Many thanks, you just saved me a whole lot of hassle barking up the wrong tree.