Hi all,
I’ve got a problem that I’m not quite sure how to approach (tl;dr at the bottom). I have a vector of priors `b` and I want to combine it with a voltage matrix `Volt` in a deterministic node so that the first entry of `b` multiplies across the first row of the matrix, the second entry multiplies across the second row, and so forth. Doing this in numpy with dummy numbers, I was able to get it done via:
```python
res = np.transpose(np.einsum('ij,ki->jki', V, b))
```
where b was an array and V was a matrix.
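For reference, here is that einsum on dummy shapes (the shapes here are picked arbitrarily for illustration; note that `b` has to be 2-D for the `'ki'` subscript pattern to apply):

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((3, 4))   # shape (i, j)
b = rng.standard_normal((2, 3))   # shape (k, i)

# 'ij,ki->jki' pairs b's second axis with V's first axis,
# then the bare transpose reverses the (j, k, i) axes to (i, k, j)
res = np.transpose(np.einsum('ij,ki->jki', V, b))
print(res.shape)   # (3, 2, 4)
```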
But the closest parallel for Einstein summation in pytensor that I could find is batched_tensordot/batched_dot (which the docs describe as a subset of Einstein summation), and I’m unsure whether it would work, since I’m still a bit shaky on the dimensionality of pymc distributions and how they combine with pytensor. The other alternative I’ve seen is using pt.scan and pt.subtensor.set_subtensor to fill in the values of the resulting matrix, which I believe is probably more the way to go, but since I’m uncertain I figured I’d ask in case I’m being foolish!
tl;dr: For the problem of multiplying a vector of priors against a matrix so that the i-th entry multiplies across the i-th row, are pt.scan and pt.set_subtensor the way to go, or can it be done more succinctly?
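To make the tl;dr concrete, the row-wise behavior I mean looks like this in plain numpy (dummy numbers, 1-D `b`):

```python
import numpy as np

b = np.array([2.0, 3.0])   # vector of coefficients, one per row
V = np.ones((2, 3))        # matrix whose rows get scaled

# b[:, None] broadcasts b down the rows: row i is multiplied by b[i]
res = b[:, None] * V
# res == [[2., 2., 2.], [3., 3., 3.]]
```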
Am happy to provide further context if necessary, thanks in advance!