No sampling when transforming random variables with theano using dot product

Yeah, those are all valid things wrong with the model as posted, but that's because I tried to cut it down to something that was only two lines and still failed in the same way. The linked latent space model also fails in the same way; you can replace the Exponential with a Normal both when creating the data and in the model and it fails identically, and you can keep all the same problems as this one but make the outcome something other than a dot product and it will at least sample. It's not that the above gives the wrong answer, it's that it makes no guesses at all for z.

So for example, another model that fails:

import numpy as np
import pymc3 as pm
import theano.tensor as TT

# Generate a symmetric observed matrix from a latent (100, 3) matrix
z_train = np.random.normal(7.2, 3, (100, 3))
x_train = np.dot(z_train, z_train.T)

# Model
with pm.Model() as simple_model:
    z = pm.Normal('z', mu=0, sd=100, shape=(100, 3))
    x = pm.Normal('x', mu=TT.dot(z, TT.transpose(z)), observed=x_train)
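As a quick sanity check on what that model is conditioning on, a minimal NumPy sketch (using the same data-generating lines as above; the seed is just for reproducibility) shows the observed data is a symmetric 100×100 Gram matrix built from only 300 latent values:

```python
import numpy as np

np.random.seed(0)  # hypothetical seed, only so the check is reproducible
z_train = np.random.normal(7.2, 3, (100, 3))
x_train = np.dot(z_train, z_train.T)

# The observed matrix is symmetric and much larger than the latent space:
print(x_train.shape)                    # (100, 100)
print(np.allclose(x_train, x_train.T))  # True
```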

And so does the Edward example:

import numpy as np
import pymc3 as pm
import theano
import theano.tensor as TT

N = 30
K = 3

# Generate Poisson rates from pairwise distances between latent positions
z = theano.shared(np.random.normal(5, 1, (N, K)))
xp = TT.sum(TT.pow(z, 2), axis=1, keepdims=True)  # squared norms, shape (N, 1)
xp = TT.tile(xp, (1, N))
xp = xp + TT.transpose(xp) - 2 * TT.dot(z, TT.transpose(z))  # squared pairwise distances
xp = 1.0 / TT.sqrt(xp + TT.eye(N, N) * 100000)  # inflate the diagonal to avoid 1/0
lam = xp.eval()
x_train = np.random.poisson(lam)

with pm.Model() as elegans_net:
    z = pm.Normal('z', mu=0, sd=200, shape=(N, K))
    xp = TT.sum(TT.pow(z, 2), axis=1, keepdims=True)
    xp = TT.tile(xp, (1, N))
    xp = xp + TT.transpose(xp) - 2 * TT.dot(z, TT.transpose(z))
    xp = 1.0 / TT.sqrt(xp + TT.eye(N, N) * 100000)
    x = pm.Poisson('x', mu=xp, observed=x_train)
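For what it's worth, the xp expression is just the usual squared-distance expansion ||z_i - z_j||² = ||z_i||² + ||z_j||² - 2 z_i·z_j, with a big number added on the diagonal so the i = j entries don't blow up. A NumPy-only sketch of the same rate computation (seed and variable names are my own, just for illustration):

```python
import numpy as np

N, K = 30, 3
np.random.seed(0)  # hypothetical seed, only for reproducibility
z = np.random.normal(5, 1, (N, K))

sq = np.sum(z**2, axis=1, keepdims=True)       # squared norms ||z_i||^2, shape (N, 1)
d2 = sq + sq.T - 2 * np.dot(z, z.T)            # squared pairwise distances, shape (N, N)
lam = 1.0 / np.sqrt(d2 + np.eye(N) * 100000)   # inflated diagonal keeps 1/sqrt(0) off the diagonal
x_train = np.random.poisson(lam)               # Poisson counts, shape (N, N)
```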