Use GP latent variable as input to another function, shape mismatch

Thanks for pointing this out! There seems to be a bug in the default mean function, `Zero`: it allocates zeros using only the first dimension of `X.shape` instead of the full shape. That's why you end up with a (100, 100) array instead of a (100, 1) one. Could you post this as an issue on GitHub so we can reference it for a fix? Using column vectors and then broadcasting on the last axis may also be affected by this bug, which we still haven't fixed.
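To illustrate the shape blow-up (a plain NumPy sketch of the broadcasting behavior, not the actual PyMC internals; the variable names are made up):

```python
import numpy as np

X = np.linspace(0, 1, 100)[:, None]   # (100, 1) input, as in the question

# Buggy behavior: zeros allocated from only the first dimension of X.shape
mu_buggy = np.zeros(X.shape[0])       # shape (100,)
# Intended behavior: zeros matching the full shape of X
mu_fixed = np.zeros(X.shape)          # shape (100, 1)

f_draw = np.random.randn(100, 1)      # stand-in for a (100, 1) GP draw

# A (100,) mean broadcast against a (100, 1) draw blows up to (100, 100)
print((mu_buggy + f_draw).shape)      # (100, 100)
print((mu_fixed + f_draw).shape)      # (100, 1)
```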

For now, could you try passing `mean_func=tt.zeros_like` and see whether that fixes your problem?
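The reason this workaround should help (shown here with the NumPy equivalent of `tt.zeros_like`; `tt.zeros_like` behaves the same way on Theano tensors):

```python
import numpy as np

X = np.linspace(0, 1, 100)[:, None]   # (100, 1), as in the question

# zeros_like produces zeros with the FULL shape of X,
# not just its first dimension, so the mean stays (100, 1).
mu = np.zeros_like(X)
print(mu.shape)   # (100, 1)
```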