I don’t like responding to my own messages, but I think I solved the problem described above. For those who may come across a similar issue, it is worth taking a look at @junpenglao’s post here. I also struggled a bit with Theano and the shape parameter; there is a fantastic description by @lucianopaz here. Finally, the code that works for me is as follows:
import pymc3 as pm
import theano.tensor as tt

with pm.Model():
    # Build the list of free parameters from the prior specification table
    priors = []
    for i, pri in enumerate(pri_inputs):
        if pri[0] == 1:
            priors.append(pm.Uniform("priors_{}".format(i), lower=pri[1], upper=pri[2]))
        elif pri[0] == 2:
            priors.append(pm.TruncatedNormal("priors_{}".format(i), mu=pri[1], sigma=pri[2],
                                             lower=pri[3], upper=pri[4]))
    # Stack the scalar priors into a column vector of shape (n_params, 1)
    priors = tt.stack(priors)[:, None]

    # Define the surrogate models for the outputs using the priors defined above
    linear = sm_linear * priors
    rsm = tt.sum(sm_rsm * priors * priors.T, axis=2)
    out_mu = sm_intercept + tt.sum(linear, axis=0) + tt.sum(rsm, axis=1)

    # Define the log-likelihood as a multivariate normal with the known
    # covariance matrix and the measured observations
    loglikelihood = pm.MvNormal("loglikelihood", mu=out_mu, cov=true_cov, observed=measured)

    # Inference
    step = pm.NUTS()  # use NUTS sampling
    trace = pm.sample(draws=samples, step=step, cores=2, progressbar=True)
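For anyone who, like me, gets tripped up by the shapes: here is a minimal NumPy sketch of the broadcasting in the surrogate-model block above. The sizes n_params and n_outputs, and the shapes I assume for sm_linear, sm_rsm and sm_intercept, are hypothetical; they just illustrate why stacking the priors into a column vector of shape (n_params, 1) makes the elementwise products reduce to a mean vector of length n_outputs.

import numpy as np

n_params, n_outputs = 3, 4                              # hypothetical sizes
priors = np.random.rand(n_params)[:, None]              # stacked priors as a column vector, shape (n_params, 1)
sm_intercept = np.random.rand(n_outputs)                # shape (n_outputs,)
sm_linear = np.random.rand(n_params, n_outputs)         # linear surrogate coefficients (assumed shape)
sm_rsm = np.random.rand(n_outputs, n_params, n_params)  # quadratic surrogate coefficients (assumed shape)

linear = sm_linear * priors                             # broadcasts to (n_params, n_outputs)
rsm = np.sum(sm_rsm * priors * priors.T, axis=2)        # (n_outputs, n_params, n_params) summed to (n_outputs, n_params)
out_mu = sm_intercept + np.sum(linear, axis=0) + np.sum(rsm, axis=1)
print(out_mu.shape)                                     # (n_outputs,)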