I have a classic, generic linear model with a number of parameters that are defined by a set of priors and later inferred. However, in some instances the parameters have a known uncertainty distribution and do not need to be inferred.
All I want in this case is to assign the appropriate (known) distribution to them so that the model can be evaluated (together with the remaining FreeRVs). The code below works fine when all of the model parameters are to be inferred. I am not sure how to implement, and package into a Theano tensor, a known distribution that will not be inferred/calibrated later. I’ve tried a number of options but just can’t get it right. Any suggestions would be much appreciated.
import pymc3 as pm
import theano.tensor as tt

with pm.Model():  # model specifications in PyMC3 are wrapped in a with-statement
    # Define priors for the unknown parameters
    priors = []
    for i, pri in enumerate(pri_inputs):
        if pri[0] == 1:
            priors.append(
                pm.Uniform("priors_{}".format(i), lower=pri[1], upper=pri[2])
            )
        elif pri[0] == 2:
            priors.append(
                pm.Deterministic("priors_{}".format(i), tt.constant(pri[1]))
            )
        elif pri[0] == 3:
            bounded_N = pm.Bound(pm.Normal, lower=pri[3], upper=pri[4])
            priors.append(bounded_N("priors_{}".format(i), mu=pri[1], sigma=pri[2]))
        elif pri[0] == 4:
            bounded_LogN = pm.Bound(pm.Lognormal, lower=pri[3], upper=pri[4])
            priors.append(
                bounded_LogN("priors_{}".format(i), mu=pri[1], sigma=pri[2])
            )
    priors = tt.stack(priors)[:, None]
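For context, the loop assumes each row of `pri_inputs` starts with an integer code selecting the prior family, followed by that family's parameters. A hypothetical input list (the values below are illustrative only, not from my actual model) could look like:

```python
# Hypothetical example of the pri_inputs encoding assumed by the loop above.
# Each row: (code, param1, param2, [lower_bound, upper_bound]).
pri_inputs = [
    (1, 0.0, 10.0),            # code 1: Uniform(lower=0.0, upper=10.0)
    (2, 3.14),                 # code 2: fixed constant wrapped in Deterministic
    (3, 1.0, 0.5, 0.0, 5.0),   # code 3: Normal(mu=1.0, sigma=0.5), bounded to [0, 5]
    (4, 0.0, 1.0, 0.1, 20.0),  # code 4: Lognormal(mu=0.0, sigma=1.0), bounded to [0.1, 20]
]

# Sanity-check the encoding before handing it to the model builder.
for pri in pri_inputs:
    assert pri[0] in (1, 2, 3, 4)
```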