Assume that you have two priors:

```python
import numpy as np
import pymc as pm

N = 100
with pm.Model():
    p = pm.Dirichlet("p", np.ones(N))
    log_r = pm.Uniform("log_r", lower=-1.0, upper=1.0, shape=N)  # assumed bounds; the original call passed only np.ones(N)
```
Is there a way of achieving the constraint `(p * pm.math.exp(log_r)).sum() == 1` other than introducing a tight likelihood around 1? I know there are transforms like `SumTo1`, but I don't really know how to use them, and I don't know whether they extend to weighted sums (where the weights are RVs).
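For concreteness, the soft-constraint workaround I would like to avoid looks roughly like this sketch (the `sigma=1e-3` width and the `Uniform` bounds are placeholders of mine):

```python
import numpy as np
import pymc as pm

N = 100
with pm.Model() as soft_model:
    p = pm.Dirichlet("p", np.ones(N))
    log_r = pm.Uniform("log_r", lower=-1.0, upper=1.0, shape=N)
    s = (p * pm.math.exp(log_r)).sum()
    # tight Gaussian "likelihood" pinning the weighted sum near 1
    pm.Potential("sum_constraint", pm.logp(pm.Normal.dist(mu=1.0, sigma=1e-3), s))
```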
Of course, I can always normalize `p * pm.math.exp(log_r)` before using it downstream and add another likelihood that constrains the sum of `log_r`, or I can let the product `q = p * r` be a Dirichlet RV to be fitted and then recover `r` by computing `q / p`. But I was wondering whether there is a more elegant, built-in way of doing this rather than introducing extra priors.
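For reference, a minimal sketch of that reparameterization, where the constraint holds exactly by construction (the unit concentration parameters are placeholders):

```python
import numpy as np
import pymc as pm

N = 100
with pm.Model() as reparam_model:
    p = pm.Dirichlet("p", np.ones(N))
    q = pm.Dirichlet("q", np.ones(N))  # q stands in for p * r, so q.sum() == 1 exactly
    # recover log_r = log(q / p) as a deterministic of the two Dirichlets
    log_r = pm.Deterministic("log_r", pm.math.log(q) - pm.math.log(p))
    # (p * pm.math.exp(log_r)).sum() == q.sum() == 1 by construction
```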