Constrained (Ordered) Random Variable

What is the correct way to declare two random variables such that x < y?

My equation is roughly this:

\mu + \gamma_a * (X_a - T)^+ + \gamma_b * (T - X_b)^+

where (·)^+ means the value is clipped to zero when negative, so for a given T at most one of the a and b terms is active. Here X_a and X_b are both uniformly distributed, with the constraint X_a < X_b.
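
To make the notation concrete, here is the mean function in plain numpy (just a sketch; the names are mine, and T stands for the outdoor temperature):

import numpy as np

def expected_energy(T, mu, gamma_a, gamma_b, X_a, X_b):
    # (.)^+ clips negative values to zero, so with X_a < X_b at most one of the
    # two correction terms is non-zero for any given T.
    return (mu
            + gamma_a * np.maximum(X_a - T, 0.0)
            + gamma_b * np.maximum(T - X_b, 0.0))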

I used the "Sampling uniformly in a triangular support" example to impose the constraint; my code is as follows:

# (imports and data assumed from the rest of the script: pymc3 as pm,
#  plus Td = outdoor temperature and E_tot = observed energy use)
with pm.Model() as model:
    # Base load and heating/cooling slopes, constrained to be positive
    boundedNormal = pm.Bound(pm.Normal, lower=0)
    Ebase = boundedNormal('Eb', mu=20, sd=20)
    γ_heat = boundedNormal('γ_h', mu=0, sd=4)
    γ_cool = boundedNormal('γ_c', mu=0, sd=4)

    # Composed and Ordered are the transform classes from the
    # "Sampling uniformly in a triangular support" example
    the_transform = Composed(pm.distributions.transforms.Interval(32, 100), Ordered())
    T_setpoint = pm.Uniform('x',
                            lower=32,
                            upper=100,
                            shape=2,
                            transform=the_transform,
                            testval=[68, 85])
    T_heat = T_setpoint[0]  # heating setpoint (lower of the pair)
    T_cool = T_setpoint[1]  # cooling setpoint (upper of the pair)
    σ = pm.HalfCauchy('σ', 10)

    # Heating energy kicks in below T_heat, cooling energy above T_cool, otherwise zero
    E_heat = pm.math.switch(T_heat > Td, γ_heat * (T_heat - Td), 0)
    E_cool = pm.math.switch(Td > T_cool, γ_cool * (Td - T_cool), 0)

    E = pm.Normal('E', mu=Ebase + E_heat + E_cool, sd=σ, observed=E_tot)

Your model looks right to me.

The problem is that this part of the model seems to cause a "Bad initial energy" error: I have to set the testval manually for different datasets to make sampling work.
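
For what it's worth, one quick way to see which term produces the bad energy is to look at the log-probability of each variable at the test point (a rough sketch, using the model defined above):

# Log-probability of every free variable at the model's test point;
# a -inf or nan entry identifies the variable behind the bad initial energy.
print(model.check_test_point())

If everything there is finite, a different initialisation, e.g. pm.sample(init='adapt_diag'), is another thing worth trying.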

I want an uninformative prior, something like:

T_heat = pm.Uniform('Th', lower=32, upper=100)
T_cool = pm.Uniform('Tc', lower=T_heat, upper=100)

You definitely need to set a test value when using the ordered transform; it is a current limitation. The second way you want to model this is not actually uninformative: drawing T_cool uniformly between T_heat and 100 biases the pair, so the joint distribution is no longer uniform over the region T_heat < T_cool. You can check this by simulating it, e.g. with scipy or plain numpy.
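
For example, a quick simulation along those lines (a rough numpy sketch; the uniform-over-the-region draw is done by sorting two independent uniforms):

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Sequential construction: T_heat ~ U(32, 100), then T_cool ~ U(T_heat, 100)
t_heat_seq = rng.uniform(32, 100, n)
t_cool_seq = rng.uniform(t_heat_seq, 100)

# Uniform over the region T_heat < T_cool: sort two independent U(32, 100) draws
pair = np.sort(rng.uniform(32, 100, (n, 2)), axis=1)
t_heat_tri, t_cool_tri = pair[:, 0], pair[:, 1]

# The marginals differ, so the sequential prior is not flat over that region
print(t_cool_seq.mean(), t_cool_tri.mean())   # roughly 83.0 vs 77.3
print(t_heat_seq.mean(), t_heat_tri.mean())   # roughly 66.0 vs 54.7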