In general, this kind of question can be formulated as a GLM with a Bernoulli likelihood:
with pm.Model() as m:
    ...
    obs = pm.Bernoulli('observed', p, observed=w)
Now the trick is to come up with a formulation for p. Rearranging your initial formulation a bit, we get:
p = z > y
=> z - y > 0
=> (a_guess - a) + (b_guess - b) * x > 0
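As a quick numeric sanity check of this rearrangement (assuming z = a_guess + b_guess * x and y = a + b * x; the parameter values below are made up purely for illustration):

```python
import numpy as np

# Hypothetical values, only to check the algebra
a, b = 1.0, 2.0              # "true" intercept and slope
a_guess, b_guess = 0.5, 3.0  # guessed intercept and slope
x = np.linspace(-5, 5, 101)

y = a + b * x
z = a_guess + b_guess * x

# z > y is equivalent to (a_guess - a) + (b_guess - b) * x > 0
lhs = z > y
rhs = (a_guess - a) + (b_guess - b) * x > 0
print(np.array_equal(lhs, rhs))  # True
```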
A direct translation would be:
with pm.Model() as model:
    a = pm.Uniform('a', lower=-10, upper=10)
    b = pm.Uniform('b', lower=-10, upper=10)
    p = (a_guess - a) + (b_guess - b) * x > 0
    obs = pm.Bernoulli('observed', p, observed=w)
Of course, a binary p will likely make inference pretty difficult. Instead, we usually use a sigmoid:
with pm.Model() as model:
    a = pm.Uniform('a', lower=-10, upper=10)
    b = pm.Uniform('b', lower=-10, upper=10)
    latent_p = (a_guess - a) + (b_guess - b) * x
    obs = pm.Bernoulli('observed', pm.math.sigmoid(latent_p), observed=w)
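To see why the sigmoid helps: it maps the latent value smoothly into (0, 1), so p is near 1 where z - y is well above 0, near 0 where it is well below, and exactly 0.5 at the boundary, while staying differentiable for the sampler. A minimal numpy sketch with made-up parameter values (the sigmoid here is the same mapping pm.math.sigmoid applies):

```python
import numpy as np

def sigmoid(t):
    # logistic function, the same mapping as pm.math.sigmoid
    return 1.0 / (1.0 + np.exp(-t))

# Hypothetical parameter values, only for illustration
a, b = 1.0, 2.0
a_guess, b_guess = 0.5, 3.0
x = np.array([-2.0, 0.0, 0.5, 1.0, 3.0])

latent_p = (a_guess - a) + (b_guess - b) * x
p = sigmoid(latent_p)

# p stays strictly inside (0, 1), increases with latent_p, and is 0.5
# where z == y -- a smooth relaxation of the hard threshold z > y
print(np.round(p, 3))
```

With the smooth model in place, posterior samples for a and b can then be drawn with pm.sample() as usual.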