# Inferring distribution of function of random variables

I’m trying to understand whether pymc3 is able to perform forward, in addition to backward, reasoning through its inference engine. I’ll explain with a very simple example.
Imagine the following model: two coins, each independently Bernoulli-distributed with a bias of `0.5`. Now create a random variable that is true exactly when both coins come up heads, i.e.

```
x ~ Bernoulli(0.5)
y ~ Bernoulli(0.5)
z = x & y
```

So `z` is `True` when both `x` and `y` are `True`, and `False` otherwise.
My questions are:

1. Does pymc3 support inferring the distribution of the random variable `z` (i.e. it should come out as Bern(0.25))?
2. I could now set `z` to be observed as `False`. I’d then like to infer the new posterior for `x` (i.e. Bern(1/3), since observing `z = False` rules out only the `x = y = True` outcome; observing `z = True` would trivially force `x = True`).
3. Set `x` to be observed as `True`. I’d then like to infer the distribution of `z` (i.e. Bern(0.5)).
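As a sanity check on the numbers in these questions, all three target distributions can be computed exactly by enumerating the four equally likely `(x, y)` outcomes (plain Python, no pymc3 needed):

```python
from itertools import product

# Enumerate the four equally likely (x, y) outcomes of two fair coins,
# together with z = x & y.
outcomes = [(x, y, x & y) for x, y in product([0, 1], repeat=2)]

# 1. Marginal of z: true in 1 of the 4 outcomes.
p_z = sum(z for _, _, z in outcomes) / len(outcomes)           # 0.25

# 2. Posterior of x given an observation of z.
#    z = 1 forces x = 1; z = 0 leaves P(x = 1 | z = 0) = 1/3.
given_z0 = [o for o in outcomes if o[2] == 0]
p_x_given_z0 = sum(x for x, _, _ in given_z0) / len(given_z0)  # 1/3

# 3. Posterior of z given x = 1: z reduces to y, so Bernoulli(0.5).
given_x1 = [o for o in outcomes if o[0] == 1]
p_z_given_x1 = sum(z for _, _, z in given_x1) / len(given_x1)  # 0.5

print(p_z, p_x_given_z0, p_z_given_x1)
```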

I realise that this might not be possible given that inference is implemented via MCMC, especially since the first case involves no likelihood at all; that case could easily be handled by belief propagation, but I imagine not so easily by MCMC.

Thanks!

Update:
Using `z = pm.Deterministic('z', x & y)` I seem to be able to get the expected results for points 1 and 3.

My confusion with point 2 persists.

Regarding question 1: samples can be obtained for `z`, of course, but no actual connection to the derived (closed-form) distribution is made. The symbolic-computation machinery for that is introduced in our research project `symbolic-pymc`.

Likewise, the other questions, if I’m understanding them correctly, could be implemented with the aforementioned functionality.

Thanks for this; it looks like an interesting avenue. I’ll check it out in more detail.