Yeah that part is not explained very well in the doc, that’s why I opened the issue there…
In brief, when compound steps are involved, it takes a list of steps and generates a list of methods. So for example if you do:
```python
with pm.Model() as m:
    rv1 = ...
    ...
    step1 = pm.Metropolis([rv1, rv2])
    step2 = pm.CategoricalGibbsMetropolis([rv3])
    trace = pm.sample(..., step=[step1, step2], ...)
```
the compound step now contains a list of methods.
At each sample, it iterates over the methods; each one takes a point as input and generates a new point as output. Each step proposes the new point via a stochastic kernel, and if the proposal is rejected by the Metropolis-Hastings criterion, the step simply outputs the original input point.
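Stripped of all the PyMC3 machinery, that iteration can be sketched in plain Python. Here `ToyStep` and `ToyCompoundStep` are hypothetical stand-ins I made up for illustration, not the actual PyMC3 classes; the key property is that each step only updates the variables it owns and passes the whole point along:

```python
class ToyStep:
    """Stand-in for a step method: updates one variable, leaves the rest alone."""
    def __init__(self, var, increment):
        self.var = var
        self.increment = increment

    def step(self, point):
        new = dict(point)             # step methods work on a copy of the point
        new[self.var] += self.increment
        return new

class ToyCompoundStep:
    """Stand-in for a compound step: applies each method in order."""
    def __init__(self, methods):
        self.methods = methods

    def step(self, point):
        for method in self.methods:   # iterate the list of methods
            point = method.step(point)
        return point

compound = ToyCompoundStep([ToyStep("a", 1), ToyStep("b", 10)])
point = {"a": 0, "b": 0}
point = compound.step(point)
print(point)  # {'a': 1, 'b': 10}
```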
Take a simple example:
```python
import numpy as np
import theano
import pymc3 as pm

n_ = theano.shared(np.asarray([10, 15]))
with pm.Model() as m:
    p = pm.Beta('p', 1., 1.)
    ni = pm.Bernoulli('ni', .5)
    k = pm.Binomial('k', p=p, n=n_[ni], observed=4)
```
Now specify the steps:

```python
with m:
    step1 = pm.Metropolis([m.free_RVs[0]])
    step2 = pm.BinaryGibbsMetropolis([ni])
```
And now you can pass a point to a step and see what happens:

```python
point = m.test_point
point
# {'ni': array(0), 'p_logodds__': array(0.)}

point, state = step1.step(point=point)
point, state
# ({'ni': array(0), 'p_logodds__': array(0.69089502)},
#  [{'accept': 0.8832003265520174, 'tune': True}])
```
As you can see, the value of `ni` does not change, but `p_logodds__` is updated, since `step1` only handles the continuous variable.
And similarly, you can get a sample using `step2` (notice that `step2` has no `generates_stats`, so it outputs only the point):

```python
point = step2.step(point=point)
point
# {'ni': array(0), 'p_logodds__': array(0.69089502)}
```
A compound step works exactly like this, iterating over all the steps in its list. In effect, it is Metropolis-Hastings-within-Gibbs sampling.
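To make that concrete, here is a self-contained Metropolis-within-Gibbs loop for the same toy model, written in plain Python rather than through PyMC3. Everything here (`logp`, the step functions, the fixed proposal scale, the number of draws) is my own simplified sketch, not the library's implementation; it only illustrates the structure of alternating a Metropolis update on `p_logodds__` with an exact Gibbs update of `ni`:

```python
import math
import random

n_ = [10, 15]
k_obs = 4

def log_binom(k, n, p):
    # log Binomial(k | n, p), with the coefficient computed via lgamma
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

def logp(point):
    # Beta(1, 1) on p and Bernoulli(0.5) on ni are flat priors (constants),
    # so only the likelihood matters for the acceptance ratio
    p = 1.0 / (1.0 + math.exp(-point["p_logodds__"]))
    return log_binom(k_obs, n_[point["ni"]], p)

def metropolis_step(point):
    # random-walk Metropolis on the transformed continuous variable
    proposal = dict(point)
    proposal["p_logodds__"] += random.gauss(0, 1)
    if math.log(random.random()) < logp(proposal) - logp(point):
        return proposal
    return point          # rejected: output the original input point

def gibbs_step(point):
    # exact Gibbs update of the binary variable from its full conditional
    weights = [math.exp(logp({**point, "ni": v})) for v in (0, 1)]
    u = random.random() * sum(weights)
    point = dict(point)
    point["ni"] = 0 if u < weights[0] else 1
    return point

random.seed(0)
point = {"ni": 0, "p_logodds__": 0.0}
trace = []
for _ in range(2000):
    point = metropolis_step(point)  # update p_logodds__, ni held fixed
    point = gibbs_step(point)       # update ni, p_logodds__ held fixed
    trace.append(point)
```

The outer loop plays the role of the compound step: each pass hands the current point to one method after the other, and each method only touches its own variables.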