Ugh.
What about a model factory approach? A model factory function could build either a backward version of the model, in which a Binomial is tied to the observed data, or a forward version, in which a Deterministic wraps a Theano binomial draw to simulate:
if direction == 'backward':
    # Likelihood tied to the observed win counts.
    won = pm.Binomial('won', n=foo, p=bar, observed=observed_won)
elif direction == 'forward':
    # Deterministic wrapping a Theano RandomStreams binomial draw.
    won = pm.Deterministic('won', self._rng.binomial(n=foo, p=bar))
else:
    raise ValueError(f'Bad direction: {direction}')
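For concreteness, here is a minimal, self-contained sketch of such a factory, assuming PyMC3 with Theano, a hypothetical make_model name, a Beta prior standing in for bar (the win probability), and a fixed trial count standing in for foo:

import pymc3 as pm
from theano.tensor.shared_randomstreams import RandomStreams

def make_model(direction, n_trials, observed_won=None, seed=42):
    # Theano random stream, used only by the forward (simulation) branch.
    rng = RandomStreams(seed=seed)
    with pm.Model() as model:
        # Placeholder prior on the win probability (stand-in for `bar`).
        p_win = pm.Beta('p_win', alpha=1.0, beta=1.0)
        if direction == 'backward':
            # Backward: Binomial likelihood tied to the observed data.
            pm.Binomial('won', n=n_trials, p=p_win, observed=observed_won)
        elif direction == 'forward':
            # Forward: Deterministic wrapping a Theano binomial draw.
            pm.Deterministic('won', rng.binomial(n=n_trials, p=p_win))
        else:
            raise ValueError(f'Bad direction: {direction}')
    return model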
Then the approach you explained elsewhere could be employed:
with backward_model:
    # Convert the forward trace to a list of parameter points for the PPC.
    df = pm.trace_to_dataframe(
        forward_trace, varnames=[...], include_transformed=True)
    ppc = pm.sample_posterior_predictive(
        trace=df.to_dict('records'), samples=len(df))
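For context, backward_model and forward_trace would presumably be set up before that snippet; a hypothetical sketch, assuming the make_model factory above and observed data observed_won (whether pm.sample handles the RandomStreams draw inside the Deterministic cleanly is part of the open question):

forward_model = make_model('forward', n_trials=10)
backward_model = make_model('backward', n_trials=10, observed_won=observed_won)

with forward_model:
    # Draws of p_win plus the simulated 'won' values.
    forward_trace = pm.sample(1000)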
Will this approach work?