Deleting / replacing RVs

Is it possible to replace an RV after it has been created in the usual way, e.g. var = pm.Beta('var', 1, 1)?

Suppose that later on I wish to redefine it as var = pm.Normal('var', 0, 1).

Currently, doing so raises an error:

ValueError: Variable name var already exists.

I can’t see how to delete an RV from PyMC3 either.

Is this currently possible in PyMC3? I suspect there may be an issue in implementing this sort of functionality due to the way in which PyMC3 deals with RVs, automatically building them into pm.Model().
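For what it's worth, here is a toy sketch of why the collision happens (this is an assumption about the mechanism, not PyMC3's actual source): the model context keeps a registry of named variables and refuses duplicates, so re-running the same name hits the registry check. The ToyModel class and its method names are made up for the illustration.

```python
# Toy model that mimics the name registry behavior (hypothetical sketch,
# not PyMC3 internals).
class ToyModel:
    def __init__(self):
        self.named_vars = {}

    def add_random_variable(self, name, dist):
        # A duplicate name is rejected rather than overwritten.
        if name in self.named_vars:
            raise ValueError(f"Variable name {name} already exists.")
        self.named_vars[name] = dist

model = ToyModel()
model.add_random_variable("var", "Beta(1, 1)")
try:
    model.add_random_variable("var", "Normal(0, 1)")
except ValueError as e:
    print(e)  # Variable name var already exists.
```

The point of the sketch is that nothing in the registry supports removal, which matches the behavior described above: once registered, the first RV wins.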

Thoughts?

I think you can use theano.clone to modify the underlying theano computational graph, but I don’t know exactly how to do that either.
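To make the idea concrete without pulling in theano itself, here is a minimal pure-Python sketch of what "clone with replacement" means on an expression graph: copy the graph while substituting one node for another, leaving the original intact. The Node class and clone function here are invented for the illustration; theano.clone operates on theano variables, not these toy objects.

```python
import operator

# Toy expression graph node: a leaf (op=None) reads its value from an
# environment dict; an interior node applies `op` to its children.
class Node:
    def __init__(self, name, op=None, inputs=()):
        self.name, self.op, self.inputs = name, op, tuple(inputs)

    def eval(self, env):
        if self.op is None:
            return env[self.name]
        return self.op(*(child.eval(env) for child in self.inputs))

def clone(node, replace):
    # Return a copy of the graph with nodes swapped per the `replace`
    # mapping; the original graph is left untouched.
    if node in replace:
        return replace[node]
    if node.op is None:
        return node
    return Node(node.name, node.op, [clone(c, replace) for c in node.inputs])

x, y, z = Node("x"), Node("y"), Node("z")
graph = Node("sum", operator.add, [x, y])   # represents x + y
new_graph = clone(graph, {y: z})            # represents x + z

print(graph.eval({"x": 1, "y": 2}))         # 3
print(new_graph.eval({"x": 1, "z": 10}))    # 11
```

This mirrors the shape of the real API, where you would pass something like a replace mapping of old variable to new variable, but as the answers below explain, doing this to a PyMC3 model's logp graph is not enough on its own.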

@junpenglao @lucianopaz I’m interested in doing what @cvigoe described. Do we now know whether or not this is possible with theano.clone? If so, do you know of an example illustrating this or can you sketch the syntax?

@npschafer, no, it is not possible at the moment. You can clone a theano computational graph and replace a node in it with another one. The problem is that a pymc3 Model has a computational graph that only represents the calculations underlying the logp, but the distributions are almost completely disconnected from this graph after each RV is created. This means that even if you managed to change the model’s logp using clone, this would only have an effect when you try to do inference (sample), but it wouldn’t do anything when you do forward sampling (sample_prior_predictive or sample_posterior_predictive).

This will change once we move pymc random variables to use RandomVariable operators (this work was introduced in symbolic pymc and is being refined in theano-pymc itself). Once we are able to do that, the pymc model will have the full computational graph responsible for both inference and forward sampling, so it will be possible to clone and replace nodes in it, as you would with any other theano graph. In fact, symbolic pymc showed that this could be used to implement automatic model reparametrizations with more robust sampling behavior.


Thanks for the thorough explanation, @lucianopaz. Much appreciated!