PyTorch backend for PyMC4

Basing a Bayesian framework on a dynamic computational graph creates one major challenge: providing a clean user interface for expressing models. In return, one gets the inherent flexibility and ease of use of a dynamic computational graph.

It becomes orders of magnitude easier to implement complicated algorithms. It took me less than an hour to implement NUTS from scratch in PyTorch, while I still have not seen any NUTS implementations in TF, despite two existing Bayesian frameworks already being built on TF. This certainly extends to more cutting-edge VI applications as well as just more flexible Bayesian models.
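To give a sense of how little code this takes, here is a rough sketch (plain HMC rather than full NUTS, and not an actual PyMC4 implementation) of a single sampling step where the gradient of the log density comes straight from autograd; the names `hmc_step`, `log_prob_fn`, etc. are made up for illustration:

```python
import torch

def grad_log_prob(log_prob_fn, q):
    # Gradient of the log density via autograd -- no graph compilation needed.
    q = q.detach().requires_grad_(True)
    return torch.autograd.grad(log_prob_fn(q), q)[0]

def hmc_step(q0, log_prob_fn, step_size=0.1, n_steps=20):
    q = q0.detach()
    p0 = torch.randn_like(q)                                        # sample momentum
    p = p0 + 0.5 * step_size * grad_log_prob(log_prob_fn, q)        # half momentum step
    for i in range(n_steps):
        q = q + step_size * p                                       # full position step
        if i != n_steps - 1:
            p = p + step_size * grad_log_prob(log_prob_fn, q)       # full momentum step
    p = p + 0.5 * step_size * grad_log_prob(log_prob_fn, q)         # final half step

    # Metropolis accept/reject on the joint (position, momentum) energy.
    def energy(q, p):
        return -log_prob_fn(q) + 0.5 * (p ** 2).sum()
    accept = torch.rand(()) < torch.exp(energy(q0, p0) - energy(q, p))
    return q if accept else q0

# Toy usage: draw from a standard normal target.
log_prob = lambda q: -0.5 * (q ** 2).sum()
q = torch.zeros(2)
samples = [q := hmc_step(q, log_prob) for _ in range(500)]
```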

One added benefit of PyTorch is that it aims to provide an interface that is a clean drop-in replacement for NumPy, i.e. replace `import numpy as np` with `import torch as np` and it should just work. PyTorch is not alone in using NumPy as the guideline for its interface. This means it would be possible to support multiple backends as long as their syntax is close enough to NumPy/PyTorch, i.e. support PyTorch, Chainer, and MXNet (Gluon) as different backends.
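As a sketch of what that backend-agnostic style could look like, here is a toy log-density written only against numpy-style module functions, so whichever array module is passed in as `xp` does the work; the function and model are made up for illustration:

```python
import math
import numpy
import torch

def gaussian_logp(xp, x, mu, sigma):
    # Log density of independent Normal(mu, sigma) observations, using only
    # operations shared by numpy-compatible backends (sum, broadcasting, **).
    return xp.sum(-0.5 * ((x - mu) / sigma) ** 2) \
        - x.shape[0] * (0.5 * math.log(2 * math.pi) + math.log(sigma))

# Same code, two backends.
print(gaussian_logp(numpy, numpy.array([0.3, -1.2, 0.8]), 0.0, 1.0))
print(gaussian_logp(torch, torch.tensor([0.3, -1.2, 0.8]), 0.0, 1.0))
```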
