Weird random number generation pattern after training a pymc3 model

I was following the Bayesian Weibull tutorial here: Bayesian Parametric Survival Analysis with PyMC3 — PyMC3 3.11.4 documentation. However, in my application I need to wrap the whole training code in a custom Python function. I also need to generate some random points in the same function before training the model:

import numpy as np
import pymc3 as pm
import theano.tensor as tt
from theano import shared

def gumbel_sf(y, μ, σ):
    return 1.0 - tt.exp(-tt.exp(-(y - μ) / σ))

def generate_dist():
    return np.random.choice([0, 1], size=500)

def tutorial_model(X, y_std, cens):
    dist = generate_dist()
    VAGUE_PRIOR_SD = 5.0
    with pm.Model() as weibull_model:
        β = pm.Normal("β", 0.0, VAGUE_PRIOR_SD, shape=2)
    X_ = shared(X)

    with weibull_model:
        η = β.dot(X_.T)

    with weibull_model:
        s = pm.HalfNormal("s", 5.0)


    cens_ = shared(cens)

    with weibull_model:
        y_obs = pm.Gumbel("y_obs", η[~cens_], s, observed=y_std[~cens])

    with weibull_model:
        y_cens = pm.Potential("y_cens", gumbel_sf(y_std[cens], η[cens_], s))

    SEED = 845199  # for reproducibility

    SAMPLE_KWARGS = {"chains": 3, "tune": 100, "random_seed": [SEED, SEED + 1, SEED + 2]}

    with weibull_model:
        weibull_trace = pm.sample(**SAMPLE_KWARGS)
    return weibull_model, weibull_trace, dist

The problem is that the randomly generated array, dist, stops changing after the second function call. For example, if I call the function five times, the dist values are random for the first two calls, but from the third call through the fifth, the dist array is exactly the same as in the second call. What is even stranger is that this happens only when the PyMC3 model is actually trained to generate the trace. I verified this by commenting out all the PyMC3-related code, just to make sure I was not misusing the NumPy random functions myself.
I have been stuck on this problem for almost two weeks now; any help would be appreciated :pray:

The root of the issue is the global seeding of the legacy NumPy random generator. PyMC internally calls np.random.seed within sample (to initialize the NUTS sampler, within the multiple chains…), and that affects all subsequent calls to np.random.<distribution>, like the np.random.choice you use to generate dist. This is why commenting out the PyMC sampling code "gets rid of the issue".
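A minimal sketch of the mechanism, using only NumPy. Here `train_model_stub` is a hypothetical stand-in for `pm.sample` with a fixed `random_seed`; the point is only that reseeding the global state makes subsequent `np.random.choice` calls repeat:

```python
import numpy as np

def generate_dist():
    # Legacy API: draws from the hidden global RandomState.
    return np.random.choice([0, 1], size=500)

def train_model_stub():
    # Stand-in for pm.sample(random_seed=...), which reseeds
    # the same global state internally.
    np.random.seed(845199)

first = generate_dist()    # depends on whatever the global state was
train_model_stub()
second = generate_dist()   # drawn right after seeding with 845199
train_model_stub()
third = generate_dist()    # global state reset to 845199 again

print(np.array_equal(second, third))  # True: identical draws
```

Because both `second` and `third` are drawn immediately after the global state is reset to the same seed, they are identical, which is exactly the repetition pattern described above.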

Using np.random.seed and np.random.<distribution> is legacy NumPy code; you should use the new Generator API instead, see Random sampling (numpy.random) — NumPy v1.22 Manual.

Seeding will work differently in v4 (currently in beta and, IIRC, without properly working seeding for now), but PyMC3 3.x has already been frozen and will not be updated to stop using the legacy random generators.


Hi @OriolAbril
Apologies for the late reply. I tried the new generators as you mentioned, and thankfully it worked. Now I will have to change all the NumPy legacy generators in the source code of the original libraries that use some form of random sampling :frowning:
Also, I would definitely suggest updating the PyMC3 documentation to mention this bug/weird behaviour. It would be useful for other researchers working with this library.
Again, I can't thank you enough.


Feel free to send a PR to v3 with the note/warning. It can be weird and unexpected, but np.random.<distribution> relies on a global state that can be seeded; it is not something specific to PyMC. While I agree updating the docs would be useful, v4 will arguably be more useful and we are quite stretched. We don't have enough bandwidth to keep maintaining the v3 docs.

Sure, I will do so whenever I get some time. However, you mentioned that v4 will be more useful. According to this thread, Is PyMC4 dead or just resting? - #4 by ericmjl, development of PyMC4 is officially dead.

v4 is not PyMC4. PyMC4 was an experiment to see what using TensorFlow as a backend would look like. By v4 I mean PyMC 4.0 (currently in beta release), which maintains most of the PyMC3 API and is built on top of Aesara (the evolution of Theano). It can compile PyMC models to C (like Theano), but also to JAX or Numba.

For more context see The Future of PyMC3, or: Theano is Dead, Long Live Theano | by PyMC Developers | Medium (link at the end of the thread you mention).
