Behrens' Bayesian Learner Model: How to share parameters between steps?

I've run into a new problem: the r.append(pm.Beta('r')) approach generates so many RVs that compilation fails with "fatal error: bracket nesting level exceeded maximum of 256" :cry:

Here is the full report:

Exception: ('Compilation failed (return status=1): ~/.theano/compiledir_macOS-10.16-x86_64-i386-64bit-i386-3.8.10-64/tmp_kphrv89/mod.cpp:30692:32: fatal error: bracket nesting level exceeded maximum of 256.         if (!PyErr_Occurred()) {.                                ^. ~/.theano/compiledir_macOS-10.16-x86_64-i386-64bit-i386-3.8.10-64/tmp_kphrv89/mod.cpp:30692:32: note: use -fbracket-depth=N to increase maximum nesting level. 1 error generated.. ', "FunctionGraph(MakeVector{dtype='float64'}(r0, r1, r2, ...... , r678, r679))")

I also set theano.config.gcc.cxxflags = "-fbracket-depth=2048" and sys.setrecursionlimit(10**6), but then Python raised another error:

maximum recursion depth exceeded while calling a Python object
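For completeness, this is roughly how I set those limits (at the top of the script, before the model is built and sampled):

import sys
import theano

# Raise the C compiler's bracket-nesting limit and Python's recursion limit
# before the graph gets compiled at sampling time.
theano.config.gcc.cxxflags = "-fbracket-depth=2048"
sys.setrecursionlimit(10**6)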

So I think a for loop may not be a good way to build this model.
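For context, the loop-based attempt looked roughly like this (a simplified sketch with the same priors and simulated env as the scan version below; the real run had more trials, which is why the error above mentions r0 … r679):

import numpy as np
import pymc3 as pm
import theano.tensor as T
from scipy import stats

p = np.repeat([.75, .75, .75, .25, .75, .25], 18)
env = stats.binom.rvs(1, p)

with pm.Model() as loop_model:
    k  = pm.Normal("k", mu=1, sigma=0.5, testval=0.6)
    k_ = pm.Deterministic('k_cap', pm.math.exp(k))
    v  = pm.GaussianRandomWalk("v", mu=0.07, sigma=k_, testval=0.05, shape=len(env))
    v_ = pm.Deterministic('v_cap', pm.math.exp(v))

    # one named Beta RV per trial, each parameterised by the previous one
    r = [pm.Beta('r0', alpha=1, beta=1, testval=0.5)]
    for i in range(1, len(env)):
        w = r[-1]
        kk = 1 / v_[i]
        r.append(pm.Beta('r' + str(i),
                         alpha=w*(kk - 2) + 1,
                         beta=(1 - w)*(kk - 2) + 1,
                         testval=0.5))

    y = pm.Bernoulli("y", p=T.stack(r), observed=env)
    # trace = pm.sample()  # compiling this graph is what produced the bracket-nesting error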

I tried to use theano.scan() instead of the for loop to solve this problem. Here is my code:

import numpy as np
import pandas as pd
import pymc3 as pm
import theano
import theano.tensor as T
from scipy import stats

# Simulated environment: the reward probability alternates between .75 and .25
# in blocks of 18 trials; env holds the binary outcomes.
p = np.repeat([.75, .75, .75, .25, .75, .25], 18)
env = stats.binom.rvs(1, p)
choose = list(map(bool, env))

observed_data = pd.DataFrame({"choose": choose,
                              "index": range(len(env))})

with pm.Model() as bayesian_lerner_model:
    # k controls (on a log scale) how quickly the volatility can change
    k  = pm.Normal("k", mu=1, sigma=0.5, testval=0.6)
    k_ = pm.Deterministic('k_cap', pm.math.exp(k))
    # latent log-volatility, evolving as a Gaussian random walk (one value per trial)
    v  = pm.GaussianRandomWalk("v", mu=0.07, sigma=k_, testval=0.05, shape=len(env))
    v_ = pm.Deterministic('v_cap', pm.math.exp(v))

    # initial reward probability
    r0 = pm.Beta('r0', mu=1, sigma=1)
    step = T.as_tensor_variable(np.array(range(len(env))))

    # one learner step: the new r is a Beta whose mode is the previous r (w)
    # and whose concentration is k = 1/v_cap for that trial
    def update_r(v_, step, r0):
        w = r0
        k = 1 / v_
        r1 = pm.Beta('r' + str(step), alpha=w*(k-2) + 1, beta=(1-w)*(k-2) + 1, testval=0.5)
        return r1

    result, updates = theano.scan(fn=update_r,
                                  sequences=[v_, step],
                                  outputs_info=r0)

    y = pm.Bernoulli("y", p=result, observed=env)

    trace = pm.sample(init="adapt_diag")
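As a side note on the Beta parameterisation in update_r: with alpha = w*(k-2) + 1 and beta = (1-w)*(k-2) + 1, the total concentration is alpha + beta = k and the mode is exactly w, so each new r should sit near the previous value, with 1/v_cap controlling how tightly. A quick standalone scipy check (outside the model, purely illustrative values):

from scipy import stats

w, k = 0.7, 50.0
alpha = w * (k - 2) + 1
beta = (1 - w) * (k - 2) + 1
dist = stats.beta(alpha, beta)
print(alpha + beta)                      # 50.0 (= k)
print((alpha - 1) / (alpha + beta - 2))  # ~0.7 (mode = w)
print(dist.std())                        # small when k is large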

Unfortunately, I still don't understand how to use theano.scan() to create RVs, so I get the following error:

MissingInputError                         Traceback (most recent call last)
<ipython-input-24-7137d10b5bb4> in <module>
     29     y = pm.Bernoulli("y", p = result, observed = env)
     30 
---> 31     trace = pm.sample(init="adapt_diag")

...

MissingInputError: Input 1 of the graph (indices start from 0), used to compute Elemwise{true_div,no_inplace}(TensorConstant{1}, v_cap[t]), was not provided and not given a value. Use the Theano flag exception_verbosity='high', for more information on this error.