Passing multiple observed parameters to CustomDist

Hi,

I’m following this tutorial (Fitting a Reinforcement Learning Model to Behavioral Data with PyMC — PyMC example gallery)
and I am trying to pass two matrices as observed parameters to a CustomDist logp function.
This is my code:

def logp(learning_rate, inv_temp, choices, outcomes):
    choices_ = pt.as_tensor_variable(choices, dtype='int32')
    outcomes_ = pt.as_tensor_variable(outcomes, dtype='int32')

    beliefs = 0.5 * pt.ones((4,), dtype='float64')
    choice_probs_ = 0.5 * pt.ones((1,), dtype='float64')

    [beliefs, choice_probs], updates = scan(
        fn=update_belief, 
        sequences=[choices_, outcomes_],
        non_sequences=[learning_rate, inv_temp],
        outputs_info=[beliefs, choice_probs_]
    )

    # logp must return a symbolic expression, so use pt rather than np
    ll = pt.sum(pt.log(choice_probs))
    return ll

with pm.Model() as m:
    # n_trial x 1
    choices_ = pm.ConstantData('choices_', choices)
    # n_trial x 4
    outcomes_ = pm.ConstantData('outcomes_', outcomes)

    alpha = pm.Beta(name="alpha", alpha=1, beta=1)
    beta = pm.HalfNormal(name="beta", sigma=10)

    like = pm.CustomDist('like', alpha, beta, logp=logp, observed={'choices':choices_, 'outcomes':outcomes_})

and I get the following error

TypeError: Since v4.0.0 the observed parameter should be of type pd.Series, np.array, or pm.Data. Previous versions allowed passing distribution parameters as a dictionary in observed, in the current version these parameters are positional arguments.

Is it possible in PyMC v4.0+ to pass multiple observed parameters to my logp function?
I can’t find any recent example of it.

Thanks 🙂
Filippo

You cannot. You can pass the other variable as a constant parameter, or you can stack the two variables in a matrix (if they have compatible shapes) and separate them in the logp function.

Also note that the observed variable is the first argument passed to the logp function. It’s all position based, not name based.
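To illustrate the stacking idea: since choices is (n_trials, 1) and outcomes is (n_trials, 4), you can concatenate them column-wise into a single (n_trials, 5) matrix, pass that matrix as the one observed value, and slice the columns apart inside logp (where the observed matrix arrives as the first positional argument). The shapes here are taken from the comments in the original model; everything else is a minimal NumPy sketch of the stack/split round trip, not the full model:

```python
import numpy as np

# Assumed shapes from the thread: choices is (n_trials, 1), outcomes is (n_trials, 4)
n_trials = 6
rng = np.random.default_rng(0)
choices = rng.integers(0, 4, size=(n_trials, 1))
outcomes = rng.integers(0, 2, size=(n_trials, 4))

# Stack into one (n_trials, 5) matrix and pass it as the single `observed` value
data = np.concatenate([choices, outcomes], axis=1)

# Inside logp the same slicing works on the PyTensor tensor, e.g.:
# def logp(value, learning_rate, inv_temp):
#     choices_ = pt.cast(value[:, 0], "int32")
#     outcomes_ = pt.cast(value[:, 1:], "int32")
choices_back = data[:, :1]
outcomes_back = data[:, 1:]
```

The same slicing recovers both original arrays, so no information is lost by stacking.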

Thanks!
Stacking the variables into a single matrix worked great.

Filippo