Composing models

I need to generate a number of models, all variations on a theme. I can’t hand-code them all, given the large number of possible combinations, so I’m asking whether it’s possible to compose models in PyMC.

It seems like it is, based on issue #1553 in the repo, but I couldn’t make much sense of it — probably because I’m still a beginner with Python. If it is possible to compose models, I’d really appreciate a quick demo. Here’s a concrete example:

```python
import math

import numpy as np
import pymc3 as pm

A = np.array([28, 34, 19, 41, 27, 54, 15, 67, 20, 54, 22, 49, 24, 78, 11, 25, 54, 69, 47, 33, 40, 31, 34, 25, 80, 14, 55])
DA = np.array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
B = np.array([30, 35, 25, 75, 50, 55, 35, 75, 55, 60, 25, 60, 35, 80, 30, 60, 80, 85, 50, 80, 55, 85, 50, 30, 85, 25, 75])
DB = np.array([179, 186, 53, 20, 21, 117, 13, 119, 7, 111, 136, 89, 29, 162, 7, 14, 30, 91, 160, 14, 62, 7, 30, 80, 157, 19, 61])
R_obs = np.array([0, 0, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0])


def cumulative_normal(x):
    return 0.5 + 0.5 * pm.math.erf(x / pm.math.sqrt(2))


def hyperbolic(logk, reward, delay):
    return reward / (1.0 + pm.math.exp(logk) * delay)


def hyperboloid(logk, S, reward, delay):
    return reward / (1.0 + pm.math.exp(logk) * delay)**S


def choice_psychometric(α, ϵ, VA, VB):
    return ϵ + (1.0 - 2.0 * ϵ) * cumulative_normal((VB - VA) / α)


with pm.Model() as model_hyperbolic:
    # discount function stuff ~~~~~~~~~~~~~~~~~~~~~~~~~~
    # prior
    logk = pm.Normal('logk', mu=math.log(1/50), sd=5)
    # value functions
    VA = pm.Deterministic('VA', hyperbolic(logk, A, DA))
    VB = pm.Deterministic('VB', hyperbolic(logk, B, DB))
    # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    alpha = pm.Exponential('alpha', lam=1)
    epsilon = pm.Beta('epsilon', alpha=1.1, beta=10.9)
    P_chooseB = pm.Deterministic('P_chooseB', choice_psychometric(alpha, epsilon, VA, VB))

    # likelihood of observations
    R = pm.Bernoulli('R', p=P_chooseB, observed=R_obs)

with pm.Model() as model_hyperboloid:
    # discount function stuff ~~~~~~~~~~~~~~~~~~~~~~~~~~
    # priors
    logk = pm.Normal('logk', mu=math.log(1/50), sd=5)
    S = pm.Normal('S', 1, sd=0.001)
    # value functions
    VA = pm.Deterministic('VA', hyperboloid(logk, S, A, DA))
    VB = pm.Deterministic('VB', hyperboloid(logk, S, B, DB))
    # ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    alpha = pm.Exponential('alpha', lam=1)
    epsilon = pm.Beta('epsilon', alpha=1.1, beta=10.9)
    P_chooseB = pm.Deterministic('P_chooseB', choice_psychometric(alpha, epsilon, VA, VB))

    # likelihood of observations
    R = pm.Bernoulli('R', p=P_chooseB, observed=R_obs)
```

So here I have just two models, which use either the `hyperbolic` or the `hyperboloid` deterministic function. This is a minimal example; for the real task there will be many discount functions (each with their own priors), and there will also be plug-and-play link functions.

First I thought of just injecting the desired function, but then realised this won’t work, as each function comes with a different number (and different names) of parameters.

PS: On a separate note, I really like being able to define the functions outside the model. That way I can reuse them if I want to do visualisation using prior or posterior samples.
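(For example, a NumPy twin of the model’s `hyperbolic` can be broadcast over samples of `logk` to plot discount curves — a sketch, where `logk_samples` stands in for something like `trace['logk']`:)

```python
import numpy as np

def hyperbolic_np(logk, reward, delay):
    # NumPy twin of the model's hyperbolic(), usable on prior/posterior samples
    return reward / (1.0 + np.exp(logk) * delay)

# hypothetical usage: one discount curve per sample of logk
logk_samples = np.array([-3.9, -4.0, -4.1])   # stand-in for trace['logk']
delays = np.linspace(0, 365, 50)
curves = hyperbolic_np(logk_samples[:, None], 100.0, delays[None, :])
# curves has shape (n_samples, n_delays), ready for plotting
```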

I think I’d write a function that builds the model, plus some functions that add the parameters required by each link function. Something along these lines:

```python
def add_hyperbolic_link(logk, A, DA, B, DB):
    VA = pm.Deterministic('VA', hyperbolic(logk, A, DA))
    VB = pm.Deterministic('VB', hyperbolic(logk, B, DB))

    alpha = pm.Exponential('alpha', lam=1)
    epsilon = pm.Beta('epsilon', alpha=1.1, beta=10.9)
    P_chooseB = pm.Deterministic('P_chooseB', choice_psychometric(alpha, epsilon, VA, VB))
    return P_chooseB


def add_hyperboloid_link(logk, A, DA, B, DB):
    S = pm.Normal('S', 1, sd=0.001)
    VA = pm.Deterministic('VA', hyperboloid(logk, S, A, DA))
    VB = pm.Deterministic('VB', hyperboloid(logk, S, B, DB))

    alpha = pm.Exponential('alpha', lam=1)
    epsilon = pm.Beta('epsilon', alpha=1.1, beta=10.9)
    P_chooseB = pm.Deterministic('P_chooseB', choice_psychometric(alpha, epsilon, VA, VB))
    return P_chooseB


def build_model(R_obs, A, DA, B, DB, df='hyperbolic'):
    with pm.Model() as model:
        logk = pm.Normal('logk', mu=math.log(1/50), sd=5)
        if df == 'hyperbolic':
            P_chooseB = add_hyperbolic_link(logk, A, DA, B, DB)
        else:
            P_chooseB = add_hyperboloid_link(logk, A, DA, B, DB)

        R = pm.Bernoulli('R', p=P_chooseB, observed=R_obs)
    return model


model = build_model(R_obs, A, DA, B, DB, df='hyperbolic')
```

On a side note: you might want to use `tt.nnet.sigmoid` in some of the functions; that might be more numerically stable:

```python
import theano.tensor as tt

def hyperbolic(logk, reward, delay):
    return reward * tt.nnet.sigmoid(-logk - tt.log(delay))
```
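(The two forms are algebraically equivalent for `delay > 0`, since `sigmoid(-x) = 1 / (1 + exp(x))`, so `reward * sigmoid(-logk - log(delay)) = reward / (1 + exp(logk) * delay)`. A quick NumPy check of that identity:)

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

logk, reward = np.log(1 / 50), 100.0
delay = np.array([7.0, 30.0, 365.0])     # identity holds for delay > 0
direct = reward / (1.0 + np.exp(logk) * delay)
via_sigmoid = reward * sigmoid(-logk - np.log(delay))
assert np.allclose(direct, via_sigmoid)
```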

Thanks for that, very useful.

I’m getting some divergent samples (not fixed by your alternate `hyperbolic` function, so the issue lies elsewhere), and I’m having some issues with a softmax link function that has a temperature parameter. But these are separate problems. A plain softmax without a temperature parameter seems to work:

```python
def choice_softmax_temp(temperature, VA, VB):
    VA = VA / temperature
    VB = VB / temperature
    p_choose_B = pm.math.exp(VB) / (pm.math.exp(VA) + pm.math.exp(VB))
    return p_choose_B
```
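(Incidentally, with only two options that softmax reduces to a logistic of the scaled value difference, which avoids exponentiating large positive values — a possible source of the instability. A sketch of the safer form in NumPy:)

```python
import numpy as np

def choice_softmax_temp_stable(temperature, VA, VB):
    # softmax over two options == logistic of the scaled difference;
    # exponentiating -|d| never overflows, unlike exp(VB / temperature)
    d = (np.asarray(VB) - np.asarray(VA)) / temperature
    e = np.exp(-np.abs(d))
    return np.where(d >= 0, 1.0 / (1.0 + e), e / (1.0 + e))
```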

In terms of model composition, I’ve used your basic approach but rearranged it a bit to make it clearer (from where I’m coming from).

```python
def build_model(R_obs, A, DA, B, DB, df='hyperbolic', link='psychometric'):
    with pm.Model() as model:
        if df == 'hyperbolic':
            logk = pm.Normal('logk', mu=math.log(1/50), sd=5)
            # value functions
            VA = pm.Deterministic('VA', hyperbolic(logk, A, DA))
            VB = pm.Deterministic('VB', hyperbolic(logk, B, DB))
        elif df == 'hyperboloid':
            logk = pm.Normal('logk', mu=math.log(1/50), sd=5)
            S = pm.Normal('S', 1, sd=0.01)  # todo: need better prior here!
            # value functions
            VA = pm.Deterministic('VA', hyperboloid(logk, S, A, DA))
            VB = pm.Deterministic('VB', hyperboloid(logk, S, B, DB))

        if link == 'psychometric':
            alpha = pm.Exponential('alpha', lam=1)
            epsilon = pm.Beta('epsilon', alpha=1.1, beta=10.9)
            P_chooseB = pm.Deterministic('P_chooseB', choice_psychometric(alpha, epsilon, VA, VB))
        elif link == 'softmax_fixed':
            P_chooseB = pm.Deterministic('P_chooseB', choice_softmax_temp(1.0, VA, VB))

        R = pm.Bernoulli('R', p=P_chooseB, observed=R_obs)
    return model
```

So now I can compose models, e.g.

```python
model1 = build_model(R_obs, A, DA, B, DB, df='hyperbolic')
model2 = build_model(R_obs, A, DA, B, DB, df='hyperboloid')
model3 = build_model(R_obs, A, DA, B, DB, df='hyperbolic', link='softmax_fixed')
model4 = build_model(R_obs, A, DA, B, DB, df='hyperboloid', link='softmax_fixed')
```

I think I’m happy and set to go with model composition now. I can imagine this approach getting clumsy as the number of alternative functions increases, but at that point I might try a more explicitly object-oriented approach (objects containing the deterministic functions and their relevant priors).
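A minimal sketch of that object-oriented idea (class names and method signatures are hypothetical; each object bundles its value function with a spec of the priors it needs, and takes a maths backend so the same code serves both model-building and plotting):

```python
import numpy as np

class Hyperbolic:
    """Hypothetical: the object owns both its prior spec and its value function."""
    def priors(self, pm):
        # called inside a pm.Model context; pm is passed in so this
        # file stays importable without PyMC3
        import math
        return {'logk': pm.Normal('logk', mu=math.log(1 / 50), sd=5)}

    def value(self, p, reward, delay, m=np):
        # m is the maths backend: np for plotting samples, pm.math in a model
        return reward / (1.0 + m.exp(p['logk']) * delay)

class Hyperboloid:
    def priors(self, pm):
        import math
        return {'logk': pm.Normal('logk', mu=math.log(1 / 50), sd=5),
                'S': pm.Normal('S', 1, sd=0.01)}

    def value(self, p, reward, delay, m=np):
        return reward / (1.0 + m.exp(p['logk']) * delay) ** p['S']

# Inside build_model this would become (sketch, requires pymc3):
#     p = df.priors(pm)
#     VA = pm.Deterministic('VA', df.value(p, A, DA, m=pm.math))
#     VB = pm.Deterministic('VB', df.value(p, B, DB, m=pm.math))
```

This way `build_model` never needs to know which parameters a discount function has; adding a new function means adding one class, not another `elif` branch.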

Thanks
