Hi,
I'm trying to use a PyMC variable as an input to a Theano function, so I can fit it as a free parameter in my model. For some context, the idea behind the model is that subjects are learning about the variance of a normal distribution and using it to make decisions about stimuli, and I'm modelling their response times with a multilevel regression model. The term I want to fit, the "attention scalar" (`bias_scalar_m` in the code below), is a population-level parameter (for now) which determines how much subjects are biased to believe there is more or less variance than there actually is. This is the code:
```python
with pm.Model() as full_model:
    theano.config.compute_test_value = 'off'

    ## function for an update of a gamma distribution on a single observation
    def update_variance(observed_value, alpha, beta, bias_scalar):
        alpha += .5
        beta += ((observed_value ** 2) * (1 / bias_scalar)) / 2
        # observed_variance = beta / alpha
        return alpha, beta

    ## initialize beta and alpha to 1
    beta_m = T.ones((292, 17))
    alpha_m = T.ones((292, 17))
    bias_scalar_m = pm.Normal('bias_scalar_m', 1, 1)
    observations = T.matrix('observations')

    # Symbolic description of the result
    result, updates = theano.scan(fn=update_variance,
                                  outputs_info=[alpha_m, beta_m],
                                  sequences=observations,
                                  non_sequences=bias_scalar_m)
    run_learner = theano.function(inputs=[observations, bias_scalar_m],
                                  outputs=result, updates=updates)
    alphas, betas = run_learner(lv_ct_array, bias_scalar_m)
    learned_errors = np.sqrt(betas[:, 1, :].transpose().flatten()
                             / alphas[:, 1, :].transpose().flatten())

    ## random effects
    int_mu = pm.Normal('int_mu', 1, 1)
    int_sd = pm.HalfCauchy('int_sd', 5)
    lvCT_mu = pm.Normal('lvCT_mu', 1, 1)
    lvCT_sd = pm.HalfCauchy('lvCT_sd', 5)
    intercept = pm.Normal('intercept', mu=int_mu, sd=int_sd, shape=subjs)
    lvCT = pm.Normal('lvCT', mu=lvCT_mu, sd=lvCT_sd, shape=subjs)

    # Model error
    epsilon = pm.HalfCauchy('epsilon', 1e2)

    # Expected value
    y_hat = (intercept[rt_regression_data.subjs_idx]
             + lvCT[rt_regression_data.subjs_idx]
             * (rt_regression_data.lvCT / learned_errors))
    alpha_est = y_hat ** 2 / epsilon
    beta_est = y_hat / epsilon
    y_like = pm.Gamma('y_like', alpha=alpha_est, beta=beta_est,
                      observed=rt_regression_data.rt.values)

    full_model_trace = pm.sample(500, chains=2, cores=1, n_init=1000,
                                 tune=1000, target_accept=.95)
```
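For reference, the per-observation update that `theano.scan` applies above is equivalent to this plain-NumPy loop (toy data and the `bias` value are made up purely for illustration; in the real model alpha and beta are 292×17 matrices rather than scalars):

```python
import numpy as np

def update_variance(observed_value, alpha, beta, bias_scalar):
    """One update of the gamma parameters from a single observation,
    mirroring the step function passed to theano.scan."""
    alpha = alpha + 0.5
    beta = beta + (observed_value ** 2) * (1.0 / bias_scalar) / 2.0
    return alpha, beta

# toy data: 4 observations; scalar alpha/beta for simplicity
rng = np.random.default_rng(0)
observations = rng.normal(0.0, 2.0, size=4)

alpha, beta = 1.0, 1.0   # initialized to 1, as in the model
bias = 1.5               # stand-in for the attention/bias scalar
for x in observations:
    alpha, beta = update_variance(x, alpha, beta, bias)

# same quantity the model calls learned_errors
learned_error = np.sqrt(beta / alpha)
print(alpha, beta, learned_error)
```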
When I pass the variable to the function I get the following error (the traceback is from an earlier run in which `bias_scalar_m` was still named `attention_scalar_m`; the error is identical):
```
TypeError                                 Traceback (most recent call last)
<ipython-input> in <module>
     31 run_learner = theano.function(inputs=[observations, attention_scalar_m], outputs=result, updates=updates)
     32
---> 33 alphas, betas = run_learner(lv_ct_array, attention_scalar_m)
     34
     35 learned_errors = np.sqrt(betas[:,1,:].transpose().flatten()/alphas[:,1,:].transpose().flatten())

C:\Python\lib\site-packages\theano\compile\function_module.py in __call__(self, *args, **kwargs)
    811                         s.storage[0] = s.type.filter(
    812                             arg, strict=s.strict,
--> 813                             allow_downcast=s.allow_downcast)
    814
    815                     except Exception as e:

C:\Python\lib\site-packages\theano\tensor\type.py in filter(self, data, strict, allow_downcast)
     85         if isinstance(data, Variable):
     86             raise TypeError(
---> 87                 'Expected an array-like object, but found a Variable: '
     88                 'maybe you are trying to call a function on a (possibly '
     89                 'shared) variable instead of a numeric array?')

TypeError: Bad input argument with name "attention_scalar_m" to theano function with name ":31" at index 1 (0-based). Expected an array-like object, but found a Variable: maybe you are trying to call a function on a (possibly shared) variable instead of a numeric array?
```
Any suggestions on how to get this working, or how to restructure it so that it works, would be greatly appreciated! I've already tried initializing the `bias_scalar_m` variable as a matrix and as a vector, but to no avail.
Thank you in advance.