Wrapping a deterministic function based on Python classes

I am interested in doing model fitting with the Theory of Mind model implemented in the tomsup package.

Ideally, the goal would be to wrap it with theano's as_op decorator, but when I try, I run into this error:

NotImplementedError: input nd
Full traceback:

NotImplementedError                       Traceback (most recent call last)

/usr/local/lib/python3.7/dist-packages/theano/compile/function/types.py in __call__(self, *args, **kwargs)
    974                 self.fn()
--> 975                 if output_subset is None
    976                 else self.fn(output_subset=output_subset)

NotImplementedError: input nd

During handling of the above exception, another exception occurred:

NotImplementedError                       Traceback (most recent call last)

8 frames

<ipython-input-10-11c6e0567299> in <module>()
      1 with basic_model:
      2     # draw 500 posterior samples
----> 3     trace = pm.sample(500, return_inferencedata=False)

/usr/local/lib/python3.7/dist-packages/pymc3/sampling.py in sample(draws, step, init, n_init, start, trace, chain_idx, chains, cores, tune, progressbar, model, random_seed, discard_tuned_samples, compute_convergence_checks, callback, jitter_max_retries, return_inferencedata, idata_kwargs, mp_ctx, pickle_backend, **kwargs)
    426         start = deepcopy(start)
    427         if start is None:
--> 428             check_start_vals(model.test_point, model)
    429         else:
    430             if isinstance(start, dict):

/usr/local/lib/python3.7/dist-packages/pymc3/util.py in check_start_vals(start, model)
    232             )
    233
--> 234         initial_eval = model.check_test_point(test_point=elem)
    235
    236         if not np.all(np.isfinite(initial_eval)):

/usr/local/lib/python3.7/dist-packages/pymc3/model.py in check_test_point(self, test_point, round_vals)
   1382
   1383         return Series(
-> 1384             {RV.name: np.round(RV.logp(test_point), round_vals) for RV in self.basic_RVs},
   1385             name="Log-probability of test_point",
   1386         )

/usr/local/lib/python3.7/dist-packages/pymc3/model.py in <dictcomp>(.0)
   1382
   1383         return Series(
-> 1384             {RV.name: np.round(RV.logp(test_point), round_vals) for RV in self.basic_RVs},
   1385             name="Log-probability of test_point",
   1386         )

/usr/local/lib/python3.7/dist-packages/pymc3/model.py in __call__(self, *args, **kwargs)
   1559     def __call__(self, *args, **kwargs):
   1560         point = Point(model=self.model, *args, **kwargs)
-> 1561         return self.f(**point)
   1562
   1563

/usr/local/lib/python3.7/dist-packages/theano/compile/function/types.py in __call__(self, *args, **kwargs)
    989                     node=self.fn.nodes[self.fn.position_of_error],
    990                     thunk=thunk,
--> 991                     storage_map=getattr(self.fn, "storage_map", None),
    992                 )
    993             else:

/usr/local/lib/python3.7/dist-packages/theano/link/utils.py in raise_with_op(fgraph, node, thunk, exc_info, storage_map)
    506         # Some exception need extra parameter in inputs. So forget the
    507         # extra long error message in that case.
--> 508         raise exc_value.with_traceback(exc_trace)
    509
    510

/usr/local/lib/python3.7/dist-packages/theano/compile/function/types.py in __call__(self, *args, **kwargs)
    973             outputs = (
    974                 self.fn()
--> 975                 if output_subset is None
    976                 else self.fn(output_subset=output_subset)
    977             )

NotImplementedError: input nd
Apply node that caused the error: InplaceDimShuffle{x}(FromFunctionOp{wrapper}.0)
Toposort index: 7
Inputs types: [TensorType(float64, scalar)]
Inputs shapes: [()]
Inputs strides: [()]
Inputs values: [0.0]
Outputs clients: [[Elemwise{Composite{Switch(i0, (i1 * ((i2 * i3 * sqr((i4 - (i5 + (i6 * i7) + (i8 * i9))))) + i10)), i11)}}(Elemwise{Composite{Cast{int8}(GT(i0, i1))}}.0, TensorConstant{(1,) of 0.5}, TensorConstant{(1,) of -1.0}, InplaceDimShuffle{x}.0, TensorConstant{[-0.480079...33830288]}, InplaceDimShuffle{x}.0, InplaceDimShuffle{x}.0, TensorConstant{[-1.205319...39202321]}, InplaceDimShuffle{x}.0, TensorConstant{[ 0.111508...02141149]}, Elemwise{Composite{log((i0 * i1))}}.0, TensorConstant{(1,) of -inf})]]

Backtrace when the node is created (use Theano flag traceback__limit=N to make it longer):
  File "/usr/local/lib/python3.7/dist-packages/ipykernel/kernelbase.py", line 233, in dispatch_shell
    handler(stream, idents, msg)
  File "/usr/local/lib/python3.7/dist-packages/ipykernel/kernelbase.py", line 399, in execute_request
    user_expressions, allow_stdin)
  File "/usr/local/lib/python3.7/dist-packages/ipykernel/ipkernel.py", line 208, in do_execute
    res = shell.run_cell(code, store_history=store_history, silent=silent)
  File "/usr/local/lib/python3.7/dist-packages/ipykernel/zmqshell.py", line 537, in run_cell
    return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py", line 2718, in run_cell
    interactivity=interactivity, compiler=compiler, result=result)
  File "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py", line 2822, in run_ast_nodes
    if self.run_code(code, result):
  File "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py", line 2882, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-9-98b3e1716ac4>", line 11, in <module>
    mu = wrapper(alpha) + beta[0] * X1 + beta[1] * X2

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.

I reduced this to a minimal case. First, generate some data (the same as in the getting-started guide):

import arviz as az
import matplotlib.pyplot as plt
import numpy as np

# True parameter values
alpha, sigma = 1, 1
beta = [1, 2.5]

# Size of dataset
size = 100

# Predictor variable
X1 = np.random.randn(size)
X2 = np.random.randn(size) * 0.2

# Simulate outcome variable
Y = alpha + beta[0] * X1 + beta[1] * X2 + np.random.randn(size) * sigma
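(Not part of the model, but as a sanity check that the simulated data is recoverable: an ordinary least-squares fit lands in the neighbourhood of the true parameters. The seed below is added here only to make the check reproducible; the snippet above does not set one.)

```python
import numpy as np

np.random.seed(123)  # seed added for reproducibility of this check only

# same data-generating process as above
alpha, sigma = 1, 1
beta = [1, 2.5]
size = 100
X1 = np.random.randn(size)
X2 = np.random.randn(size) * 0.2
Y = alpha + beta[0] * X1 + beta[1] * X2 + np.random.randn(size) * sigma

# design matrix [1, X1, X2]; lstsq gives estimates of [alpha, beta[0], beta[1]]
A = np.column_stack([np.ones(size), X1, X2])
coef, *_ = np.linalg.lstsq(A, Y, rcond=None)
print(coef)
```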

Secondly, create a dummy class and wrap the method I wish to apply in as_op:

import theano.tensor as tt

from theano.compile.ops import as_op

class test_class():
  def __init__(self, k):
    self.k = k
  def apply(self, v):
    return self.k + v - self.k # some operation dependent on class

class_instance = test_class(k=1)

@as_op(itypes=[tt.dscalar], otypes=[tt.dscalar]) # dscalar = 64-bit float
def wrapper(v):
  return class_instance.apply(v)
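For what it's worth, outside of theano the class method behaves as an identity, though I notice its return value is a NumPy scalar rather than a 0-d array (I am not sure whether that matters to as_op). A standalone check, restating the dummy class from above:

```python
import numpy as np

# standalone restatement of the dummy class above
class test_class():
  def __init__(self, k):
    self.k = k
  def apply(self, v):
    return self.k + v - self.k # some operation dependent on class

inst = test_class(k=1)
out = inst.apply(np.array(0.5))  # pass a 0-d float64 array
print(out, type(out))            # 0.5 <class 'numpy.float64'>
```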

Lastly, create the model and (fail to) fit it:

import pymc3 as pm

basic_model = pm.Model()

with basic_model:

    # Priors for unknown model parameters
    alpha = pm.Normal("alpha", mu=0, sigma=10)
    beta = pm.Normal("beta", mu=0, sigma=10, shape=2)
    sigma = pm.HalfNormal("sigma", sigma=1)

    # Expected value of outcome
    mu = wrapper(alpha) + beta[0] * X1 + beta[1] * X2

    # Likelihood (sampling distribution) of observations
    Y_obs = pm.Normal("Y_obs", mu=mu, sigma=sigma, observed=Y)

# example fitting
with basic_model:
    trace = pm.sample(500, return_inferencedata=False)

Using a plain function as the wrapper, one that does not depend on the class, works without issue; the problem appears only once I introduce the class into the mix.
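My (unverified) suspicion is that the difference lies in what the Python function hands back to theano: a class-free wrapper that simply returns its argument passes the 0-d array through unchanged, whereas the arithmetic with the integer attribute k produces a NumPy scalar, which is not an ndarray:

```python
import numpy as np

v = np.array(0.5)      # a 0-d float64 array (what I believe as_op receives)
passthrough = v        # returning the input unchanged keeps it a 0-d ndarray
via_class = 1 + v - 1  # mimics self.k + v - self.k with k=1; yields a NumPy scalar

print(type(passthrough))  # <class 'numpy.ndarray'>
print(type(via_class))    # <class 'numpy.float64'>
```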

For everyone's convenience, I created a Google Colab notebook with code that reproduces the error.

I have also added a gist of what I imagine the actual fitting of the Theory of Mind model from the package would look like.