Help Creating Custom Op

Hi all,

I’m new to PyMC and have found the docs and forums enormously helpful. However, I haven’t been able to figure out how to build a model around a custom function (I believe this is called an Op). Every time I run my code I get the error

Kernel Restarting

My attempt to do this has been based off of two tutorials:

  1. Introductory Overview of PyMC - I’m just using a sample multiple linear regression from this tutorial to test whether I’m building the class correctly (I’m not)
  2. Using a Black Box Likelihood Function - this is helping me understand how to write a class that will let me pass Aesara tensors

The code is:

import numpy as np
import aesara.tensor as at
from aesara import function
import pymc as pm


# ------ sample data pulled from regression example
%config InlineBackend.figure_format = 'retina'
# Initialize random number generator
RANDOM_SEED = 8927
rng = np.random.default_rng(RANDOM_SEED)

# True parameter values
alpha, sigma = 1, 1
beta = [1, 2.5]

# Size of dataset
size = 100

# Predictor variable
X1 = np.random.randn(size)
X2 = np.random.randn(size) * 0.2

# Simulate outcome variable
Y = alpha + beta[0] * X1 + beta[1] * X2 + rng.normal(size=size) * sigma


# ----- code largely pulled from the Class example
# create function to call
def linear_func(theta, x1, x2):
    alpha, beta1, beta2 = theta
    return alpha + beta1 * x1 + beta2 * x2

# linear func w aesara

reg_x1 = at.dcol('reg_x1')
reg_x2 = at.dcol('reg_x2')
reg_a = at.dscalar('reg_a')
reg_b = at.dscalar('reg_b')
reg_alpha = at.dscalar('reg_alpha')

reg_out = reg_alpha + reg_x1 * reg_a + reg_x2 * reg_b
reg_func = function([reg_x1, reg_x2, reg_a, reg_b, reg_alpha], reg_out)


# define an Aesara Op for our likelihood function
class MyNewClass(at.Op):

    itypes = [at.dvector]  # expects a vector of parameter values when called
    otypes = [at.dscalar]  # outputs a single scalar value (the log likelihood)

    def __init__(self, chosen_func, x1, x2):

        # add inputs as class attributes
        self.chosen_func = chosen_func
        self.x1 = x1
        self.x2 = x2

    def perform(self, node, inputs, outputs):
        # the method that is used when calling the Op
        (theta,) = inputs  # this will contain my variables

        # call the wrapped function
        placeholder = self.chosen_func(theta, self.x1, self.x2)

        outputs[0][0] = np.array(placeholder)


# create our Op (X1 & X2 already defined)
test_func = MyNewClass(linear_func, X1, X2)


# model
basic_model = pm.Model()

with basic_model:

    # Priors for unknown model parameters
    alpha = pm.Uniform("alpha", lower=-10.0, upper=10.0)
    beta1 = pm.Uniform("beta1", lower=-10.0, upper=10.0)
    beta2 = pm.Uniform("beta2", lower=-10.0, upper=10.0)
    sigma = pm.HalfNormal("sigma", sigma=1)
    
    # convert alpha, beta1 and beta2 to a tensor vector
    theta2 = at.as_tensor_variable([alpha, beta1, beta2])

    # Expected value of outcome
    mu = test_func(theta2)

    # Likelihood (sampling distribution) of observations
    Y_obs = pm.Normal("Y_obs", mu=mu, sigma=sigma, observed=Y)
    
with basic_model:
    pm.sample()

I realize that I could simply embed the function inside the PyMC model, but I specifically want to build a class. Any advice is appreciated.

If you are new to PyMC you shouldn’t have to write your own Op. Most people manage just fine by composing pymc.math or aesara.tensor calls. You should find equivalents there for nearly all of the NumPy functions, and you can use them with PyMC variables.

This is the recommended way because it will generate much more efficient code, and PyMC can take gradients automatically. It’s also easier :wink:

You only need to implement an Op when the function you need is not available there.

@ricardoV94 - thank you for the reply! You make some great points. Along the lines of what you’re suggesting, what I ultimately want is to make an aesara.tensor function that I can call many times within my PyMC model. However, when I define the aesara.tensor function outside the model and then call it inside the model, I get the following error:

Expected an array-like object, but found a Variable: maybe you are trying to call a function on a (possibly shared) variable instead of a numeric array?

My research has led me to believe that I need to write an Op to handle the formatting as variables are passed between PyMC and Aesara. Is there an easier way to go about this?

Thanks for the help.

In your original example I see you tried to compile (by calling function) something that resembles your linear_func. In PyMC you don’t need to do that; you can simply write:

def linear_func(theta, x1, x2):
    alpha, beta1, beta2 = theta
    return alpha + beta1 * x1 + beta2 * x2

with pm.Model() as m:
    alpha = pm.Normal("alpha")
    beta1 = pm.Normal("beta1")
    beta2 = pm.Normal("beta2")
    expected = linear_func((alpha, beta1, beta2), x1, x2)

    noise = pm.HalfNormal("noise")
    llike = pm.Normal("llike", expected, noise, observed=data)

PyMC will then compile the right functions itself when it needs to. As long as every operation in linear_func uses Aesara operators (we overload most of the Python operators, so this also applies to +, * and so on) you should be fine.

Most users would not bother writing linear_func as a separate function; they simply do the operations inside the model. Such refactoring is usually only necessary when sharing code with other packages or for very complicated functions.