Hi all,
I’m new to PyMC and have found the docs and forums enormously helpful. However, I haven’t been able to figure out how to build a model around a custom function (I believe this is called an Op). Every time I run my code I get the error
Kernel Restarting
My attempt to do this has been based off of two tutorials:
- Introductory Overview of PyMC - I’m just using the multiple linear regression example from this tutorial to test whether I’m building the class correctly (I’m not)
- Using a Black Box Likelihood Function - this is helping me understand how to write a class that lets me pass Aesara tensors
The code is:
import pymc as pm
import aesara.tensor as at
from aesara import function
import numpy as np
# ------ sample data pulled from regression example
%config InlineBackend.figure_format = 'retina'
# Initialize random number generator
RANDOM_SEED = 8927
rng = np.random.default_rng(RANDOM_SEED)
# True parameter values
alpha, sigma = 1, 1
beta = [1, 2.5]
# Size of dataset
size = 100
# Predictor variables
X1 = np.random.randn(size)
X2 = np.random.randn(size) * 0.2
# Simulate outcome variable
Y = alpha + beta[0] * X1 + beta[1] * X2 + rng.normal(size=size) * sigma
# ----- code largely pulled from the Class example
# create function to call
def linear_func(theta, x1, x2):
    alpha, beta1, beta2 = theta
    return alpha + beta1 * x1 + beta2 * x2
# the same linear function built as a compiled Aesara graph
reg_x1 = at.dcol('reg_x1')
reg_x2 = at.dcol('reg_x2')
reg_a = at.dscalar('reg_a')
reg_b = at.dscalar('reg_b')
reg_alpha = at.dscalar('reg_alpha')
reg_out = reg_alpha + reg_x1 * reg_a + reg_x2 * reg_b
reg_func = function([reg_x1, reg_x2, reg_a, reg_b, reg_alpha], reg_out)
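# e.g. a quick check that the compiled function runs on the simulated data
# (the inputs are reshaped to columns because reg_x1/reg_x2 are declared as dcol)
reg_check = reg_func(X1.reshape(-1, 1), X2.reshape(-1, 1), beta[0], beta[1], alpha)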
# define an Aesara Op that wraps the plain-Python function
class MyNewClass(at.Op):
    itypes = [at.dvector]  # expects a vector of parameter values when called
    otypes = [at.dscalar]  # outputs a single scalar value (copied from the log-likelihood example)

    def __init__(self, chosen_func_a, x1, x2):
        # store the wrapped function and the fixed predictor data as attributes
        self.chosen_func_b = chosen_func_a
        self.x1 = x1
        self.x2 = x2

    def perform(self, node, inputs, outputs):
        # perform() is what Aesara calls at run time; `inputs` is a list of NumPy arrays
        (theta,) = inputs  # this will contain my parameter values
        # call the wrapped function and write its result into the output storage
        placeholder = self.chosen_func_b(theta, self.x1, self.x2)
        outputs[0][0] = np.array(placeholder)
# create our Op (X1 & X2 already defined)
test_func = MyNewClass(linear_func, X1, X2)
# model
basic_model = pm.Model()

with basic_model:
    # Priors for unknown model parameters
    alpha = pm.Uniform("alpha", lower=-10.0, upper=10.0)
    beta1 = pm.Uniform("beta1", lower=-10.0, upper=10.0)
    beta2 = pm.Uniform("beta2", lower=-10.0, upper=10.0)
    sigma = pm.HalfNormal("sigma", sigma=1)

    # pack alpha, beta1 and beta2 into a single tensor vector for the Op
    theta2 = at.as_tensor_variable([alpha, beta1, beta2])

    # Expected value of outcome, computed by the custom Op
    mu = test_func(theta2)

    # Likelihood (sampling distribution) of observations
    Y_obs = pm.Normal("Y_obs", mu=mu, sigma=sigma, observed=Y)
with basic_model:
    idata = pm.sample()
I realize that I could simply embed the function inside PyMC directly (something like the snippet below), but I specifically want to build it as a class. Any advice is appreciated.
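For reference, this is roughly what I mean by embedding the function directly, following the introductory tutorial (no custom Op; the priors and the model name direct_model are just taken/adapted from that example):

with pm.Model() as direct_model:
    # Priors for unknown model parameters
    alpha = pm.Normal("alpha", mu=0, sigma=10)
    beta = pm.Normal("beta", mu=0, sigma=10, shape=2)
    sigma = pm.HalfNormal("sigma", sigma=1)

    # Expected value of outcome, written directly in terms of the data
    mu = alpha + beta[0] * X1 + beta[1] * X2

    # Likelihood (sampling distribution) of observations
    Y_obs = pm.Normal("Y_obs", mu=mu, sigma=sigma, observed=Y)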