Using custom black-box likelihood and prior functions

Hi,

I am looking to do a simple MCMC run to sample from a posterior distribution, using 1000 samples extracted from a previous analysis as starting points. My log-probability functions (likelihood and prior) use numpy and scipy as well as if statements, and since I also rely on scipy.optimize.root, simply rewriting them to work with aesara is not possible. I do of course realize that this will decrease performance, but I would like to implement it regardless. However, everything I have tried still throws errors such as “object of type ‘TensorVariable’ has no len()”, which indicates that my variables are still being interpreted as tensor variables. Here is a short version of my current code:

with pm.Model() as model:
    # Define parameters
    x = pm.Flat("x", shape=(run.n_dim,))

    # Add log-likelihood and log-prior as potentials
    pm.Potential("log_likelihood", run.log_like(x))
    pm.Potential("log_prior", prior.logpdf(x))  # your log prior function

    # Use Metropolis sampler
    step = pm.Metropolis()

    # Set up starting points from predefined 1000 samples
    start_points = [{"x": pt} for pt in samples]

    # Sample
    trace = pm.sample(draws=1000, tune=500, step=step, chains=1000, cores=your_cores_count,
                      start=start_points, progressbar=True, discard_tuned_samples=True)

I have also tried wrapping the functions with @as_op(itypes=[at.dvector], otypes=[at.dscalar]), but nothing seems to work.

Is what I’m looking for even possible with PyMC?

The example notebook walks through the steps to create your own Op and test the implementation: Using a “black box” likelihood function — PyMC example gallery