Suppose there are three independent normal variables, each with mean 0, whose standard deviations have independent logarithmic priors. Suppose we have one observation of each: 1.0, 1e6 and 1e-4.
import pymc as pm

def abs_log_minus(tensor):
    return -pm.math.log(pm.math.abs(tensor))

with pm.Model():
    std = pm.CustomDist("std", logp=abs_log_minus, shape=(3,))
    pm.Normal("y", 0, std, observed=[1.0, 1e6, 1e-4])
    pm.sample(random_seed=0)
The logarithmic priors are specified following the first example in the section "Arbitrary Distributions" of the notebook "Introductory Overview of PyMC".
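For reference, here is a standalone sketch of what the prior's log-density computes on a single value (my own restatement of `abs_log_minus` on a plain float, not part of the original script). Note that -log|x| is finite and symmetric for any nonzero x, so it places prior mass on negative values as well as positive ones.

```python
import math

def abs_log_minus(x):
    # The same density as in the model above, on a plain float:
    # an (improper) logarithmic prior p(x) proportional to 1/|x|.
    return -math.log(abs(x))

# The density is symmetric about zero.
print(abs_log_minus(2.0), abs_log_minus(-2.0))
```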
Running the code above raises
Traceback (most recent call last):
  File "/home/jdawson/Documents/PyMC-bug/./bug.py", line 12, in <module>
    pm.sample(random_seed=0)
  File "/home/jdawson/Documents/PyMC-bug/.venv/lib/python3.11/site-packages/pymc/sampling/mcmc.py", line 619, in sample
    model.check_start_vals(ip)
  File "/home/jdawson/Documents/PyMC-bug/.venv/lib/python3.11/site-packages/pymc/model.py", line 1779, in check_start_vals
    raise SamplingError(
pymc.exceptions.SamplingError: Initial evaluation of model at starting point failed!
Starting values:
{'std': array([ 0.17563482,  0.95619595, -0.09898284])}

Initial evaluation results:
{'std': 4.1, 'y': -inf}
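The two numbers in the evaluation results can be reproduced by hand (a minimal sketch; `normal_logp` below is a hypothetical stand-in for PyMC's Normal log-density, assuming PyMC returns -inf when the parameter check sigma > 0 fails, rather than the library's actual implementation):

```python
import math

# Starting values from the traceback above.
start = [0.17563482, 0.95619595, -0.09898284]

# Prior term: -log|x| is finite for every nonzero entry,
# so 'std' evaluates to a finite value.
prior_logp = sum(-math.log(abs(x)) for x in start)
print(round(prior_logp, 1))  # 4.1, matching the report above

def normal_logp(x, mu, sigma):
    # Hypothetical stand-in for a Normal log-density with a
    # positivity check on the scale parameter.
    if sigma <= 0:
        return -math.inf
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

# The third standard deviation starts negative, so the likelihood
# of the corresponding observation is -inf.
print(normal_logp(1e-4, 0.0, start[2]))  # -inf
```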
The error occurs with three variables, as above, but not with two, as in
import pymc as pm

def abs_log_minus(tensor):
    return -pm.math.log(pm.math.abs(tensor))

with pm.Model():
    std = pm.CustomDist("std", logp=abs_log_minus, shape=(2,))
    pm.Normal("y", 0, std, observed=[1.0, 1e6])
    pm.sample(random_seed=0)
Why does this bug occur and how can one work around it?