PyMC3 does not sample from my custom prior

Hi, I want to design a custom prior for my data, but it seems PyMC3 does not sample from it.
Here is a simple example:

import numpy as np
import pymc3 as pm

class MyPrior(pm.Continuous):
    def __init__(self, mu, sd, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.mu = mu
        self.sd = sd
        self.dist = pm.Normal.dist(mu=mu, sd=sd)

    def random(self, point=None, size=None):
        tmp = self.dist.random(point=point, size=size)
        print(tmp)
        return tmp

    def log_normal(self, mu, sd, val):
        const = -0.5 * (np.log(2 * np.pi) + 2 * np.log(sd))
        exp = -0.5 * (val - mu) ** 2 / sd ** 2
        return const + exp

    def logp(self, value):
        print('## logp')
        # return self.log_normal(self.mu, self.sd, value)
        return self.dist.logp(value)
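As a standalone sanity check (numpy/scipy only, independent of PyMC3), the hand-written log_normal above can be compared against scipy's normal logpdf:

```python
import numpy as np
from scipy.stats import norm

def log_normal(mu, sd, val):
    # Same formula as the log_normal method in the class above.
    const = -0.5 * (np.log(2 * np.pi) + 2 * np.log(sd))
    exp = -0.5 * (val - mu) ** 2 / sd ** 2
    return const + exp

vals = np.array([-1.0, 0.0, 2.5])
ours = log_normal(0.0, 10.0, vals)
ref = norm.logpdf(vals, loc=0.0, scale=10.0)
assert np.allclose(ours, ref)  # matches the reference implementation
```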

with pm.Model() as model:
#     mu = pm.Normal('mu', mu=0, sd=10)
    mu = MyPrior('mu', mu=0, sd=10, testval=0)
    sd = pm.HalfNormal('sd', sd=10)
    obs = pm.Normal('obs', mu=mu, sd=sd, observed=np.random.randn(100))
    trace = pm.sample(500, tune=1000, cores=2)

When I run this code, it only prints '## logp' three times at the beginning, and never prints tmp, which suggests the model only evaluates my logp on the test values and never samples from my prior.
Did I implement the prior correctly? Thanks for the help in advance.

I don't think you need to worry about the number of times logp() is called. For what it's worth, inference in PyMC3 does not repeatedly call the Python logp method: the logp is represented as a tensor, and it is that tensor that gets evaluated repeatedly. In other words, logp is only called during model setup, not during the actual inference.
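The point above can be illustrated with a toy sketch in plain Python (this is not PyMC3/Theano internals, just the build-once/evaluate-many pattern): the "builder" runs a single time, like your logp method, while the function it returns is what runs at every sampler step.

```python
import math

calls = {'build': 0}

def build_logp(mu, sd):
    # "Graph construction": runs once, like the Python logp method.
    calls['build'] += 1
    const = -0.5 * (math.log(2 * math.pi) + 2 * math.log(sd))
    def compiled(value):
        # The "compiled tensor": evaluated at every sampler step.
        return const - 0.5 * (value - mu) ** 2 / sd ** 2
    return compiled

logp = build_logp(0.0, 10.0)
results = [logp(v) for v in (0.0, 1.0, 2.0)]  # three evaluations...
assert calls['build'] == 1                    # ...but one build
```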

If you want to log the logp, you can add the following to your model:

with model:
    mu_logp = pm.Deterministic('mu_logp', mu.distribution.logp(mu))