How to weight observations in v4?

The code below works in pymc3. When I try it in version 4, I get a message telling me to use pm.logp(rv, x) instead of rv.logp(x). So I naively tried:

obs = pm.logp(weights * pm.Normal.dist(mu=mu, sigma=sd), observed)

but the sampler doesn’t seem to take this likelihood into account.

The code that works in pymc3:


import pymc3 as pm
import numpy as np

observed = np.random.normal(loc=2, scale=3, size=100000)
weights = np.random.uniform(size=100000)
# bump the highly weighted obs, drop the lower weighted ones
# we should see a posterior mu of >2
observed[weights > .5] += .1
observed[weights < .5] -= .1

with pm.Model() as m:
    mu = pm.Normal('mu', mu=0, sd=5)
    sd = pm.HalfNormal('sd', 6)
    obs = pm.Potential('obs', weights * pm.Normal.dist(mu=mu, sd=sd).logp(observed))

with m:
    trace = pm.sample(return_inferencedata=True)

posterior = trace.posterior.stack(samples=["chain", "draw"])
np.median(posterior['mu'])

I figured it out:

obs = pm.Potential('obs', (weights * pm.logp(pm.Normal.dist(mu=mu, sigma=sd), observed)).sum())
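
For completeness, here is the whole example ported to v4 (a sketch: besides the Potential line, the only changes I’m aware of are the package import and sd= becoming sigma=):

import pymc as pm
import numpy as np

observed = np.random.normal(loc=2, scale=3, size=100000)
weights = np.random.uniform(size=100000)
# bump the highly weighted obs, drop the lower weighted ones
observed[weights > .5] += .1
observed[weights < .5] -= .1

with pm.Model() as m:
    mu = pm.Normal('mu', mu=0, sigma=5)
    sd = pm.HalfNormal('sd', 6)
    # weight each observation's pointwise log-likelihood and add the total
    # to the model's joint log-probability via a Potential
    obs = pm.Potential('obs', (weights * pm.logp(pm.Normal.dist(mu=mu, sigma=sd), observed)).sum())

with m:
    trace = pm.sample(return_inferencedata=True)

posterior = trace.posterior.stack(samples=["chain", "draw"])
np.median(posterior['mu'])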

Are you saying the sum was the solution? That shouldn’t be needed, because the Potential term is summed automatically anyway.
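
For illustration, the elementwise version without .sum() should behave the same (a sketch reusing mu, sd, weights, and observed from the model above):

obs = pm.Potential('obs', weights * pm.logp(pm.Normal.dist(mu=mu, sigma=sd), observed))
# no explicit .sum() needed: the potential's elements are summed when the
# term is folded into the model's joint log-probability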

Edit: I see that in your first attempt you had the weights inside the logp call; yes, that would be different.