Likelihood with weighted observations


I am trying to create a likelihood statement where a small number of observations are weighted much more heavily than the rest (e.g., 203 observations, where the last 3 are weighted more). Can anyone advise on how to do this?

code snippet:

import numpy as np
import pymc3 as pm

with pm.Model():
    x = pm.Normal('x', mu=0, sd=10, shape=100)
    y = pm.Normal('y', mu=0, sd=10, shape=100)

    F = pm.Deterministic('F', func)  # func returns values with shape=3

    # all 203 observations share one likelihood, so they are weighted equally
    like = pm.Deterministic('like', pm.math.concatenate([x, y, F]))
    obs = pm.Normal('obs', mu=like, sd=1, observed=np.concatenate([x_obs, y_obs, F_obs]))

Many thanks

Not sure about your particular use case, but in general I would do it in one of two ways:

  1. Replicate the observations in F: if, say, they are weighted 10 times more than the other observations, just include each of them 10 times.
  2. Weight through the Gaussian noise: think of the observations with more weight as being measured by more precise equipment, hence with less noise:
with pm.Model():
    # larger sd -> less weight; F_obs gets a 10x smaller sd, i.e. more weight
    obs1 = pm.Normal('obs1', mu=xy, sd=10, observed=xy_obs)
    obs2 = pm.Normal('obs2', mu=F, sd=1, observed=F_obs)
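To see why option 1 works, note that replicating an observation k times adds its Gaussian log-likelihood term k times, which is exactly a weight of k on that term. A minimal NumPy-only sketch (the data values and weights here are made up for illustration):

```python
import numpy as np

def normal_logp(x, mu, sd):
    # log-density of Normal(mu, sd) evaluated at x
    return -0.5 * np.log(2 * np.pi * sd**2) - (x - mu)**2 / (2 * sd**2)

mu, sd = 0.0, 1.0
obs = np.array([0.5, -1.2, 2.0])   # hypothetical observations
weights = np.array([1, 1, 10])     # last observation weighted 10x

# option 1: replicate each observation according to its weight
replicated = np.repeat(obs, weights)
logp_replicated = normal_logp(replicated, mu, sd).sum()

# equivalent view: multiply each log-likelihood term by its weight
logp_weighted = (weights * normal_logp(obs, mu, sd)).sum()

print(np.isclose(logp_replicated, logp_weighted))  # True
```

So for integer weights you can simply `np.repeat` the relevant rows of the observed data before passing them to `observed=`.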