Model posteriors are stuck at 0. Need help debugging

I am trying to model football matches so that each team's score is drawn from a Poisson distribution whose rate depends on factors such as whether the team is playing at home, its attacking capability, and its defensive capability.
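In equation form, the idea is roughly the following (with h(i) and a(i) denoting the home and away team of match i, mirroring the code below):

$$\begin{aligned}
y^{\text{home}}_i &\sim \text{Poisson}(\theta^{\text{home}}_i), & \theta^{\text{home}}_i &= \text{home} + \text{att}_{h(i)} + \text{def}_{a(i)} - \text{offset},\\
y^{\text{away}}_i &\sim \text{Poisson}(\theta^{\text{away}}_i), & \theta^{\text{away}}_i &= \text{att}_{a(i)} + \text{def}_{h(i)} - \text{offset}.
\end{aligned}$$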

The model spec looks like this:

import pymc3 as pm
import theano.tensor as tt

with pm.Model() as football_model:
    # Hyperpriors for the team-level attack/defense effects
    muatt = pm.Normal('muatt', 0, 0.1)
    mudef = pm.Normal('mudef', 0, 0.1)
    sigatt = pm.Exponential('sigatt', 1.0)
    sigdef = pm.Exponential('sigdef', 1.0)
    home = pm.Normal('home', 0, 1.0)

    # One attack and one defense parameter per team
    attack = pm.Normal('att', muatt, sigatt, shape=(teams.size,))
    defense = pm.Normal('def', mudef, sigdef, shape=(teams.size,))

    # Overall offset used to centre the linear predictors
    offset = tt.mean(attack) + tt.mean(defense)

    # Index the team effects by the home/away team of each match
    home_team_theta = home + attack[df.home_team] + defense[df.away_team] - offset
    away_team_theta = attack[df.away_team] + defense[df.home_team] - offset

    score_home = pm.Poisson('score_home', home_team_theta, observed=df.home_goals)
    score_away = pm.Poisson('score_away', away_team_theta, observed=df.away_goals)

The posterior distributions of the parameters seem to be just zeroes for some reason.
I tried to follow the debugging tutorial, but printing the values of the variables didn’t really help me.
Need some pointers on how to proceed.

Have you seen the Rugby predictions example notebook? Your model looks similar, so I assume you have, but if not, it's well worth a look.

Just from looking at this code snippet, the thetas look like they could go negative. Have you perhaps missed a transformation, e.g. a log link?
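Something along these lines should keep the Poisson rate positive. This is just a sketch against your snippet, reusing the variables from your model block:

    # Inside your `with pm.Model() as football_model:` block, reusing
    # home/attack/defense/offset from above: exponentiate the linear
    # predictor so the Poisson rate is strictly positive (log link)
    home_team_theta = tt.exp(home + attack[df.home_team] + defense[df.away_team] - offset)
    away_team_theta = tt.exp(attack[df.away_team] + defense[df.home_team] - offset)

    score_home = pm.Poisson('score_home', home_team_theta, observed=df.home_goals)
    score_away = pm.Poisson('score_away', away_team_theta, observed=df.away_goals)

You can also check this by running football_model.check_test_point(): with your current model the score_home/score_away terms should show -inf or nan logp, because the rate is not strictly positive at the test point.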

I haven't, to be honest. I was trying to do a PyMC port of a similar Julia example here.

You are absolutely right. I had missed that a LogPoisson likelihood was being used in the tutorial.

Ah great, hth! This is the example notebook btw: A Hierarchical model for Rugby prediction — PyMC3 3.11.2 documentation