Question about Likelihood and Residuals

In Statistical Rethinking, Richard McElreath asks the question

Which line is the likelihood?

y_i \sim \text{Normal}(\mu, \sigma) \\
\mu \sim \text{Normal}(0, 10) \\
\sigma \sim \text{Uniform}(0, 10)

The answer is the first line, which he explains as follows:

The first line is the likelihood. The second line is very similar, but is instead the prior for the parameter μ. The third line is the prior for the parameter σ. Likelihoods and priors can look very similar, because a likelihood is effectively a prior for the residuals.

Does anyone know what he means by the last sentence, that a likelihood is effectively a prior for the residuals? The residuals of what?

You can ask him on twitter :wink: He has been advocating that the distinction between prior and likelihood is unnecessary (see e.g. his talk https://youtu.be/yakg94HyWdE?t=11m56s), and I agree. I think the aim here is more to make that point.

Thanks @junpenglao. The video is helpful.

Do you know what residuals means in this context? With a linear model I get that a residual measures the difference between the observed value and the fitted value, but with Bayesian models what is the definition of a residual?

It’s the same idea. Here the model’s prediction for every observation is just \mu, so the residual is y_i - \hat{y}_i = y_i - \mu \sim \text{Normal}(0, \sigma). Saying the data are Normal(\mu, \sigma) is the same as saying the residuals are Normal(0, \sigma), which is why the likelihood can be read as a distribution placed on the residuals.
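You can check this equivalence numerically: evaluating the Normal(\mu, \sigma) log-density at the data gives exactly the same value as evaluating the Normal(0, \sigma) log-density at the residuals. A minimal sketch with scipy (the specific values of mu, sigma, and the simulated data are arbitrary):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.5
y = rng.normal(mu, sigma, size=5)  # simulated observations

# Log-likelihood of the data under Normal(mu, sigma)...
ll_data = norm.logpdf(y, loc=mu, scale=sigma).sum()

# ...equals the log-density of the residuals under Normal(0, sigma),
# i.e. the "prior for the residuals" reading of the same line.
ll_resid = norm.logpdf(y - mu, loc=0.0, scale=sigma).sum()

print(np.isclose(ll_data, ll_resid))  # True
```

The two computations are term-by-term identical because shifting a Normal density by \mu and shifting the evaluation point by \mu are the same operation.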