Is there an efficient way to get the means of all my linear model's parameter distributions, i.e. an analogue to sklearn's .coef_?

To add some more context to @ckrapu's answer, take a look at Did something happen to plot_posterior?. TL;DR: updating to the latest pymc3 version will bring back aliases like pm.summary. They have not been present in the pymc3 documentation for a while though; it only links to the respective ArviZ docs: https://docs.pymc.io/api/stats.html.

To get the most out of ArviZ functions, you should convert your trace to InferenceData. The best way to do that is to get it directly from pm.sample(return_inferencedata=True), which will become the default in pymc3 >= 4.0.

Also, summary does compute the means of all the variables in your trace, but not only that: it also calculates rhat, ess and other quantities. With 20 parameters it won't really matter, but for a large number of parameters, if you only want the means, it is better to compute them with idata.posterior.mean(dim=("chain", "draw")), or to follow the guidance in the ArviZ summary documentation to have summary calculate only a subset of the columns.
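A quick sketch of both options, using arviz only (no sampling: az.from_dict builds an InferenceData from raw arrays, standing in for a real trace; the variable names and values are made up):

```python
import arviz as az
import numpy as np

# Fake posterior: 4 chains, 500 draws each
rng = np.random.default_rng(42)
idata = az.from_dict(posterior={
    "beta": rng.normal(loc=[1.0, -2.0], scale=0.1, size=(4, 500, 2)),
    "sigma": rng.normal(loc=0.5, scale=0.01, size=(4, 500)),
})

# Option 1: means only, cheap, no rhat/ess computation
means = idata.posterior.mean(dim=("chain", "draw"))
print(means["beta"].values)  # close to [1.0, -2.0]

# Option 2: restrict summary to the columns you want;
# extend=False means only the functions in stat_funcs are computed
print(az.summary(idata, stat_funcs={"mean": np.mean}, extend=False))
```

Option 1 returns an xarray Dataset keeping each variable's shape, Option 2 a flat DataFrame with one row per scalar parameter.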
