Marginalization of nuisance parameters

Hi

I’m starting to use PyMC3 and have a question regarding marginalization of nuisance parameters. In my model, I try to estimate Michaelis-Menten kinetic parameters from experimental data. In the example notebook below I create artificial data with a known distribution and error. In a real experiment I would not necessarily know the true error, but I would still want to account for it in the analysis. As I understand it, the correct way to deal with this is to introduce a nuisance parameter for the error and assign it a prior distribution. The PyMC3 model would then be:

    import pymc3 as pm

    # S (substrate concentrations) and rate (observed rates) are the
    # artificial experimental data created earlier in the notebook
    basic_model = pm.Model()

    with basic_model:
        # parameters we want to estimate
        vmax = pm.Uniform('Vmax', 0, 1)
        km = pm.Uniform('k_M', 0, 1)
        # our observable: the Michaelis-Menten rate law
        f = vmax * S / (km + S)
        # noise (nuisance parameter)
        est_sigma = pm.Uniform('sigma', 0, .1)
        # our likelihood where we add our observed data
        likelihood = pm.Normal('L', mu=f, sd=est_sigma, observed=rate)

Here I also fit the error est_sigma, but I’m not interested in its value, only in vmax and km. To get a posterior that no longer depends on est_sigma, I would have to integrate the resulting posterior over all possible values of est_sigma. Is there a way to do this in PyMC3?
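Written out, that integral is

    p(Vmax, k_M | data) = ∫ p(Vmax, k_M, sigma | data) dsigma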

Here is the full example as a gist.

When you look at the samples of one parameter in the trace in isolation, they are already marginalized over all the other latent parameters. That means plt.hist(trace['Vmax']) already shows the marginal posterior distribution of Vmax.
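For example, a minimal sketch continuing from the basic_model above (the number of draws is arbitrary):

    import matplotlib.pyplot as plt

    with basic_model:
        # draw posterior samples for all parameters jointly
        trace = pm.sample(2000)

    # the samples of 'Vmax' alone are already its marginal posterior,
    # i.e. integrated over k_M and sigma
    plt.hist(trace['Vmax'], bins=50)
    plt.xlabel('Vmax')
    plt.show()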


Oh, that is good to know. I wasn’t aware the trace does that. Is this documented somewhere? I didn’t find it in the API reference.

@junpenglao: Looks like we need another docs PR!