Maximum likelihood estimation of a Bayesian model

I’m sure this question is anathema to many of you.

Is there a way to obtain maximum likelihood estimates, or even just the likelihood given some parameter values, of a model in PyMC3?

For context, I have an idea about how to model some data which works as a Bayesian hierarchical model in PyMC3. However, I work in a field that is still very much dominated by p values, and my concern is that readers, and even some of my coauthors, will not accept this model if it is Bayesian, at least not initially. I would like the initial presentation of the model to be as accessible as possible for my audience, and unfortunately that means using MLE and p values.

I know how to code this model in PyMC3 but not outside of it (e.g., using just SciPy).

If you replace all the priors in your model with Flat priors and then run pm.find_MAP, you would get the MLE, provided the optimizer converges. If your model is not too difficult it should work, but make sure you compare with the result from sampling.
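
A minimal sketch of that approach on a made-up normal model (the data and variable names here are just for illustration):

```python
import numpy as np
import pymc3 as pm

# Toy data, purely for illustration.
y = np.random.normal(loc=2.0, scale=1.5, size=100)

with pm.Model() as model:
    # Flat (improper) priors, so the posterior is proportional
    # to the likelihood alone.
    mu = pm.Flat("mu")
    sigma = pm.HalfFlat("sigma")  # HalfFlat keeps sigma positive

    pm.Normal("y_obs", mu=mu, sd=sigma, observed=y)

    # With flat priors, the MAP estimate coincides with the MLE,
    # provided the optimizer converges.
    mle = pm.find_MAP()

print(mle)
```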

As for p values, that is more difficult; a permutation test might work, but again it depends on your model.
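
For example, if the quantity of interest were something as simple as a difference in group means, a generic permutation test could look like the sketch below (the data and statistic here are made up; the right statistic depends entirely on your model):

```python
import numpy as np

# Made-up data for two groups, purely for illustration.
a = np.random.normal(0.0, 1.0, size=50)
b = np.random.normal(0.5, 1.0, size=50)

observed = a.mean() - b.mean()
pooled = np.concatenate([a, b])

n_perm = 10000
hits = 0
for _ in range(n_perm):
    # Shuffle the pooled data and recompute the statistic under
    # the null hypothesis of no group difference.
    np.random.shuffle(pooled)
    stat = pooled[:a.size].mean() - pooled[a.size:].mean()
    if abs(stat) >= abs(observed):
        hits += 1

p_value = hits / n_perm  # two-sided permutation p value
print(p_value)
```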

Thanks! Unfortunately the model is quite complex and pm.find_MAP doesn’t work when I replace the priors with flat priors; however, it samples just fine. So can I just record the logp of all the samples and pick the parameter values of the sample with the maximum? Or is there some reason this won’t be the true MLE?

In general that is not considered the MLE, but the MAP.
A workaround is to take the sample with the maximum and then do a small local search around it.
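
One way to do that, assuming `model` is already defined (e.g. as in the sketch above): evaluate the log probability at every posterior sample, then hand the best one to pm.find_MAP as a starting point for the local search.

```python
import numpy as np
import pymc3 as pm

# Assumes `model` is already defined, e.g. as in the sketch above.
with model:
    trace = pm.sample(1000, tune=1000)

# model.logp is a compiled function that evaluates the joint log
# probability at a point dict; trace.points() iterates over samples.
points = list(trace.points())
logps = np.array([model.logp(p) for p in points])
best = points[int(np.argmax(logps))]

# Refine with a local optimization started from the best sample.
estimate = pm.find_MAP(start=best, model=model)
print(estimate)
```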

This is interesting stuff. When publishing, the ability to do MAP or MLE is actually very useful for comparing with previous approaches.

Could you just expand a bit on why that wouldn’t correspond to the MLE with flat priors?

What I meant is that it only counts as the MLE if there is no prior information (i.e., a flat prior), since then you are only evaluating the likelihood function and trying to find its maximum.
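
Spelled out: the posterior is proportional to likelihood times prior, so a constant prior drops out of the argmax:

$$
\hat{\theta}_{\text{MAP}} = \arg\max_{\theta} \, p(y \mid \theta)\, p(\theta) = \arg\max_{\theta} \, p(y \mid \theta) = \hat{\theta}_{\text{MLE}} \quad \text{when } p(\theta) \propto 1.
$$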

Do you have any idea if there is a way to extract the likelihood/logpt function from a model? I don’t mean the values, but rather the equations from Theano (presumably).

Yes, you can get logpt from the model; it is a Theano tensor. To evaluate the tensor you can compile it into a function, e.g.:
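
A minimal sketch (the toy model here is just for illustration; model.fn compiles a Theano graph into a callable that takes a point dict of parameter values):

```python
import pymc3 as pm

with pm.Model() as model:
    mu = pm.Flat("mu")
    pm.Normal("y_obs", mu=mu, sd=1.0, observed=[0.5, 1.2, -0.3])

# model.logpt is a Theano tensor for the joint log probability;
# model.fn compiles it into a function of a point dict.
logp_fn = model.fn(model.logpt)
print(logp_fn({"mu": 0.0}))

# model.logp is the same thing, pre-compiled as a convenience.
print(model.logp({"mu": 0.0}))
```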

Sorry, I’m having difficulty expressing myself clearly. I want to find the equation that generates the values for a given model. In simple models, such as a non-hierarchical linear regression, the likelihood function is obvious (just the univariate normal pdf). However, for my model, which is complex and hierarchical, I can build it in PyMC3 easily but don’t know how to write the likelihood symbolically. Presumably there is such a symbolic expression of the likelihood/posterior somewhere within Theano.

I don’t think the symbolic expression is available (at least not from Theano). If you know SymPy, that would be a good direction to try.
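
For what it’s worth, Theano can at least print its internal computation graph, though the output is a long way from a readable equation. A sketch, assuming `model` from earlier:

```python
import theano.printing

# Dumps Theano's expression graph for the joint log probability.
# Verbose and machine-oriented, not a clean symbolic formula.
print(theano.printing.pprint(model.logpt))

# debugprint gives an even more detailed, tree-structured view.
theano.printing.debugprint(model.logpt)
```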

Thanks! It doesn’t look like SymPy is powerful enough for my purposes yet, since its matrix support is limited. Fortunately, though, I was able to find the equations in an obscure journal article from a decade ago, so all is well!

Hmmm, looks like my model.logpt is routinely > 0 with flat priors. I’d assumed it must be << 0. Is this a bug or my mistaken thinking?

Edit: Nevermind, I was being an idiot.