# Feeding NUTS initialization of model parameters

I have a very computationally expensive likelihood function. Using SciPy's maximum-likelihood optimization, I'm able to get a good estimate of my parameters. However, I'd like to get the distribution of those parameters and better statistics using NUTS. Is there a way to give NUTS my maximum-likelihood estimates as a starting point? I think this would speed up NUTS significantly for my purposes, given my likelihood function's high computational cost. I'd rather not use MH-MCMC if possible. If NUTS doesn't work, what other MCMC algorithms available through PyMC3 would let me use my maximum-likelihood optimization outputs?

Thanks so much,
Zach

Yes, you can specify the `start` in `pm.sample`. `pm.find_MAP` could be used, but in general we strongly discourage using the MAP as the starting point. The reason is that the gradient around the MAP is usually very small (or even zero), which makes tuning difficult and increases the possibility of numerical error. I recommend trying the out-of-the-box defaults first (i.e., just running `pm.sample(1000, tune=1000)`).
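To make the `start` mechanism concrete, here is a minimal sketch of feeding a SciPy maximum-likelihood fit into `pm.sample`. The toy normal model, the data, and the parameter names (`mu`, `sigma_log__`) are assumptions for illustration, not the poster's actual model; the PyMC3 part is shown in comments since it depends on how your model is declared.

```python
import numpy as np
from scipy import optimize

# Hypothetical toy data; replace with your own observations.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)

# Negative log-likelihood of a normal model, parameterized as
# (mu, log_sigma) so the optimizer works on an unconstrained scale.
def neg_log_like(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    return 0.5 * np.sum(((data - mu) / sigma) ** 2) + data.size * log_sigma

# Maximum-likelihood estimate via SciPy.
mle = optimize.minimize(neg_log_like, x0=[0.0, 0.0])
mu_hat, log_sigma_hat = mle.x

# PyMC3's `start` is a dict mapping free-variable names to values.
# The names must match the variables declared in the model, e.g.:
#
#   with pm.Model() as model:
#       mu = pm.Normal("mu", 0, 10)
#       sigma = pm.HalfNormal("sigma", 5)
#       pm.Normal("obs", mu, sigma, observed=data)
#       trace = pm.sample(1000, tune=1000,
#                         start={"mu": mu_hat,
#                                "sigma_log__": log_sigma_hat})
#
# Note that bounded variables are sampled on a transformed scale
# (here the auto-named `sigma_log__`), so the start value must be
# supplied on that transformed scale too.

print(mu_hat, np.exp(log_sigma_hat))
```

One design note: optimizing `log_sigma` rather than `sigma` keeps the SciPy fit unconstrained and matches the transformed space PyMC3 samples in, so the MLE can be passed through without conversion.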

Trying it out of the box on an HPC led to, at best, 6 hours of running for a single chain and, at worst, 384 hours, again for a single chain, depending on the synthetic data used. My likelihood function requires arbitrary precision and uses the higher-order chain rule à la Faà di Bruno, which entails many operations even with lookup tables, because very high-order derivatives are needed just for the likelihood calculation. Because of that, every evaluation is precious and time-consuming. Giving NUTS even a slightly perturbed ML-estimated start position should hopefully point it in a better direction. I anticipate the time could be much longer with real data. Thank you so much for your remarks! I appreciate it. My other option is instead to code up the likelihood function in C++ and have NUTS call that function.
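The "slightly perturbed ML-estimated start" idea above can be sketched as follows: build one jittered start dict per chain, so chains don't all begin at the exact same (possibly low-gradient) point. The estimate values and variable names here are hypothetical placeholders; PyMC3's `pm.sample` accepts a list of start dicts, one per chain.

```python
import numpy as np

# Hypothetical ML estimates from a prior SciPy fit; replace with yours.
ml_estimate = {"mu": 2.0, "sigma_log__": 0.4}

def jittered_starts(estimate, n_chains=4, scale=0.01, seed=123):
    """Return one start dict per chain, each a small Gaussian
    perturbation of the ML estimate."""
    rng = np.random.default_rng(seed)
    return [
        {name: value + scale * rng.standard_normal()
         for name, value in estimate.items()}
        for _ in range(n_chains)
    ]

starts = jittered_starts(ml_estimate)

# Sketch of the PyMC3 call (inside the model context):
#   trace = pm.sample(1000, tune=1000, chains=4, start=starts)
```

The jitter scale should be small relative to the expected posterior width; too large and a chain may start in a region where the expensive likelihood is even harder to evaluate.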
