Updating parameters in a Bayesian approach

Hello everyone, I am a new user.

I have a question related to this topic: How to wrap a JAX function for use in PyMC — PyMC example gallery

In that example, the likelihood is used to estimate the parameters of a Hidden Markov Model (HMM). But I do not understand the purpose of the gradient function within the HMM class. Is the gradient used to update each parameter in each sample or iteration?

Thank you in advance!

Gradients are used by the NUTS algorithm, which is a way to choose a sequence of parameter values so that the distribution of the sequence matches the posterior distribution of the parameters given the data and the priors. Note that NUTS does not use the gradient to update parameters toward an optimum the way gradient descent does; it uses the gradient of the log-posterior to steer each proposed trajectory, so the samples explore the posterior rather than converging to a single point.
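To make this concrete, here is a minimal sketch of the kind of gradient the Op's grad method has to expose. It uses a toy Normal log-likelihood with an unknown mean `mu` as a stand-in for the HMM likelihood in the example; the names `loglike` and `loglike_grad` are just illustrative:

```python
import jax
import jax.numpy as jnp

# Toy log-likelihood (up to a constant): Normal observations with
# unknown mean mu. This stands in for the HMM likelihood in the example.
def loglike(mu, data):
    return jnp.sum(-0.5 * (data - mu) ** 2)

# jax.grad returns a function computing d(loglike)/d(mu). This gradient
# is what the wrapped Op hands back to PyMC so NUTS can follow the
# slope of the log-posterior.
loglike_grad = jax.grad(loglike, argnums=0)

data = jnp.array([1.0, 2.0, 3.0])
print(loglike(0.5, data))       # log-likelihood at mu = 0.5
print(loglike_grad(0.5, data))  # gradient at mu = 0.5, here sum(data - mu)
```

So the gradient is evaluated at every step NUTS takes, but only to guide the sampler's moves, not to optimize the parameters.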

There’s an intuitive discussion of how it works in this presentation. The whole video is good if you want a deeper understanding.