# NUTS and zero derivative error

When sampling with NUTS I keep getting this error: `ValueError: Mass matrix contains zeros on the diagonal.` Removing the jitter in the initialization does not eliminate the error. I tried what was recommended in "Limit or prevent unrealistic output of neural network", but it didn't work. What causes this error, and is there a way to fix it? Thank you so much!!


One common reason is that a prior is too uninformative, so the derivative becomes too small. What kind of model are you testing, and what is the full error message?
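To see how a very vague prior can make a derivative vanish numerically, here is a minimal sketch in plain Python (not PyMC internals, just an illustration): the gradient of a Normal log-density shrinks with `sigma**2`, and for an extreme scale the intermediate overflows and the gradient underflows to exactly zero, which is the symptom NUTS reports.

```python
def normal_logp_grad(x, mu, sigma):
    # d/dx log N(x | mu, sigma) = -(x - mu) / sigma^2
    # sigma * sigma instead of sigma ** 2 so overflow yields inf, not an exception
    return -(x - mu) / (sigma * sigma)

# With a reasonable prior scale the gradient is finite and nonzero.
g_ok = normal_logp_grad(1.0, 0.0, 1.0)      # -1.0

# With an absurdly vague prior, sigma^2 overflows to inf and the
# gradient underflows to exactly 0.0 -- a "zero derivative".
g_zero = normal_logp_grad(1.0, 0.0, 1e200)  # 0.0
```

This is why tightening the priors (or standardizing the data so sensible scales apply) usually makes the error go away.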

Thanks so much for writing back! The full error message is:
ValueError: Mass matrix contains zeros on the diagonal.

The derivative of RV `ind`.ravel()[0] is zero.
The derivative of RV `flu`.ravel()[0] is zero.
(on and on with all the indices where the derivative is zero).
The model is a stochastic block model. NUTS samples four variables, and two of them produce these zero-derivative errors.

Can the "derivative being too small" problem be solved by changing Python's numerical precision? The prior works on a subset of the data, but when I increase the amount of data used, it does not. (From this I thought the prior should not be very uninformative.)

It depends on your model, but in most cases this error is caused by a prior being too vague. You can try fixing the RV to a constant to narrow down the problem. You can also search the discourse for related discussions.

```python
import pymc as pm

# K (the kurtosis of nu) and dt are assumed to be defined earlier.
with pm.Model() as GVmodel:
    mu = pm.Normal('mu', mu=0, sigma=0.01)
    nu = pm.Normal('nu', mu=(K / 3 - 1) * dt, sigma=0.1)  # K: kurtosis of nu
```
You shouldn't assign a step method yourself unless you have a special use case; try just `trace = pm.sample()`.