I’m trying to fit between 40 and 64 parameters in a physical model, using the DEMetropolisZ sampler with a black-box likelihood (negative root-mean-square error). Most of the parameters have strong log-normal priors. DEMetropolisZ runs without error, but when I examine the posteriors it appears that the initial value is propagated throughout the chain (i.e., 0% of the samples are accepted). If I fix all but 8 of the free parameters, I get good mixing (i.e., >0% of the samples are accepted).
In reading related topics on this forum, there are references to “high-dimensional” problems causing trouble for samplers. I don’t know whether 40 free parameters qualifies as “high-dimensional”, but is this a known issue? What should I try instead?
The model looks something like the following:
import pymc as pm
import pytensor.tensor as pt
with pm.Model() as model:
    alpha = external_parameters_list[0]  # a scalar; example of a fixed parameter
    beta = pm.LogNormal('beta', **external_prior_dict['beta'])
    ...  # 39 more free parameters
    params_list = [alpha, beta]  # plus the remaining fixed and free parameters
    params_tensor = pt.as_tensor_variable(params_list)
    pm.Potential('likelihood', black_box_likelihood(params_tensor))
The DEMetropolisZ sampler is initialized with its defaults (3 chains, tuning the “scaling” hyperparameter).