Question about DensityDist, again

Hi,
I have a model which is a monoexponential decay as a function of some b-values b, for which I want to estimate the decay constant ADC. As a function of the b-values, the model is:
y_i = s_0\exp\{-b_i\times ADC\} + \epsilon_i
where \epsilon_i is i.i.d. Gaussian noise with standard deviation \sigma: \epsilon_i \sim \mathcal{N}(0,\sigma^2). I give more detail about this model here: How can I speed up the execution of my model?
Assuming a g-prior for \sigma and s_0, we can marginalize out those two parameters and obtain the marginal likelihood:

p(\mathbf{y}|ADC)\propto \left[ \mathbf{y}^t\mathbf{y} - \frac{\left(\mathbf{y}^t\mathbf{f}\right)^2}{\mathbf{f}^t\mathbf{f}}\right]^{-N/2}

where \mathbf{y} is the vector containing the observations y_i, N is the number of observations, and \mathbf{f}=\exp(-b\times ADC) is the vector of the model evaluated at the observed b-values, with size N.
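Taking the log, this is the quantity the code below implements, up to an additive constant:

\log p(\mathbf{y}|ADC) = -\frac{N}{2}\log\left[\mathbf{y}^t\mathbf{y} - \frac{\left(\mathbf{y}^t\mathbf{f}\right)^2}{\mathbf{f}^t\mathbf{f}}\right] + \mathrm{const}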

I tried to implement this in PyMC3 with DensityDist, but I get an error. Here is the code:

Imports

%matplotlib inline
import numpy as np
import pymc3 as pm
import matplotlib.pyplot as plt
import theano.tensor as tt
import theano
plt.style.use('seaborn-darkgrid')

Data generation

bvalues = np.array([0, 10, 20, 50, 100, 200, 500, 1000], dtype=float)  # vector of b-values
N = bvalues.size  # number of b-values

ADC = 2e-3
S0 = 1
Sb = S0 * np.exp(-bvalues * ADC)  # noise-free signal
sigma = 0.1

data = np.random.normal(loc=Sb, scale=sigma)  # add noise to the signal

PyMC3 model

with pm.Model() as marginal_likelihood_model:
    adc = pm.Lognormal('adc', np.log(4e-3), sigma=1)

    # log-likelihood function implemented with theano
    def my_loglike_tt(adc, data):
        f = tt.exp(-bvalues * adc)  # model evaluated at the b-values
        yty = tt.sum(data ** 2)     # y.T @ y
        ftf = tt.sum(f ** 2)        # f.T @ f
        fty = tt.sum(data * f)      # y.T @ f
        logp = -N / 2 * tt.log(yty - fty * fty / ftf)
        return logp

    # equivalent log-likelihood function implemented with numpy, for reference
    def my_loglike_np(adc, data):
        f = np.exp(-bvalues * adc)
        yty = (data ** 2).sum()  # y.T @ y
        fty = (data * f).sum()   # y.T @ f
        ftf = (f ** 2).sum()     # f.T @ f
        logp = -N / 2 * np.log(yty - fty * fty / ftf)
        return logp

    likelihood = pm.DensityDist('likelihood', my_loglike_tt, observed={'adc': adc, 'data': data})

with marginal_likelihood_model:
    marginal_likelihood_trace = pm.sample(2000, tune=1000)

pm.traceplot(marginal_likelihood_trace, var_names=['adc'])

When I execute this code, the model compiles correctly, but I get an error at the inference step.

Any idea on how to make it work?

Thanks!

Hi,
I found a way to make my code work: setting the number of cores to 1 (cores=1) seems to solve the problem. I don’t know why, but most of my models fail if I don’t set the number of cores to 1. Does anyone have an idea why I see this behaviour? I am on Windows and I run my code in PyCharm.
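For reference, this is the workaround applied to the model above (cores is a standard pm.sample argument):

with marginal_likelihood_model:
    # sample in a single process to avoid the failure seen with cores > 1
    marginal_likelihood_trace = pm.sample(2000, tune=1000, cores=1)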

Try moving the custom log_prob function outside of the pm.Model block: “Lambdas defined in local namespace are not picklable (see issue #1995)”.
ref:
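For example, here is a minimal sketch of the restructured model (assuming the same imports and data generation as above), with the log-likelihood defined at module level so it can be pickled:

# Defined at module level (outside the model block) so the multiprocessing
# backend used when cores > 1 can pickle it; bvalues and N are module globals.
def my_loglike_tt(adc, data):
    f = tt.exp(-bvalues * adc)
    yty = tt.sum(data ** 2)  # y.T @ y
    ftf = tt.sum(f ** 2)     # f.T @ f
    fty = tt.sum(data * f)   # y.T @ f
    return -N / 2 * tt.log(yty - fty * fty / ftf)

with pm.Model() as marginal_likelihood_model:
    adc = pm.Lognormal('adc', np.log(4e-3), sigma=1)
    likelihood = pm.DensityDist('likelihood', my_loglike_tt,
                                observed={'adc': adc, 'data': data})
    marginal_likelihood_trace = pm.sample(2000, tune=1000)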


Thanks, it works! Your help is much appreciated! 🙂


I have one more question, though:
How can I pass other parameters that are not random variables or observed data to my function?
Thanks!

You can take a look at this post first: How to set up a custom likelihood function for two variables
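In the meantime, one simple pattern (a sketch on my part, not necessarily what that post recommends) is to bind the fixed, non-random quantities as default arguments of the module-level log-likelihood, so they are passed neither as random variables nor as observed data:

# Sketch: bvalues and N are bound as default arguments at definition time;
# DensityDist only supplies 'adc' and 'data' through the observed dict.
def my_loglike_tt(adc, data, bvalues=bvalues, N=N):
    f = tt.exp(-bvalues * adc)
    yty = tt.sum(data ** 2)
    ftf = tt.sum(f ** 2)
    fty = tt.sum(data * f)
    return -N / 2 * tt.log(yty - fty * fty / ftf)

Because the defaults are bound when the function is defined, the function stays a plain module-level function and remains picklable, so sampling with multiple cores should still work.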