Dealing with negative values and the exponential distribution

Thanks a lot for your suggestion! (And thanks to whoever formatted my post :))

I’ve experimented with it, and I’m wondering whether it’s a problem that the resulting distribution (after taking the log) isn’t really Gaussian?
[figure: histogram of the log-transformed data]
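For reference, this is roughly how I’m looking at the log-transformed data (a minimal sketch; I’m assuming d0 is a pandas DataFrame whose tdeltas column holds positive durations):

import numpy as np
import matplotlib.pyplot as plt

# Log-transform the raw durations (small offset guards against zeros)
log_tdeltas = np.log(d0.tdeltas + 1e-8)

# Histogram of the transformed values, to eyeball how Gaussian they look
plt.hist(log_tdeltas, bins=50, density=True)
plt.xlabel("log(tdeltas)")
plt.show()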

I’ve been playing around with a simplified version to understand log-transformations:

import numpy as np
import pymc3 as pm

with pm.Model() as model0:
    # Priors for the mean and spread of the log-transformed data
    a = pm.Normal("a", mu=6, sd=1)
    b = pm.Exponential("b", 1/2)
    # Likelihood on log(tdeltas); small offset guards against zeros
    sec = pm.Normal("sec", mu=a, sd=b, observed=np.log(d0.tdeltas + 1e-8))

    trace0 = pm.sample(1000, tune=2000)

The empirical mean is close to 600, but I can’t recover it accurately:

import arviz as az

with model0:
    # Posterior predictive samples on the log scale
    ppc = pm.sample_posterior_predictive(trace0)

az.plot_ppc(az.from_pymc3(posterior_predictive=ppc, model=model0));
# Back-transform to the original scale before averaging
print("mean=", np.exp(ppc["sec"]).mean())
mean= 734.969762489091

[figure: posterior predictive check plot]

(I’m not sure why my posterior predictive mean is larger than the empirical mean; the PPC plot seems to suggest the opposite.)
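In case it’s relevant, this is the comparison I’m making on the original scale (a rough sketch; d0.tdeltas is the raw data and ppc comes from the sampling above):

# Empirical mean of the raw data
print("empirical mean =", d0.tdeltas.mean())

# Mean of the back-transformed posterior predictive samples
print("mean of exp(samples) =", np.exp(ppc["sec"]).mean())

# Back-transforming the mean of the samples instead gives a smaller value,
# since exp() is convex (Jensen's inequality)
print("exp(mean of samples) =", np.exp(ppc["sec"].mean()))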

Anyway, I just wanted to check whether I’m misinterpreting something or whether this is an unavoidable downside of transforming my data?