# Divergences in Temperature Inversion Model

Hello, I have a model that attempts to invert temperature data to infer material parameters. The parameters of interest are modelled as random variables and used to solve the time-dependent temperature-diffusion PDE at four (or more) time steps. I solve the PDE with the Backward Euler method and therefore use the Theano operations `set_subtensor()`, `nlinalg.matrix_inverse()` and `scan()`. There is a heat source H_n that is a function of five of the variables I wish to infer:

\begin{align} H(t_n) = \begin{cases} \frac{H_{ult}}{2} \big( 1- \exp(-B \, t_n^C) \big) & \text{if } t_n < t_2 \\ \frac{H_{ult}}{2} \big( 1- \exp(-B \, t_n^C) \big) + \frac{H_{ult}}{2} \frac{t_n-t_2}{t_n-t_2+D} & \text{otherwise} \end{cases} \end{align}
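For concreteness, here is the heat source as a plain NumPy function (illustrative only, outside the Theano graph; names are my own):

```python
import numpy as np

def heat_source(t, Hult, B, C, D, t2):
    """Piecewise heat source H(t_n): an exponential phase, plus a
    hyperbolic term once t exceeds t2."""
    base = 0.5 * Hult * (1.0 - np.exp(-B * t**C))
    late = 0.5 * Hult * (t - t2) / (t - t2 + D)
    return np.where(t < t2, base, base + late)
```

Note the two branches meet at t = t2 (the hyperbolic term vanishes there), so H itself is continuous, though its derivative is not.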

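The Backward Euler update I perform inside the scan step looks roughly like this in NumPy (a minimal sketch assuming a 1D heat equation with fixed-end boundaries; my real model builds the matrix with `set_subtensor` and inverts it with `nlinalg.matrix_inverse`, here I just use `np.linalg.solve`):

```python
import numpy as np

def backward_euler_step(T, H, dt, alpha, dx):
    """One implicit (Backward Euler) step of dT/dt = alpha * d2T/dx2 + H
    on a 1D grid, with the two end temperatures held fixed.
    Solves (I - dt*alpha*L) T_new = T + dt*H."""
    N = T.size
    L = np.zeros((N, N))
    idx = np.arange(1, N - 1)          # interior nodes only
    L[idx, idx - 1] = 1.0 / dx**2
    L[idx, idx] = -2.0 / dx**2
    L[idx, idx + 1] = 1.0 / dx**2
    A = np.eye(N) - dt * alpha * L
    rhs = T + dt * H
    rhs[0], rhs[-1] = T[0], T[-1]      # Dirichlet: pin the boundary values
    return np.linalg.solve(A, rhs)
```

Being implicit, this step is unconditionally stable, which is why I can take the fairly large time steps in `dtList`.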
Here is my PyMC3 model:

```python
with pm.Model() as TempModel:
    BoundedNormal = pm.Bound(pm.Normal, lower=1e-6, upper=1e6)
    B = BoundedNormal('B', mu=0.01172, sd=0.001172, testval=0.01172)
    C = BoundedNormal('C', mu=1.6, sd=0.16, testval=1.6)
    D = BoundedNormal('D', mu=6.2, sd=0.62, testval=6.2)
    t2 = BoundedNormal('t2', mu=3.5, sd=0.35, testval=3.5)
    Hult = BoundedNormal('Hult', mu=338e3, sd=338e2, testval=338e3)
    rho = BoundedNormal('rho', mu=2.4e3, sd=2.4e2, testval=2.4e3)
    Cp = BoundedNormal('Cp', mu=1e3, sd=1e2, testval=1e3)
    kappa = BoundedNormal('kappa', mu=6.48e3, sd=6.48e2, testval=6.48e3)

    ([Tn, Hn, dt, Ainv], _) = scan(
        fn=CalTemp1D,
        sequences=[shared(tSeq), shared(dtList)],
        outputs_info=[T0, H0, dt0, A0],
        non_sequences=[B, C, D, t2, Hult, cc, rho, Cp, kappa],
    )

    Tm = pm.Deterministic('Tm', Tn[:, cols])
    sigma = pm.HalfNormal('sigma', sd=1., shape=(Nt, 2))
    T_obs = pm.Normal('T_obs', mu=Tm, sd=sigma, observed=Td)
```


With the above model I get almost 4000 divergences, and trace plots with sticky regions, as seen in the following:

However, if I choose sigma to be a scalar random variable, there are no divergences. This does not produce the best results, though, since the error for each temperature data point is different. I attempted a reparameterisation, without much success:

```python
Tm = pm.Deterministic('Tm', Tn[:, cols])
sigma = pm.Normal('sigma', mu=0, sd=1., shape=(Nt, 2))
T_tilde = pm.Deterministic('T_tilde', Tm + sigma)
T_obs = pm.Normal('T_obs', mu=T_tilde, sd=1e-3, observed=Td)
```
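For reference, the non-centered identity I was aiming for is mu + sigma * eps ~ Normal(mu, sigma), with eps ~ Normal(0, 1) and sigma kept as its own (half-normal) scale parameter; if I understand it correctly, my attempt above adds a unit-scale draw directly instead of scaling one by sigma. A quick NumPy sanity check of the identity (values are illustrative, not my real temperatures):

```python
import numpy as np

# Non-centered parameterisation check: mu + sigma * eps with
# eps ~ Normal(0, 1) matches draws from Normal(mu, sigma).
rng = np.random.default_rng(42)
mu, sigma = 20.0, 1.5
eps = rng.normal(0.0, 1.0, 200_000)
centered = rng.normal(mu, sigma, 200_000)
noncentered = mu + sigma * eps
```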


Does anyone have ideas on how I might fix these divergences? Is it perhaps difficult for NUTS to explore the posterior for t_2, given the piecewise formula for H_n? Thanks in advance.