Discrepancy Between Hessian Estimation of Std. Dev. and Sampled Estimation

This fixed the immediate problem, but some weirdness remains. Adding the `default_transform` did produce an appropriate std estimate from the Hessian method, but then the sampler started having divergence issues. Removing the `default_transform` and re-running fixed both the divergences and the Hessian implementation. I have no idea how, but toggling `default_transform` seems to have affected some internal state: I now get the correct result for all three methods with the original code.
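For context, a minimal sketch of what the Hessian-based std estimate means, independent of any particular library: at the mode of the (negative log) posterior, the inverse of the Hessian approximates the posterior covariance, so `1/sqrt(H)` approximates the std for a scalar parameter. This toy Gaussian example (flat prior, known `sigma`; not my actual model) checks the numerical Hessian estimate against the analytic posterior std `sigma/sqrt(n)`:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
data = rng.normal(5.0, sigma, size=200)
n = data.size

def neg_log_post(mu):
    # Negative log posterior of mu under a flat prior (constants dropped)
    return 0.5 * np.sum((data - mu) ** 2) / sigma**2

mu_hat = data.mean()  # the mode (MAP) under a flat prior

# Numerical second derivative of the negative log posterior at the mode
eps = 1e-4
hess = (neg_log_post(mu_hat + eps)
        - 2.0 * neg_log_post(mu_hat)
        + neg_log_post(mu_hat - eps)) / eps**2

hessian_std = 1.0 / np.sqrt(hess)       # Laplace/Hessian estimate
analytic_std = sigma / np.sqrt(n)       # exact posterior std here
print(hessian_std, analytic_std)
```

The two values agree closely here because the log posterior is exactly quadratic; for non-Gaussian posteriors the Hessian estimate is only a local approximation, and it is computed in whatever parameter space (transformed or not) the mode was found in, which is one place a transform toggle can change the answer.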

For now, I'll mark this as the solution.