I’m playing around with transformations because I want to create a model where the values of a random variable are non-decreasing. The `ordered` transformation in pymc3 works well, but it seems to put very little weight on very small differences between sequential elements.

Here is a little code and a figure to show what I mean:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm


def ordered_bad(y):
    """This is the `backward` method of `Ordered` from pymc3."""
    x = np.zeros_like(y)
    x[..., 0] += y[..., 0]
    x[..., 1:] += np.exp(y[..., 1:])
    return np.cumsum(x, axis=-1)


def ordered_good(y):
    """I want something like this instead."""
    x = np.zeros_like(y)
    x[..., 0] += y[..., 0]
    x[..., 1:] += np.abs(y[..., 1:])
    return np.cumsum(x, axis=-1)


def diffs(x):
    """Differences between sequential levels."""
    return x[..., 1:] - x[..., :-1]


if __name__ == '__main__':
    y = norm.rvs(size=(int(1e6), 4))
    x1 = diffs(ordered_bad(y))
    x2 = diffs(ordered_good(y))
    # `normed` has been removed from matplotlib; use `density` instead.
    plt.hist(x2[..., 0], density=True, bins=200, label="includes 0", alpha=0.5)
    plt.hist(x1[..., 0], density=True, bins=2000, label="doesn't include 0", alpha=0.5)
    plt.legend()
    plt.xlim(0, 4)
    plt.savefig("tmp.png")
```
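The shapes of the two histograms can also be checked analytically. Under the `exp`-based transform a difference between sequential elements is `exp(z)` with `z` standard normal, i.e. a standard lognormal, whose density vanishes as the difference approaches 0; under the `abs`-based transform it is `|z|`, a half-normal, whose density is largest at 0. A quick sanity check with `scipy.stats` (values chosen just for illustration):

```python
from scipy.stats import halfnorm, lognorm

# Lognormal density near 0 is tiny: almost no mass on very small differences.
print(lognorm.pdf(0.01, s=1))

# Half-normal density near 0 is close to its maximum sqrt(2/pi) ~ 0.798.
print(halfnorm.pdf(0.01))
```

This matches the picture: the `exp`-based diffs "don't include 0", while the `abs`-based diffs do.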

The orange distribution is the one created by the `ordered` transform applied to a normal random variable. It makes sense that it resembles a log-normal distribution, because the transform includes `exp` in its calculation. However, I would really like something like the second transformation, where `exp` is replaced with `abs`. But will this cause problems for NUTS, given that the transformation can’t be reversed?
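To make the invertibility concern concrete: the `abs`-based map sends distinct unconstrained points to the same ordered vector, so there is no unique `forward` map (and no single log-Jacobian term) of the kind pymc3's transform interface expects. A minimal sketch, reusing the `ordered_good` logic from above:

```python
import numpy as np

def ordered_abs(y):
    # Same abs-based backward map as `ordered_good` above.
    x = np.zeros_like(y)
    x[..., 0] += y[..., 0]
    x[..., 1:] += np.abs(y[..., 1:])
    return np.cumsum(x, axis=-1)

# Two distinct unconstrained points...
y1 = np.array([0.0, 0.5, -1.0])
y2 = np.array([0.0, -0.5, 1.0])

# ...map to the same ordered vector, so the backward map is not injective.
print(ordered_abs(y1))
print(ordered_abs(y2))
```

Note the gradient issue as well: `abs` is not differentiable at 0, which is exactly the region of interest here, and NUTS relies on gradients of the transformed log-density.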