I ran into a question on StackOverflow where someone wanted to impose a strong multivariate Gaussian prior on the coefficients in a binomial regression model. I coded up the following PyMC3 model as a suggested solution for them:

```python
import numpy as np
import pymc3 as pm
import theano
import theano.tensor as tt
import matplotlib.pyplot as plt
import arviz as az

# Data
X = np.array([-0.86, -0.30, -0.05, 0.73])
N = np.array([5, 5, 5, 5])
Y = np.array([0, 1, 3, 5])

# Augment X with an intercept column for a simpler regression expression
X_aug = tt.stack(np.ones_like(X), X).T

# Prior parameters
mu = np.array([0, 10])
sd = np.array([22, 102])
Rho = np.array([[1, 0.5], [0.5, 1]])
Sigma = np.outer(sd, sd) * Rho

with pm.Model() as binomial_regression:
    # regression coefficients (strong prior)
    beta = pm.MvNormal('beta', mu=mu, cov=Sigma, shape=2)

    # death probability
    theta_i = pm.Deterministic('theta_i', pm.invlogit(X_aug.dot(beta)))

    # outcomes
    y_i = pm.Binomial('y_i', n=N, p=theta_i, observed=Y)

    trace = pm.sample(10000, tune=100000, target_accept=0.8, random_seed=2018)
```
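As an aside, the `Sigma` construction in the model is just the standard conversion from standard deviations plus a correlation matrix to a covariance matrix, i.e. `diag(sd) @ Rho @ diag(sd)`; a quick NumPy check (outside the model) confirms the two forms agree:

```python
import numpy as np

sd = np.array([22, 102])
Rho = np.array([[1, 0.5], [0.5, 1]])

# Elementwise form used in the model
Sigma_outer = np.outer(sd, sd) * Rho

# Equivalent matrix form: diag(sd) @ Rho @ diag(sd)
Sigma_diag = np.diag(sd) @ Rho @ np.diag(sd)

assert np.allclose(Sigma_outer, Sigma_diag)
```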

However, unless I used a very high number of tuning steps there were many divergences, and even then I still got one. I don't find the location of the divergence to be particularly insightful:

I do, however, find the posterior to be insightful about the inappropriateness of the prior, namely that the slope coefficient should really never be negative:

So, I thought a better solution that would still allow for the modeler's strong prior would be to use a truncated Gaussian that restricts the second coefficient to be nonnegative. However, I'm not sure how to do this, hence the question:

**How does one make a truncated/bounded multivariate normal in PyMC3?**
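For what it's worth, one direction I've been toying with (no idea yet whether it translates cleanly into PyMC3) is to factor the bivariate normal into a marginal for the intercept and a conditional for the slope, and then truncate only the conditional. The conditional of a bivariate Gaussian is itself Gaussian, with mean `mu2 + rho*(sd2/sd1)*(beta1 - mu1)` and standard deviation `sd2*sqrt(1 - rho**2)`. Note this "marginal times truncated conditional" distribution is not quite the same as truncating the joint (joint truncation would also reweight the intercept's marginal), so it may or may not be what's wanted. A plain NumPy rejection sketch of the idea (the helper name is mine):

```python
import numpy as np

rng = np.random.default_rng(2018)

# Prior parameters from the model above
mu = np.array([0.0, 10.0])
sd = np.array([22.0, 102.0])
rho = 0.5

def sample_truncated_mvn(n_samples):
    """Draw (beta1, beta2) with beta2 >= 0 via marginal x truncated conditional."""
    # Marginal: beta1 ~ Normal(mu1, sd1)
    beta1 = rng.normal(mu[0], sd[0], size=n_samples)

    # Conditional: beta2 | beta1 ~ Normal(cond_mu, cond_sd), truncated at 0
    cond_mu = mu[1] + rho * (sd[1] / sd[0]) * (beta1 - mu[0])
    cond_sd = sd[1] * np.sqrt(1 - rho**2)
    beta2 = rng.normal(cond_mu, cond_sd)

    # Rejection step: redraw the conditional wherever it came out negative
    neg = beta2 < 0
    while neg.any():
        beta2[neg] = rng.normal(cond_mu[neg], cond_sd)
        neg = beta2 < 0

    return np.column_stack([beta1, beta2])

samples = sample_truncated_mvn(10_000)
assert (samples[:, 1] >= 0).all()
```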

**Bonus Points:** If anyone has thoughts about the divergence issue, those are also welcome. I'm still pretty new to HMC/NUTS, so my experience with reparameterizing models is rather weak.
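Regarding the bonus question: the one reparameterization trick I keep seeing recommended for divergence trouble is the non-centered parameterization, i.e. instead of sampling `beta` from the MvNormal directly, sample standard normals `z` and set `beta = mu + L @ z`, where `L` is the Cholesky factor of `Sigma`. I haven't verified it helps on this model, but the transform itself is easy to check in NumPy:

```python
import numpy as np

rng = np.random.default_rng(2018)

mu = np.array([0.0, 10.0])
sd = np.array([22.0, 102.0])
Rho = np.array([[1, 0.5], [0.5, 1]])
Sigma = np.outer(sd, sd) * Rho

# Non-centered transform: beta = mu + L @ z, with z ~ N(0, I)
L = np.linalg.cholesky(Sigma)
z = rng.standard_normal((100_000, 2))
beta = mu + z @ L.T

# The transformed draws should recover the target mean and covariance
assert np.allclose(beta.mean(axis=0), mu, atol=1.5)
assert np.allclose(np.cov(beta.T), Sigma, rtol=0.05)
```

In PyMC3 terms the analogous model code would sample `z` from `pm.Normal` and wrap `mu + tt.dot(L, z)` in a `pm.Deterministic`, though again I haven't tried it here.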