# Linear Regression With Positivity Constraint

Hi All,

I’m looking to build a multivariable linear model. Historically I have used the GLM module to do this. Unfortunately, I need to apply a positivity constraint to my feature coefficients in order to ensure the direction of each effect makes fundamental sense.

Ideally I’d do this by using a normal distribution for my intercept prior and a half-t distribution for the coefficient priors. This doesn’t seem to be possible with the GLM module, and I’m struggling to hand-code it.

Would someone be able to point me in the right direction?

Thanks

You can do it using GLM, something like:

```python
with pm.Model() as m:
    pm.glm.GLM.from_formula(
        'y ~ X1 + X2 + X3', data=pandas_tbl,
        priors={'Intercept': pm.Normal.dist(mu=0, sd=5),
                'X1': pm.HalfStudentT.dist(sd=10, nu=10),
                'X2': pm.HalfStudentT.dist(sd=10, nu=10),
                'X3': pm.HalfStudentT.dist(sd=10, nu=10)}
    )
```

If you need an interaction term (e.g. `X1*X2`), you need to multiply the columns and save the result in `pandas_tbl`.
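For instance, building the `X1*X2` interaction column by hand might look like this (the column name `X1_X2` is just an illustrative choice):

```python
import numpy as np
import pandas as pd

# Illustrative data frame matching the example above
pandas_tbl = pd.DataFrame(np.random.randn(100, 4),
                          columns=['y', 'X1', 'X2', 'X3'])

# Precompute the interaction column, then reference it in the
# formula as a plain term, e.g. 'y ~ X1 + X2 + X1_X2'
pandas_tbl['X1_X2'] = pandas_tbl['X1'] * pandas_tbl['X2']
```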


Unfortunately this has been throwing an error:

```
ValueError: Cannot compute test value: input 0 (__logp) of Op Elemwise{sub,no_inplace}(__logp, __logp) missing default value.
Backtrace when that variable is created:
```

I got the same issue when I tried to constrain the distributions for the priors directly, and had just assumed it wasn’t supported.

PyMC3 version 3.6, Theano 1.0.4, Python 3.6.5

This works for me on master:

```python
import numpy as np
import pandas as pd
import pymc3 as pm

pandas_tbl = pd.DataFrame(np.random.randn(100, 4),
                          columns=['y', 'X1', 'X2', 'X3'])
with pm.Model() as m:
    pm.glm.GLM.from_formula(
        'y ~ X1 + X2 + X3', data=pandas_tbl,
        priors={'Intercept': pm.Normal.dist(mu=0, sd=5),
                'X1': pm.HalfStudentT.dist(sd=10, nu=10),
                'X2': pm.HalfStudentT.dist(sd=10, nu=10),
                'X3': pm.HalfStudentT.dist(sd=10, nu=10)}
    )
    trace = pm.sample()
```

I can replicate your example and it runs successfully for me too. As such, I’m going to assume I have an issue elsewhere in my code and work backwards from this point. Thanks a lot for your input; it’s great to know the GLM is this flexible.