Scipy.optimize.nnls usage problem in pymc3

Hello everyone,

I am trying to implement a graphical model in pymc3, and I generate a matrix and a vector probabilistically. I need to find a third vector x with nonnegative elements that satisfies A*x = b, so I tried to use the scipy.optimize.nnls() function, but because of the pm.Model structure my A and x have zero shapes (no concrete values). The sample script below raises 'ValueError: expected matrix' because the 'mat' matrix is empty; for comparison, a plain-NumPy version that does work is shown after the script. Does anyone have a solution for this problem?

with Normal_model:
    mu = pm.Uniform('mu', lower=-10, upper=10, shape=5)
    sigma = pm.Uniform('sigma', lower=0, upper=10, shape=2)
    mat = pm.Normal('matrix', mu=mu[0], sd=sigma[0], shape=(5,2))
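    # the next line fails: inside pm.Model, 'mat' and 'mu' are symbolic Theano tensors, not NumPy arrays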
    t = pm.Deterministic('parametre', optimize.nnls(mat, mu))
    for i in range(0,2):
        data = pm.Normal('data_%i'%i, mu=mu[i], sd=sigma[i], observed=s[i, :])
    start = pm.find_MAP()
    step = pm.Metropolis()
    trace = pm.sample(10000, step, start=start, progressbar=True)
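
For comparison, scipy's nnls itself is fine when called on plain NumPy arrays (made-up data, just to illustrate what I am after):

import numpy as np
from scipy import optimize

A = np.random.randn(5, 2)        # stand-in for the probabilistically generated matrix
b = np.random.randn(5)           # stand-in for the probabilistically generated vector
x, rnorm = optimize.nnls(A, b)   # x minimizes ||A @ x - b|| subject to x >= 0
print(x)                         # every entry of x is nonnegative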

Did you try to wrap it as a theano op using as_op? There is a similar discussion here that you might find useful: Fitting a distribution with custom functions

Also, just a small tip: you can wrap your code as a Markdown code block: https://github.com/adam-p/markdown-here/wiki/Markdown-Cheatsheet#code

You can also try to use tt.nlinalg.MatrixPinv:

t = pm.Deterministic('parametre', tt.nlinalg.MatrixPinv()(mat).dot(mu))

I’m guessing this is less stable than the scipy function, but it might be good enough.

Hi aseyboldt,

Since I need a non-negative result, I can't just take the pseudoinverse; the nnls function does exactly what I need.
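
For example, with some made-up NumPy data the unconstrained least-squares solution can contain negative entries, while nnls keeps them nonnegative:

import numpy as np
from scipy import optimize

A = np.random.randn(5, 2)
b = np.random.randn(5)

x_pinv = np.linalg.pinv(A).dot(b)   # unconstrained least squares; entries can be negative
x_nnls, _ = optimize.nnls(A, b)     # nonnegative least squares; entries are >= 0
print(x_pinv)
print(x_nnls)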
Anyway, thanks for your consideration :slight_smile:

Ah, I didn't read the docstring of nnls properly and missed the x >= 0 part. If you can get by without gradient-based methods, this should work:

import theano.tensor as tt
import pymc3 as pm
from scipy import optimize
from theano.compile.ops import as_op

# wrap scipy's nnls as a Theano op so it can be applied to symbolic inputs
@as_op([tt.dmatrix, tt.dvector], [tt.dvector])
def nnls(mat, mu):
    # optimize.nnls returns (solution, residual norm); keep only the solution
    return optimize.nnls(mat, mu)[0]

with pm.Model():
    mu = pm.Uniform('mu', lower=-10, upper=10, shape=5)
    sigma = pm.Uniform('sigma', lower=0, upper=10, shape=2)
    mat = pm.Normal('matrix', mu=mu[0], sd=sigma[0], shape=(5,2))
    t = pm.Deterministic('parametre', nnls(mat, mu))
    #...
    trace = pm.sample(step=pm.Metropolis())
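
If that samples without problems, the draws of the nonnegative solution end up in the trace under the Deterministic's name (a small sketch, assuming the model above runs as written):

# each row is one posterior draw of the nonnegative solution vector
print(trace['parametre'].shape)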

If you need gradients, you'll have to do a bit more work. This might be of help; it contains a section about inequality constraints (I haven't read that part, though).


It works like a charm, thanks a lot! :+1:t4: