Cannot convert Type TensorType

Hi folks. I’m struggling to develop the following model, and would love any and all insight! Here’s the rundown.

  • every week, our salespeople sell a number of widgets
  • we assume that their weekly sales number is binomial: they have a large number of negotiations N going on at any one time, and each week there’s a probability p of closing each deal
  • salespeople use productivity tools, and we assume that these tools may impact either the number of deals they’re negotiating at once (N) or the rate at which these deals get closed (p)
  • each salesperson has their own N and p
  • the p are drawn from a beta distribution
  • the N are drawn from a Poisson distribution
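
In other words (just restating the assumptions above in notation), for salesperson j in week i:

closed[i, j] ~ Binomial(n = N[j], p = p[j])
p[j] ~ Beta(alpha, beta)
N[j] ~ Poisson(mu[j])

with N and p scaled up in the weeks where that salesperson is using a productivity tool.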

The setup is as follows.

  • closed, an array of n_weeks rows and n_salespeople columns, where closed[i, j] is the integer number of widgets sold by salesperson j in week i
  • productivity, an array of the same shape as closed; the value productivity[i, j] is 1 if salesperson j is using a productivity tool in week i, and 0 otherwise
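
For a sense of scale, a toy stand-in for these two arrays might look like the following (values made up; the real arrays have thousands of salespeople):

import numpy as np

# toy example only -- the real closed / productivity arrays come from our data
n_weeks, n_salespeople = 3, 4
closed = np.array([[2, 0, 5, 1],
                   [3, 1, 4, 0],
                   [2, 2, 6, 1]])          # widgets sold, per week and salesperson
productivity = np.array([[0, 1, 0, 0],
                         [0, 1, 1, 0],
                         [1, 1, 1, 0]])    # 1 = using a productivity tool that week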

Finally, our model:

import pymc3 as pm

n_weeks, n_salespeople = closed.shape
salespeople_shape = (1, n_salespeople)

with pm.Model() as model:

    # Estimate N * p per salesperson
    Np = closed.mean(0).reshape(salespeople_shape)

    # p is beta-distributed
    p_ = pm.Beta("p", alpha=1.5, beta=20.0, shape=salespeople_shape, testval=0.05)

    # N is based on Np and p
    N_ = pm.Poisson("N", mu=Np / p_, shape=salespeople_shape, testval=Np / 0.05)

    # Productivity tool effect
    delta_ = pm.HalfNormal("delta", sd=0.5, testval=0.03)
    gamma_ = pm.HalfNormal("gamma", sd=0.5, testval=0.03)
    p_bonus = 1 + delta_ * productivity
    N_bonus = 1 + gamma_ * productivity

    # Observed
    y = pm.Binomial("y", n=N_ * N_bonus, p=p_ * p_bonus, observed=closed)

This code gives me what looks like a broadcasting issue: TypeError: Cannot convert Type TensorType(int64, matrix) (of Variable N_shared__) into Type TensorType(int64, row). You can try to manually convert N_shared__ into a TensorType(int64, row).

I’m not able to figure out why this is occurring. I’m assuming broadcasting works as it does in numpy, and this looks like a broadcast error, but I can’t find where it happens. Any help much appreciated!

As a bonus question, I’d also love some advice on the best way to set priors on the alpha and beta parameters of the p above. I’ve set them to 1.5 and 20 as a vague estimate, but I’d love to put priors on these. What types of prior might be good here?
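
For concreteness, the kind of thing I had in mind is something like the following, inside the same model block — the Gamma hyperparameter values are just guesses of mine, not something I’m attached to:

    # hypothetical hyperpriors on the Beta parameters (shape/rate values are guesses)
    alpha_ = pm.Gamma("alpha", alpha=2.0, beta=1.0, testval=1.5)
    beta_ = pm.Gamma("beta", alpha=2.0, beta=0.1, testval=20.0)

    # p now pools information across salespeople through alpha_ and beta_
    p_ = pm.Beta("p", alpha=alpha_, beta=beta_, shape=salespeople_shape, testval=0.05)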

Thanks so much for any help.
Quentin

I’d say that error occurs here: N_ * N_bonus, where N_ is a matrix but N_bonus is a row.
Are the shapes of productivity and N_bonus what you expect? Consider adding a singleton dimension so it broadcasts correctly.

Thanks @bdyetton. N_ is of shape (1, n_salespeople), so it should have a singleton dimension that is broadcastable against the shape of productivity, which is (n_weeks, n_salespeople). I therefore don’t think this is the issue, although I’m having trouble confirming because N_bonus is not a straightforward theano variable but an Elemwise{add,no_inplace}.0.

Coolio. Maybe you can just print it to confirm?
import theano
theano.printing.Print('N_bonus')(N_bonus)

Thanks. Using that shows that things are correctly shaped and broadcastable. Below, the previews of the variables come from your Print suggestion, and the shape lines under them from print(N_.dshape) and the like.

Np shape (np.ndarray.shape) : (1, 2149)

p_ __str__ = [[ 0.05  0.05  0.05 ...,  0.05  0.05  0.05]]
(1, 2149)

N_ __str__ = [[40  4 31 ..., 16 18  8]]
(1, 2149)

N_bonus __str__ = [[ 1.    1.    1.   ...,  1.    1.    1.  ]
 [ 1.    1.    1.   ...,  1.    1.    1.  ]
 [ 1.    1.    1.   ...,  1.    1.    1.  ]
 ..., 
 [ 1.    1.    1.   ...,  1.03  1.    1.  ]
 [ 1.    1.    1.   ...,  1.03  1.    1.  ]
 [ 1.    1.    1.   ...,  1.03  1.    1.  ]]

On top of that, trying to use p_.shape.eval() or similar yields
MissingInputError: Input 0 of the graph (indices start from 0), used to compute Elemwise{neg,no_inplace}(p_logodds__), was not provided and not given a value. Use the Theano flag exception_verbosity='high', for more information on this error. I don’t know if this is expected here.
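
For what it’s worth, reading the test value directly does not hit that error, though I don’t know whether that’s the recommended way to check shapes:

print(p_.tag.test_value.shape)   # test value that PyMC3 attaches when the variable is created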

New thing: if I make N_ a Normal instead of a Poisson, the error goes away…
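
(Concretely, the variant that works is roughly the following — the sd value is just a placeholder I picked:)

    N_ = pm.Normal("N", mu=Np / p_, sd=10.0, shape=salespeople_shape, testval=Np / 0.05)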

Hmm, it seems odd that changing the dist of N_ makes things work.
Given that all tensors seem to be the right shape, and should be broadcastable, maybe this is a bug?
Some further debugging (roughly sketched below):

  • print the values/shape of N_ * N_bonus to confirm the error really occurs there
  • set N_ to a numpy array of the expected shape and confirm it works

Otherwise I’m at a loss. Sorry.
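
E.g. something along these lines, in two separate runs (completely untested — n_arg and N_fixed are just placeholder names):

    # 1) route the product through a Print op so the value feeding the Binomial is visible
    n_arg = theano.printing.Print("N_ * N_bonus")(N_ * N_bonus)
    y = pm.Binomial("y", n=n_arg, p=p_ * p_bonus, observed=closed)

    # 2) take the Poisson out of the picture: use a fixed numpy array as n instead of N_
    N_fixed = np.ceil(Np / 0.05).astype(int)   # plain numpy array, shape (1, n_salespeople)
    y = pm.Binomial("y", n=N_fixed * N_bonus, p=p_ * p_bonus, observed=closed)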