Ok, thanks for the feedback. But I got this error:
ValueError Traceback (most recent call last)
Cell In[106], line 2
1 with model:
----> 2 trace = pm.sample(1000, tune=1500, cores = 4)
...
456 )
458 # inplace_pattern maps output idx -> input idx
459 inplace_pattern = self.inplace_pattern
ValueError: Incompatible Elemwise input shapes [(1, 380), (1, 5)]
It could be the output's shape of the bivpoiss_logp function, but I can't do .eval().shape inside the logp function to check the shape, because this error pops up:
MissingInputError: Input 0 (def_star) of the graph (indices start from 0), used to compute Sum{axes=None}(def_star), was not provided and not given a value. Use the PyTensor flag exception_verbosity='high', for more information on this error.
Can you give me any advice, please?
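For context, the closest workaround I found for inspecting shapes inside the logp (since .eval() needs values for all free inputs like def_star) is PyTensor's Print op. A minimal sketch of what I mean, where value stands for whatever intermediate tensor I want to check:

from pytensor.printing import Print

# Wrap the tensor so its shape is printed at run time, instead of calling
# .eval() on a graph whose inputs (e.g. def_star) have no values yet.
value = Print("bivpoiss_logp value", attrs=("shape",))(value)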
Additionally, I ran model.debug(verbose=True) and got this:
point={'home': array(0.), 'mu_att': array(0.), 'mu_def': array(0.), 'tau_att_log__': array(0.), 'tau_def_log__': array(0.), 'beta_con': array([0., 0., ..., 0.]), 'att_star': array([0., 0., ..., 0.]), 'def_star': array([0., 0., ..., 0.])}
(beta_con is 380 zeros; att_star and def_star are 20 zeros each)
The variable goals_per_match has the following parameters:
0: Exp [id A] <Matrix(float64, shape=(1, ?))>
 └─ Add [id B] <Matrix(float64, shape=(1, ?))>
    ├─ ExpandDims{axes=[0, 1]} [id C] <Matrix(float64, shape=(1, 1))>
    │  └─ home [id D] <Scalar(float64, shape=())>
    ├─ ExpandDims{axis=0} [id E] <Matrix(float64, shape=(1, ?))>
    │  └─ AdvancedSubtensor1 [id F] <Vector(float64, shape=(380,))>
    │     ├─ Sub [id G] <Vector(float64, shape=(?,))> 'att'
    │     │  ├─ att_star [id H] <Vector(float64, shape=(?,))>
    │     │  └─ True_div [id I] <Vector(float64, shape=(1,))>
    │     │     ├─ ExpandDims{axis=0} [id J] <Vector(float64, shape=(1,))>
    │     │     │  └─ Sum{axes=None} [id K] <Scalar(float64, shape=())>
    │     │     │     └─ att_star [id H] <Vector(float64, shape=(?,))>
    │     │     └─ Cast{float64} [id L] <Vector(float64, shape=(1,))>
    │     │        └─ MakeVector{dtype='int64'} [id M] <Vector(int64, shape=(1,))>
    │     │           └─ Shape_i{0} [id N] <Scalar(int64, shape=())>
    │     │              └─ att_star [id H] <Vector(float64, shape=(?,))>
    │     └─ [ 7 2 8 ... 5 13 11] [id O] <Vector(uint8, shape=(380,))>
    └─ ExpandDims{axis=0} [id P] <Matrix(float64, shape=(1, ?))>
       └─ AdvancedSubtensor1 [id Q] <Vector(float64, shape=(380,))>
          ├─ Sub [id R] <Vector(float64, shape=(?,))> 'def'
          │  ├─ def_star [id S] <Vector(float64, shape=(?,))>
          │  └─ True_div [id T] <Vector(float64, shape=(1,))>
          │     ├─ ExpandDims{axis=0} [id U] <Vector(float64, shape=(1,))>
          │     │  └─ Sum{axes=None} [id V] <Scalar(float64, shape=())>
          │     │     └─ def_star [id S] <Vector(float64, shape=(?,))>
          │     └─ Cast{float64} [id W] <Vector(float64, shape=(1,))>
          │        └─ MakeVector{dtype='int64'} [id X] <Vector(int64, shape=(1,))>
          │           └─ Shape_i{0} [id Y] <Scalar(int64, shape=())>
          │              └─ def_star [id S] <Vector(float64, shape=(?,))>
          └─ [ 0 1 6 ... 18 9 10] [id Z] <Vector(uint8, shape=(380,))>
1: Exp [id BA] <Matrix(float64, shape=(1, ?))>
 └─ Add [id BB] <Matrix(float64, shape=(1, ?))>
    ├─ ExpandDims{axis=0} [id BC] <Matrix(float64, shape=(1, ?))>
    │  └─ AdvancedSubtensor1 [id BD] <Vector(float64, shape=(380,))>
    │     ├─ Sub [id G] <Vector(float64, shape=(?,))> 'att'
    │     │  └─ ···
    │     └─ [ 0 1 6 ... 18 9 10] [id Z] <Vector(uint8, shape=(380,))>
    └─ ExpandDims{axis=0} [id BE] <Matrix(float64, shape=(1, ?))>
       └─ AdvancedSubtensor1 [id BF] <Vector(float64, shape=(380,))>
          ├─ Sub [id R] <Vector(float64, shape=(?,))> 'def'
          │  └─ ···
          └─ [ 7 2 8 ... 5 13 11] [id O] <Vector(uint8, shape=(380,))>
2: Exp [id BG] <Matrix(float64, shape=(1, ?))>
 └─ ExpandDims{axis=0} [id BH] <Matrix(float64, shape=(1, ?))>
    └─ beta_con [id BI] <Vector(float64, shape=(?,))>
The parameters evaluate to:
0: [[1. 1. 1. ... 1. 1. 1.]]  (a (1, 380) array of ones)
1: [[1. 1. 1. ... 1. 1. 1.]]  (a (1, 380) array of ones)
2: [[1. 1. 1. ... 1. 1. 1.]]  (a (1, 380) array of ones)
The variable goals_per_match logp method raised the following exception: Input dimension mismatch: (input[%i].shape[%i] = %lld, input[%i].shape[%i] = %lld)
Input dimension mismatch: (input[%i].shape[%i] = %lld, input[%i].shape[%i] = %lld)
Apply node that caused the error: Composite{((i2 / (i0 * i1)) ** i3)}(ExpandDims{axis=1}.0, ExpandDims{axis=1}.0, ExpandDims{axis=1}.0, [[[0 1 2 3 4]]])
Toposort index: 22
Inputs types: [TensorType(float64, shape=(1, 1, None)), TensorType(float64, shape=(1, 1, None)), TensorType(float64, shape=(1, 1, None)), TensorType(int64, shape=(1, 1, 5))]
Inputs shapes: [(1, 1, 380), (1, 1, 380), (1, 1, 380), (1, 1, 5)]
Inputs strides: [(3040, 3040, 8), (3040, 3040, 8), (3040, 3040, 8), (40, 40, 8)]
Inputs values: ['not shown', 'not shown', 'not shown', array([[[0, 1, 2, 3, 4]]])]
Outputs clients: [[Composite{((i0 * i1 * i2) / i3)}([[[1.0000e ... 000e+01]]], [[[ 2.] ... [ 24.]]], Composite{((i2 / (i0 * i1)) ** i3)}.0, [[[ 2. i ... 6. 24.]]])]]
HINT: Re-running with most PyTensor optimizations disabled could provide a back-trace showing when this node was created. This can be done by setting the PyTensor flag 'optimizer=fast_compile'. If that does not work, PyTensor optimizations can be disabled with 'optimizer=None'.
HINT: Use the PyTensor flag `exception_verbosity=high` for a debug print-out and storage map footprint of this Apply node.
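Side note: if it matters, I understand the hinted flags can also be set from Python before compiling. A sketch of what I believe the hints are suggesting:

import pytensor

# Disable graph optimizations to get a clearer back-trace, per the hints above
pytensor.config.optimizer = "fast_compile"   # or "None" to disable them fully
pytensor.config.exception_verbosity = "high"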
I think I know what's going on: when I implemented the vectorized form I was thinking of NumPy arrays, so when I compute lam ** indexes with lam's shape (1, n) and indexes' shape (m, 1), I end up with lam ** indexes of shape (m, n), where the (i, j) element is lam[j] ** indexes[i]. Is that possible with TensorVariables?
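For reference, this is the NumPy-style outer broadcast I'm trying to reproduce; a minimal standalone sketch (lam and idx here are just placeholder names):

import numpy as np
import pytensor.tensor as pt

lam = pt.matrix("lam")   # shape (1, n)
idx = pt.matrix("idx")   # shape (m, 1)
out = lam ** idx         # should broadcast to (m, n), as in NumPy

print(out.eval({
    lam: np.array([[2.0, 3.0]]),           # (1, 2)
    idx: np.array([[0.0], [1.0], [2.0]]),  # (3, 1)
}).shape)  # -> (3, 2), element (i, j) is lam[0, j] ** idx[i, 0]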