How do I correctly structure a multi-dimensional variable in a regression problem?

The error is:

ValueError                                Traceback (most recent call last)
/opt/conda/lib/python3.7/site-packages/aesara/compile/function/types.py in __call__(self, *args, **kwargs)
    975                 self.vm()
--> 976                 if output_subset is None
    977                 else self.vm(output_subset=output_subset)

ValueError: Input dimension mismatch. One other input has shape[0] = 6, but input[4].shape[0] = 26.

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/tmp/ipykernel_26145/3292791440.py in <module>
    198 
    199 
--> 200 run_korea(df)

/tmp/ipykernel_26145/3292791440.py in run_korea(df)
    149         loc_ic = utility_functions.make_next_level_hierarchy_variable(name='loc_ic', mu=loc_subcat, alpha=2, beta=1, dims=['ic', 'location'])
    150         loc_item = utility_functions.make_next_level_hierarchy_variable(name='loc_item', mu=loc_ic, alpha=2, beta=1, dims=['item', 'location'])
--> 151         loc_item.eval()
    152 
    153 

/opt/conda/lib/python3.7/site-packages/aesara/graph/basic.py in eval(self, inputs_to_values)
    600         args = [inputs_to_values[param] for param in inputs]
    601 
--> 602         rval = self._fn_cache[inputs](*args)
    603 
    604         return rval

/opt/conda/lib/python3.7/site-packages/aesara/compile/function/types.py in __call__(self, *args, **kwargs)
    990                     node=self.vm.nodes[self.vm.position_of_error],
    991                     thunk=thunk,
--> 992                     storage_map=getattr(self.vm, "storage_map", None),
    993                 )
    994             else:

/opt/conda/lib/python3.7/site-packages/aesara/link/utils.py in raise_with_op(fgraph, node, thunk, exc_info, storage_map)
    532         # Some exception need extra parameter in inputs. So forget the
    533         # extra long error message in that case.
--> 534     raise exc_value.with_traceback(exc_trace)
    535 
    536 

/opt/conda/lib/python3.7/site-packages/aesara/compile/function/types.py in __call__(self, *args, **kwargs)
    974             outputs = (
    975                 self.vm()
--> 976                 if output_subset is None
    977                 else self.vm(output_subset=output_subset)
    978             )

ValueError: Input dimension mismatch. One other input has shape[0] = 6, but input[4].shape[0] = 26.
Apply node that caused the error: Elemwise{Composite{(i0 + (i1 * i2) + (i3 * i4) + (i5 * i6) + (i7 * i8) + (i9 * i10))}}[(0, 2)](InplaceDimShuffle{x,0}.0, InplaceDimShuffle{x,x}.0, loc_bl_offset, InplaceDimShuffle{x,x}.0, loc_cat_offset, InplaceDimShuffle{x,x}.0, loc_subcat_offset, InplaceDimShuffle{x,x}.0, loc_ic_offset, InplaceDimShuffle{x,x}.0, loc_item_offset)
Toposort index: 17
Inputs types: [TensorType(float64, (1, None)), TensorType(float64, (1, 1)), TensorType(float64, (None, None)), TensorType(float64, (1, 1)), TensorType(float64, (None, None)), TensorType(float64, (1, 1)), TensorType(float64, (None, None)), TensorType(float64, (1, 1)), TensorType(float64, (None, None)), TensorType(float64, (1, 1)), TensorType(float64, (None, None))]
Inputs shapes: [(1, 2), (1, 1), (6, 2), (1, 1), (26, 2), (1, 1), (102, 2), (1, 1), (191, 2), (1, 1), (545, 2)]
Inputs strides: [(16, 8), (8, 8), (16, 8), (8, 8), (16, 8), (8, 8), (16, 8), (8, 8), (16, 8), (8, 8), (16, 8)]
Inputs values: [array([[ 2995.44591201, -1565.36841134]]), array([[0.42405651]]), 'not shown', array([[2.14519186]]), 'not shown', array([[1.2296369]]), 'not shown', array([[2.71865771]]), 'not shown', array([[0.48200603]]), 'not shown']
Outputs clients: [['output']]

HINT: Re-running with most Aesara optimizations disabled could provide a back-trace showing when this node was created. This can be done by setting the Aesara flag 'optimizer=fast_compile'. If that does not work, Aesara optimizations can be disabled with 'optimizer=None'.
HINT: Use the Aesara flag `exception_verbosity=high` for a debug print-out and storage map footprint of this Apply node.
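
If I'm reading the Apply node correctly, the failure is just an elementwise sum of the offset arrays, whose leading dimensions are the sizes of the different hierarchy levels (6, 26, 102, 191, 545), so broadcasting can't line them up. A minimal NumPy sketch of the same mismatch (the arrays here are stand-ins, purely for illustration):

import numpy as np

# stand-ins with the same shapes as loc_bl_offset and loc_cat_offset above
a = np.zeros((6, 2))
b = np.zeros((26, 2))

# the fused Elemwise node is effectively doing this kind of elementwise sum
a + b  # ValueError: operands could not be broadcast together with shapes (6,2) (26,2)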

Evaluating the model variables (eval()) returns the following shapes:

{'mu_intercept': (),
 'bl_intercept_sigma_log__': (),
 'bl_intercept_sigma': (),
 'bl_intercept_offset': (6,),
 'cat_intercept_sigma_log__': (),
 'cat_intercept_sigma': (),
 'cat_intercept_offset': (26,),
 'subcat_intercept_sigma_log__': (),
 'subcat_intercept_sigma': (),
 'subcat_intercept_offset': (102,),
 'ic_intercept_sigma_log__': (),
 'ic_intercept_sigma': (),
 'ic_intercept_offset': (191,),
 'item_intercept_sigma_log__': (),
 'item_intercept_sigma': (),
 'item_intercept_offset': (545,),
 'mu_can': (),
 'bl_cann_sigma_log__': (),
 'bl_cann_sigma': (),
 'bl_cann_offset': (6,),
 'cat_cann_sigma_log__': (),
 'cat_cann_sigma': (),
 'cat_cann_offset': (26,),
 'subcat_cann_sigma_log__': (),
 'subcat_cann_sigma': (),
 'subcat_cann_offset': (102,),
 'ic_cann_sigma_log__': (),
 'ic_cann_sigma': (),
 'ic_cann_offset': (191,),
 'item_cann_sigma_log__': (),
 'item_cann_sigma': (),
 'item_cann_offset': (545,),
 'loc_intercept': (2,),
 'loc_bl_sigma_log__': (),
 'loc_bl_sigma': (),
 'loc_bl_offset': (6, 2),
 'loc_cat_sigma_log__': (),
 'loc_cat_sigma': (),
 'loc_cat_offset': (26, 2),
 'loc_subcat_sigma_log__': (),
 'loc_subcat_sigma': (),
 'loc_subcat_offset': (102, 2),
 'loc_ic_sigma_log__': (),
 'loc_ic_sigma': (),
 'loc_ic_offset': (191, 2),
 'loc_item_sigma_log__': (),
 'loc_item_sigma': (),
 'loc_item_offset': (545, 2)}
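
For reference, make_next_level_hierarchy_variable is my own helper and is just the usual non-centered pattern; stripped down it does roughly this (treat the exact priors and names as a paraphrase of what I have, not a verbatim copy):

import pymc as pm  # PyMC v4 / Aesara backend in my environment

def make_next_level_hierarchy_variable(name, mu, alpha, beta, dims):
    # positive scale shared by every member of this level
    sigma = pm.Gamma(f"{name}_sigma", alpha=alpha, beta=beta)

    # non-centered offsets, one per (level member, location)
    offset = pm.Normal(f"{name}_offset", mu=0.0, sigma=1.0, dims=dims)

    # child level = parent mean + scaled offset; note that mu still has the
    # parent's length here, which is where the (6, 2) vs (26, 2) mismatch
    # in the traceback seems to come from
    return pm.Deterministic(name, mu + sigma * offset, dims=dims)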

This approach worked when I added the seasonality variables you helped me with in the past, but there I was multiplying those RVs by the Fourier transformation at the end of the RV sampling.

The model did compile and sample before I tried to add my marketing hierarchy on top of loc_intercept, but I'm still not getting what I want, and I know our locations see different sales for different items.
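
To make the goal concrete: in the likelihood I want to pick one effect per observation out of the (item, location) matrix, roughly like this (item_idx, loc_idx, obs_sigma, and the df column names are hypothetical stand-ins for my real data):

with model:
    # hypothetical integer codes, one entry per row of the training data
    item_idx = df["item_code"].to_numpy()
    loc_idx = df["location_code"].to_numpy()

    # per-observation expected value taken from the (item, location) effect matrix
    mu = loc_item[item_idx, loc_idx]

    obs_sigma = pm.HalfNormal("obs_sigma", sigma=1.0)
    pm.Normal("sales", mu=mu, sigma=obs_sigma, observed=df["sales"].to_numpy())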