Why does my new data, with a length of 12, give me 48 posterior predictive observations?

Thank you. When I use

yearly_seasonality = pm.Deterministic('yearly_seasonality',at.dot(yearly_X(t_, 365.25/t_.shape[0]), yearly_beta))

I get the following error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/tmp/ipykernel_6117/3787808774.py in <module>
     16 
     17     yearly_beta = pm.Normal('yearly_beta', 0, 1, shape = n_components*2)
---> 18     yearly_seasonality = pm.Deterministic('yearly_seasonality',at.dot(yearly_X(t_, 365.25/t_.shape[0]), yearly_beta))
     19 
     20     monthly_beta = pm.Normal('monthly_beta', 0, 1, shape = monthly_n_components*2)

/tmp/ipykernel_6117/3136029572.py in yearly_X(t, p, n)
      9 def yearly_X(t, p=365.25, n=n_components):
     10     x = 2 * np.pi * (np.arange(n)+1) * t[:, None] / p
---> 11     return np.concatenate((np.cos(x), np.sin(x)), axis = 1)
     12 
     13 monthly_n_components = 5

<__array_function__ internals> in concatenate(*args, **kwargs)

ValueError: zero-dimensional arrays cannot be concatenated

Is this caused by the function yearly_X, which is defined as:

def yearly_X(t, p=365.25, n=n_components):
    x = 2 * np.pi * (np.arange(n)+1) * t[:, None] / p
    return np.concatenate((np.cos(x), np.sin(x)), axis = 1)

Should that function be rewritten in aesara for it to work with shared tensor variables?

When I used the NumPy array t, it had shape (48,). But when I change t to a shared variable inside the model, the function reads it as zero-dimensional…