PyMC 5 out of sample

Hi Jesse! Thank you very much, that was really useful. It works!! :slight_smile:
Your answer raises two different topics:

  • is there any way, in the prediction part of the code, to get the predicted values of my features (not only y_hat)? I am looking at the idata object but I only find y_hat (see the sketch just after this list for what I mean)

  • as you mentioned, this model is quite simple, but if I wanted to incorporate different functions, what would you suggest?
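For the first point, this is a minimal sketch of what I am after (I am assuming the per-feature terms are pm.Deterministic variables, as in the model below, and that they can be requested through var_names; 'contribution_tv' is a made-up feature name):

with basic_model:
    idata = pm.sample()

# the pm.Deterministic terms are stored in the posterior group
print(idata.posterior['contribution_tv'])  # made-up feature name

# and, if I understand correctly, they can be re-computed for predictions too
with basic_model:
    ppc = pm.sample_posterior_predictive(
        idata, var_names=['likelihood', 'contribution_tv']
    )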

Please find below a more complex model to which I would like to apply predictions:

import pytensor.tensor as tt  # in PyMC 5 the tensor backend is PyTensor


def geometric_adstock_tt(x_t, alpha=0, L=21, normalize=True):
    """Geometric adstock: carry over a decaying fraction of past values."""
    # decay weights alpha**0, alpha**1, ..., alpha**(L-1)
    w = tt.as_tensor_variable([tt.power(alpha, i) for i in range(L)])

    # stack L lagged copies of the series, padding each with leading zeros
    xx = tt.stack(
        [tt.concatenate([tt.zeros(i), x_t[: x_t.shape[0] - i]]) for i in range(L)]
    )

    if not normalize:
        y = tt.dot(w, xx)
    else:
        # normalize so the decay weights sum to one
        y = tt.dot(w / tt.sum(w), xx)

    return y
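(As a quick sanity check, I evaluated the transform on a toy series:)

import numpy as np

# a single spend spike gets smeared forward with geometric decay
x = np.array([100.0, 0.0, 0.0, 0.0, 0.0])
print(geometric_adstock_tt(x, alpha=0.5, L=3).eval())
# roughly [57.1, 28.6, 14.3, 0., 0.] with the normalized weights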


import pymc as pm

with pm.Model() as basic_model:
    response_mean = []

    # one adstocked term per media feature, each with its own decay and coefficient
    for feature in features:
        xx = df_new[feature].values
        beta = pm.HalfNormal(f'beta_{feature}', sigma=2)
        decay = pm.Beta(f'decay_{feature}', alpha=3, beta=3)
        contribution = pm.Deterministic(
            f'contribution_{feature}', geometric_adstock_tt(xx, decay) * beta
        )
        response_mean.append(contribution)

    # plain linear terms for the control variables, if any
    if control_vars:
        for control in control_vars:
            x = df_new[control].values
            print(f'Adding control: {control}')
            control_beta = pm.Normal(f'coeff_{control}', sigma=2)
            control_contribution = pm.Deterministic(
                f'contribution_{control}', control_beta * x
            )
            response_mean.append(control_contribution)

    sigma = pm.HalfNormal('sigma', sigma=1)

    likelihood = pm.Normal(
        'likelihood', mu=sum(response_mean), sigma=sigma, observed=target
    )
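For the prediction part, here is a rough sketch of how I imagine swapping in new data, assuming each feature series is wrapped in a pm.MutableData container (df_future is just a placeholder name for the new period):

# sketch only: wrap each input in pm.MutableData so it can be replaced later
with pm.Model() as basic_model:
    x_features = {
        feature: pm.MutableData(f'x_{feature}', df_new[feature].values)
        for feature in features
    }
    # ... then build response_mean from x_features[feature] exactly as above ...

# after sampling, point the model at the new period and re-draw
with basic_model:
    pm.set_data({f'x_{feature}': df_future[feature].values for feature in features})
    ppc = pm.sample_posterior_predictive(
        idata,
        var_names=['likelihood'] + [f'contribution_{feature}' for feature in features],
    )
# (I guess the observed target also needs to follow the new shape if the
# future period has a different length; not sure yet about the cleanest way)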

Do you still suggest not looping over the features, even though I now have a tensor and different calculations for the different feature types?

Any suggestion is really appreciated. :slight_smile:
Thank you!