Multivariate B-spline regression with out-of-sample predictions

Hi, I was trying to build a multivariate B-spline regression model following the cherry blossom notebook and this thread, which points toward using Bambi, but neither approach is intuitive to me and I haven't found any examples. Is there a way to specify a multivariate B-spline regression model (with different knots and priors for each variable) with Bambi or PyMC? Thank you in advance, and sorry if this comes across as a naive question.

Hi @MrKevinDC, thanks for posting the question in this forum.

What do you mean by multivariate B-spline? Does it mean multiple response variables, or multiple spline bases that depend on different predictors? Do you have more context about the problem?

Hi @tcapretto, pardon me for not being clear. I have a dataset with more than one explanatory variable and a single response variable. I mean having a univariate B-spline basis for each predictor when there is more than one predictor, similar to what SplineTransformer() does in scikit-learn, but with customised control over the number of knots and where they are located for each predictor.


No problem at all @MrKevinDC!

You can do

model = bmb.Model("y ~ bs(x1, df=10) + bs(x2, df=5)", data)

and that would work. I used df=10 and df=5 as examples.
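
If you also want explicit control over where the knots go, bs() should accept a knots argument for the interior knot locations, similar to what the cherry blossom notebook does. I haven't double checked this on your data, so treat it as a sketch; the quantile-based knots below are just placeholders:

import numpy as np

# Interior knots chosen separately for each predictor (placeholder values)
knots_x1 = np.quantile(data["x1"], np.linspace(0.1, 0.9, 5))
knots_x2 = np.quantile(data["x2"], np.linspace(0.1, 0.9, 3))

model = bmb.Model("y ~ bs(x1, knots=knots_x1) + bs(x2, knots=knots_x2)", data)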

For the priors, it's a little more convoluted, but this would get you started:

priors = {
    # The dictionary keys match the term names as written in the model formula
    "bs(x1, df=10)": bmb.Prior("Normal", mu=0, sigma=3),
    "bs(x2, df=5)": bmb.Prior("Normal", mu=0, sigma=10),
}
model = bmb.Model("y ~ bs(x1, df=10) + bs(x2, df=5)", data, priors=priors)

Notice bs() generates several basis columns, and in principle you could use a different prior for each one. What I show above uses the same prior for all the columns of the first spline basis and the same prior for all the columns of the second one.
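
In case it's useful, here is a minimal end-to-end sketch putting the pieces together with simulated data, including out-of-sample predictions on new data. The column names, the simulated relationship, and the prior values are just placeholders:

import numpy as np
import pandas as pd
import bambi as bmb

rng = np.random.default_rng(1234)

# Simulate two predictors with different nonlinear effects on a single response
n = 300
x1 = rng.uniform(0, 10, size=n)
x2 = rng.uniform(-3, 3, size=n)
y = np.sin(x1) + 0.5 * x2**2 + rng.normal(0, 0.5, size=n)
data = pd.DataFrame({"x1": x1, "x2": x2, "y": y})

# One prior per spline basis, as above
priors = {
    "bs(x1, df=10)": bmb.Prior("Normal", mu=0, sigma=3),
    "bs(x2, df=5)": bmb.Prior("Normal", mu=0, sigma=10),
}

model = bmb.Model("y ~ bs(x1, df=10) + bs(x2, df=5)", data, priors=priors)
idata = model.fit(draws=1000, chains=4, random_seed=1234)

# Out-of-sample predictions: pass new data to .predict(); the posterior of the
# conditional mean is added to idata
new_data = pd.DataFrame({"x1": np.linspace(0, 10, 50), "x2": np.linspace(-3, 3, 50)})
model.predict(idata, data=new_data)

Since bs() is a stateful transform, the basis computed on the training data should be reused when predicting on new_data, but it's worth double checking the results on your own dataset.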

Thank you so much @tcapretto, this was exactly what I was looking for. I will start working with this setup and let you know if I have any further questions. Thank you again!
