Hello! I am very new to PyMC and Bayesian modeling in general. I am currently following some of the example code from chapter 8 of Statistical Rethinking and trying to replicate it in PyMC, but I am running into dimensionality errors when I try to extend my model to a new data set. I saw a post that already addresses this topic (Setting new data for predictions, conflicting size with dims - Questions / version agnostic - PyMC Discourse), but I am still confused about how to fix the issue. Any clarification would be appreciated. Here is my model setup and the code used to run the posterior predictive:
```python
continent_labels, continent = pd.factorize(df_standard.cont_africa)

coord = {
    "features": ["rugged_std"],
    "obs_id": np.arange(df_standard.shape[0]),
    "continent": continent.values,
}

with pm.Model(coords=coord) as m8_3:
    rugged_std = pm.Data("rugged_std", df_standard.rugged_std.values, dims="obs_id")
    continent_indx = pm.Data("continent_indx", continent_labels, dims="obs_id")

    # priors
    alpha = pm.Normal("alpha", mu=1, sigma=0.1, dims="continent")
    beta = pm.Normal("beta", mu=0, sigma=0.3, dims="features")
    sigma = pm.Exponential("sigma", 1)

    # deterministic
    mu = pm.Deterministic(
        "mu",
        alpha[continent_indx] + (rugged_std - rugged_std.mean()) * beta[0],
        dims="obs_id",
    )

    # likelihood
    y = pm.Normal("y", mu=mu, sigma=sigma, observed=df_standard.log_gdp_std, dims="obs_id")

with m8_3:
    idata3 = pm.sample_prior_predictive(draws=100)
    idata3 = pm.sample(idata_kwargs={"log_likelihood": True})
    idata3.extend(pm.sample_posterior_predictive(idata3))
```
I am getting the error in this code chunk:
```python
rugged_seq = np.linspace(-0.1, 1.1, 30)
continent_pred = np.repeat(0, len(rugged_seq))

with m8_3:
    pm.set_data(
        {
            "rugged_std": rugged_seq,
            "continent_indx": continent_pred,
        },
        coords={"obs_id": np.arange(rugged_seq.shape[0])},
    )
    mu_pred = pm.sample_posterior_predictive(idata3, var_names=["mu"])
```
With this error message:
```
ValueError: conflicting sizes for dimension 'obs_id': length 170 on the data but length 30 on coordinate 'obs_id'
```
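For what it's worth, the shape arithmetic in `mu` itself seems fine when I mimic it with plain NumPy (the numbers below are made up; only the shapes mirror my model), so I suspect the conflict comes from the observed `y`, which still has the original 170 rows while the `obs_id` coordinate is resized to 30:

```python
import numpy as np

# Toy values standing in for posterior draws (made-up numbers; only the
# shapes mirror the model: 2 continents, 1 feature, 30 prediction points).
alpha = np.array([0.9, 1.05])            # dims: continent
beta = np.array([-0.15])                 # dims: features
rugged_seq = np.linspace(-0.1, 1.1, 30)  # new predictor values, length 30
continent_pred = np.repeat(0, len(rugged_seq))

# Same expression as the Deterministic for mu in the model
mu = alpha[continent_pred] + (rugged_seq - rugged_seq.mean()) * beta[0]
print(mu.shape)  # (30,) -- mu tracks the new data length, not the old 170
```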
Thanks for any help!