I am currently trying to reproduce the hierarchical model from Chapados, Nicolas, "Effective Bayesian modeling of groups of related count time series," arXiv preprint arXiv:1405.3738 (2014).
In the paper they present an approximate inference method, but they also state that they were able to obtain the posteriors using Stan. I wanted to achieve the same with PyMC3. Right now I have the following model:
```python
import pymc3 as pm
import theano.tensor as tt

n_series = Y_hat.shape[1]  # one column per series

with pm.Model() as ar_model_hier:
    # Global hyperpriors
    alpha = pm.Uniform("alpha", lower=0.001, upper=0.1)
    kappa_tau = pm.Uniform("kappa_tau", lower=5, upper=10)
    beta_tau = pm.Uniform("beta_tau", lower=2, upper=25)
    mu_mu = pm.Normal("mu_mu", mu=0, sd=2 ** 2)
    tau_mu = pm.Uniform("tau_mu", lower=1, upper=10)
    phi_plus = pm.Uniform("phi_plus", lower=1, upper=600)
    phi_minus = pm.Uniform("phi_minus", lower=1, upper=50)

    # AR part of the model (one parameter per series)
    mu_l = pm.Normal("mu_l", mu=mu_mu, tau=tau_mu, shape=n_series)
    phi_l = pm.Beta("phi_l", phi_plus + phi_minus, phi_minus, shape=n_series)
    c_l = mu_l * (1 - phi_l)
    tau_l = pm.Gamma("tau_l", kappa_tau, beta_tau, shape=n_series)

    # Negative-binomial part of the model
    alpha_l = pm.Exponential("alpha_l", alpha, shape=n_series)

    for i in range(n_series):
        y_hat = Y_hat[:, i]
        eta_l = pm.AR("eta_%d" % i, rho=[c_l[i], phi_l[i]], tau=tau_l[i],
                      constant=True, shape=len(y_hat))
        y_t_l = pm.NegativeBinomial("y_t_%d" % i, mu=tt.exp(eta_l),
                                    alpha=alpha_l[i], observed=y_hat)
```
`Y_hat` is a matrix with the observed counts in rows and the different series in columns. The model above works fine as long as I have only ~10 columns, but the datasets from the paper have more than 1000 columns/series. I have therefore tried to get rid of the for loop over the autoregressive models, but have not been successful so far.
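For reference, the vectorization I am aiming for would compute the AR(1) conditional means for all series with one broadcasted shifted-array expression instead of the loop. The sketch below just checks that idea in plain NumPy; the array `eta` is an illustrative stand-in for the latent states, and the names and sizes are mine, not from the paper:

```python
import numpy as np

# Toy stand-in for the latent AR states: T time steps, N series
# (illustrative sizes, not from the paper).
rng = np.random.default_rng(0)
T, N = 50, 4
c = rng.normal(size=N)           # per-series intercept c_l
phi = rng.uniform(0, 1, size=N)  # per-series AR coefficient phi_l
eta = rng.normal(size=(T, N))    # latent states, one column per series

# Loop version: conditional mean of eta[t, i] given eta[t-1, i]
mu_loop = np.empty((T - 1, N))
for i in range(N):
    mu_loop[:, i] = c[i] + phi[i] * eta[:-1, i]

# Vectorized version: one broadcasted expression for all series at once
mu_vec = c + phi * eta[:-1, :]

assert np.allclose(mu_loop, mu_vec)
```

In PyMC3 terms, my hope was to make `eta` a single `(T, N)` latent `pm.Normal` and attach the AR transition density with `pm.Potential`, something like `pm.Potential("ar_logp", pm.Normal.dist(mu=c_l + phi_l * eta[:-1], tau=tau_l).logp(eta[1:]).sum())`, instead of the per-series `pm.AR` calls. I am not sure this exactly matches `pm.AR` with `constant=True`, though.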
Are there any examples of a vectorized autoregressive model?