Sure, this is a good place to have that discussion. The simplest way to incorporate seasonality is to include sine and cosine waves as predictors. You may have to adjust the phase to line up with your prior assumptions, if you have any. The downside is that this forces adjacent months' effects to be similar to each other. If you have more data, you might instead include a dummy variable for each month as a predictor. The same strategy works for trend: just include it as another predictor. In both cases, the data-generating process doesn't change in structure; we simply add extra columns to the design matrix.
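To make that concrete, here is a minimal NumPy sketch of building such a design matrix. The sizes and column choices are illustrative, not anything from your data:

```python
import numpy as np

# Hypothetical monthly series: 48 observations (sizes are illustrative).
months = np.arange(48)

# Smooth seasonality: sine and cosine at the annual frequency.
# A phase shift can be folded in by adding a constant inside the argument.
season_sin = np.sin(2 * np.pi * months / 12)
season_cos = np.cos(2 * np.pi * months / 12)

# Flexible seasonality: one dummy column per calendar month (one-hot rows).
month_dummies = np.eye(12)[months % 12]

# Linear trend enters as just another column.
trend = months / 12.0

# Design matrix with intercept, trend, and the smooth seasonal pair.
X = np.column_stack([np.ones_like(trend), trend, season_sin, season_cos])
```

You could swap `season_sin`/`season_cos` for `month_dummies` in `X` if you prefer the dummy-variable parameterization.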
Since you mentioned 'seasonal random walk', I suspect you want the effect of the seasonal component of your model to vary over time. That's the crux of the dynamic linear model framework. In this case, you can modify the above code to include extra GaussianRandomWalk components that are not added directly (as in the code I showed above) but are instead multiplied by your covariates. Since the GRW is a relatively simple process, I've implemented it below in terms of normally distributed increments. Here, each coefficient becomes an entire vector with t autocorrelated elements. We then take the elementwise product with the predictors and sum along the covariate axis, which is why we have axis=1 in the computation of mu. If you include a seasonal or monthly predictor among the covariates, this implements a model that accounts for seasonality while letting the influence of seasonality drift over time under the random walk prior on coefs.
import pymc3 as pm
import theano.tensor as tt

# Assumes X is a [t, p] design matrix and y holds observed counts out of n trials.
with pm.Model() as model:
    # One random-walk scale per covariate; shape [1, p] broadcasts over time
    latent_sigma = pm.HalfNormal('latent_sigma', shape=[1, p])
    # Starting value of each coefficient
    coef_means = pm.Normal('coef_means', sigma=10.0, shape=p)
    # Per-period increments of each coefficient's random walk
    coef_jumps = pm.Normal('coef_jumps', sigma=latent_sigma, shape=[t, p])
    # Cumulative sum of increments gives a [t, p] matrix of time-varying coefficients
    coefs = pm.Deterministic('coefs', coef_means + tt.cumsum(coef_jumps, axis=0))
    # Elementwise product with X, then sum across the covariate axis
    mu = pm.math.sum(X * coefs, axis=1)
    prob_estimated = pm.math.sigmoid(mu)
    response = pm.Binomial('response', n, prob_estimated, observed=y)
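To build intuition for what this model assumes, here is a NumPy-only simulation of the corresponding data-generating process. All sizes and scales below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
t, p, n = 100, 3, 50          # illustrative: periods, covariates, trials
X = rng.normal(size=(t, p))   # stand-in design matrix

coef_means = rng.normal(size=p)            # starting coefficient values
latent_sigma = 0.1                         # random-walk step scale
coef_jumps = rng.normal(scale=latent_sigma, size=(t, p))
coefs = coef_means + np.cumsum(coef_jumps, axis=0)  # [t, p] drifting coefficients

mu = (X * coefs).sum(axis=1)               # sum over the covariate axis
prob = 1.0 / (1.0 + np.exp(-mu))           # sigmoid link
y = rng.binomial(n, prob)                  # one count per period
```

Fitting the PyMC3 model to `y` generated this way should recover coefficient paths close to `coefs`, which is a useful sanity check before applying it to real data.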