What's the best way to implement an autoregressive moving average of a time series with a variable lag and window size?
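To make that concrete, here's roughly the fixed-parameter version I'd write in plain NumPy (lagged_moving_average, lag, and width are just illustrative names); the part I'm unsure about is turning lag and width into free parameters of the model:

import numpy as np

def lagged_moving_average(x, lag, width):
    # Mean of x over the window [t - lag - width, t - lag) for every t,
    # i.e. a window of `width` points ending `lag` steps before t.
    # Here lag and width are fixed positive integers; in the model they
    # would be unknown parameters, which is the part I'm asking about.
    x = np.asarray(x, dtype=float)
    out = np.full_like(x, np.nan)
    for t in range(len(x)):
        start, stop = t - lag - width, t - lag
        if start >= 0:
            out[t] = x[start:stop].mean()
    return out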

Are all values of c_mu strictly positive at the starting point?

Yep! I opened up the debugger after loading the data:

(Pdb++) np.all(data['c'] >= 0)
True
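Worth noting that >= 0 only rules out negatives, so exact zeros would still pass; if c_mu genuinely has to be strictly positive, the stricter check would be something like:

import numpy as np

# > 0 excludes zeros as well as negatives; a NaN anywhere in the array
# would also make this False, since NaN comparisons always evaluate False
np.all(data['c'] > 0)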

What does the output of model.debug() look like?
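If it helps, something along these lines should print it (assuming a reasonably recent PyMC 5.x, where Model.debug accepts a verbose flag):

# verbose=True also prints the graph of the parameter that fails the check
model.debug(verbose=True)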

point={'c0': array(500), 'alpha_log__': array(2.30258509), 'beta_log__': array(2.30258509), 'gamma_log__': array(2.30258509), 'width_0': array(100), 'width_1': array(100), 'lag_0': array(362), 'lag_1': array(730), 'lag_2': array(1095)}

The variable c_likelihood has the following parameters:
0: Alloc [id A] <Vector(float64, shape=(?,))> 'c_mu'
 ├─ Sub [id B] <Vector(float64, shape=(1,))>
 │  ├─ Add [id C] <Vector(float64, shape=(1,))>
 │  │  ├─ ExpandDims{axis=0} [id D] <Vector(int64, shape=(1,))>
 │  │  │  └─ c0 [id E] <Scalar(int64, shape=())>
 │  │  ├─ Mul [id F] <Vector(float64, shape=(1,))>
 │  │  │  ├─ [nan] [id G] <Vector(float64, shape=(1,))>
 │  │  │  └─ Exp [id H] <Vector(float64, shape=(1,))>
 │  │  │     └─ ExpandDims{axis=0} [id I] <Vector(float64, shape=(1,))>
 │  │  │        └─ alpha_log__ [id J] <Scalar(float64, shape=())>
 │  │  └─ Mul [id K] <Vector(float64, shape=(1,))>
 │  │     ├─ [nan] [id G] <Vector(float64, shape=(1,))>
 │  │     └─ Exp [id L] <Vector(float64, shape=(1,))>
 │  │        └─ ExpandDims{axis=0} [id M] <Vector(float64, shape=(1,))>
 │  │           └─ beta_log__ [id N] <Scalar(float64, shape=())>
 │  └─ Mul [id O] <Vector(float64, shape=(1,))>
 │     ├─ [nan] [id G] <Vector(float64, shape=(1,))>
 │     └─ Exp [id P] <Vector(float64, shape=(1,))>
 │        └─ ExpandDims{axis=0} [id Q] <Vector(float64, shape=(1,))>
 │           └─ gamma_log__ [id R] <Scalar(float64, shape=())>
 └─ Shape_i{0} [id S] <Scalar(int64, shape=())>
    └─ t_data [id T] <Vector(float64, shape=(?,))>
The parameters evaluate to:
0: [nan nan nan ... nan nan nan]
This does not respect one of the following constraints: mu >= 0

Here’s the model graph: