Is there a way to include a source term in an AR1 model?
x(n+1) = rho_1 x(n) + y(n), where y(n) is the driving term/array that is known a-priori.
Thanks.
It depends. If the AR is latent, you can add y(n) to the RV: AR += y. If the AR is observed, you can do observed=data - y.
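For concreteness, a minimal sketch of what those two options could look like, assuming the PyMC3-era API (pm.AR1 with k and tau_e, plus pm.Deterministic); the series length, priors, and toy data are placeholders:

```python
import numpy as np
import pymc3 as pm

# Toy placeholders: `y` is the known driving array, `data` the observed series.
T = 100
y = np.random.randn(T)
data = np.random.randn(T)

with pm.Model():
    rho1 = pm.Normal("rho1", mu=0.0, sigma=1.0)

    # Latent case: build the AR(1) RV, then add the known term to it.
    ar = pm.AR1("ar", k=rho1, tau_e=1.0, shape=T)
    x = pm.Deterministic("x", ar + y)

    # Observed case: subtract the known term from the data instead.
    x_obs = pm.AR1("x_obs", k=rho1, tau_e=1.0, observed=data - y)
```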
Awesome. Thanks!!
Actually, I think that adding y to x is not the same as adding a drive to the AR1 model. Consider the recurrence:
x(n+1) = \rho_1 x(n) + y(n) = \rho_1 (\rho_1 x(n-1) + y(n-1)) + y(n)
x(n+1) = \rho_1^{2} x(n-1) + \rho_1 y(n-1) + y(n)
If you had the process \tilde{x} with no driving term:
\tilde{x}(n+1) = \rho_1 \tilde{x}(n) = \rho_1^{2}\tilde{x}(n-1)
Then \tilde{x}(n+1) + y(n) = \rho_1 \tilde{x}(n) + y(n) = \rho_1^{2}\tilde{x}(n-1) + y(n) \neq x(n+1), because you don't get the accumulated \rho_1 y(n-1), \rho_1^{2} y(n-2), \dots terms.
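A quick plain-NumPy check (toy numbers, innovation noise omitted for clarity) illustrates this: the driven recurrence and the undriven process plus y(n) agree at the first step and then diverge.

```python
import numpy as np

rho1 = 0.8
y = np.array([1.0, -0.5, 2.0, 0.3, -1.2])   # known driving term y(n)
T = len(y)

# Driven recurrence: x(n+1) = rho1 * x(n) + y(n)
x = np.zeros(T + 1)
x[0] = 1.0
for n in range(T):
    x[n + 1] = rho1 * x[n] + y[n]

# Undriven process: x_tilde(n+1) = rho1 * x_tilde(n), then add y(n) afterwards
x_tilde = np.zeros(T + 1)
x_tilde[0] = 1.0
for n in range(T):
    x_tilde[n + 1] = rho1 * x_tilde[n]
shortcut = x_tilde[1:] + y

print(x[1:])      # accumulates the rho1*y(n-1), rho1^2*y(n-2), ... terms
print(shortcut)   # does not, so the two series differ after the first step
```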
I suggest you use the AR distribution instead of the AR1, because it allows you to add in the driving term \rho_0.
You are right, good point.
Good point. Can rho0 and rho1 have different shapes? rho0 will be a vector and rho1 a scalar for the driving term case. Does rho1 also need to be specified as a constant vector?
Yes, you can supply a list of tensors to AR as:
pm.AR('name', rho=[rho0, rho1], constant=True, shape=...)
where rho0 and rho1 can be shared variables, constants, NumPy arrays, scalars, or random variables. You have to take care that the shape of the AR matches, or at least broadcasts with, rho0.
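For completeness, a rough sketch of how that call might sit inside a model, assuming the PyMC3-era pm.AR signature (keyword names such as sigma vs. sd differ between versions); the priors and the shape below are placeholders:

```python
import pymc3 as pm

T = 100  # placeholder length of the AR series

with pm.Model():
    rho0 = pm.Normal("rho0", mu=0.0, sigma=1.0)   # constant / driving term
    rho1 = pm.Normal("rho1", mu=0.0, sigma=1.0)   # lag-1 coefficient
    sigma = pm.HalfNormal("sigma", sigma=1.0)     # innovation scale

    # With constant=True, the first entry of rho is treated as the constant term.
    # Per the answer above, rho0 can instead be a NumPy array (a time-varying
    # drive), provided its shape broadcasts with the shape of the AR.
    x = pm.AR("x", rho=[rho0, rho1], sigma=sigma, constant=True, shape=T)
```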