# Is This The Proper Way to Use Observed RVs of Different Lengths?

I know how to handle linear models where the observed data is of different lengths, but I'm not sure how to handle the same use case when multiplying observed parameters of differing lengths. My only guess was to combine the observed parameters into a single tensor via `tt.join`. This runs, but I wanted to make sure it is the correct way to go about it.

    import pymc3 as pm
    import theano.tensor as tt

    with pm.Model() as model:
        a = pm.Normal('a', 0, 10)
        b = pm.Normal('b', 0, 10, shape=2)

        idx = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1]

        obs0 = pm.Normal('y0', 0, 10, observed=[1, 2, 1, 2, 3])
        obs1 = pm.Normal('y1', 0, 10, observed=[1, 2, 1, 2, 5, 6])

        # join the two observed arrays along axis 0 (tt.join, not tt.tensor.join)
        y = a + b[idx] * tt.join(0, obs0, obs1)


Thank you

I'm not sure I understand your use case of "observed parameters". The code is fine, but it will be the same as if you were doing:

    obs0 = np.asarray([1, 2, 1, 2, 3])
    obs1 = np.asarray([1, 2, 1, 2, 5, 6])
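To make the point above concrete, here is a minimal NumPy sketch (the values of `a` and `b` are arbitrary stand-ins for posterior draws): because `obs0` and `obs1` are fully observed, joining them is just concatenating two constant arrays, so `y` is a deterministic function of `a` and `b` alone.

```python
import numpy as np

# Observed data are constants; joining them is plain concatenation
obs0 = np.asarray([1, 2, 1, 2, 3])
obs1 = np.asarray([1, 2, 1, 2, 5, 6])
obs = np.concatenate([obs0, obs1])  # same result as tt.join(0, obs0, obs1)

idx = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1])

# For any fixed draws of a and b, y is fully determined by the constants above
a = 0.5
b = np.array([1.0, 2.0])
y = a + b[idx] * obs
print(y.shape)  # prints (11,)
```

Nothing here feeds information back into the `Normal` distributions, which is why they only contribute constant terms to the model logp.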


Thank you for the reply. So in the above case, the Normal distribution isn't being used at all? I want to set up a linear model:

$z = \alpha + Y \cdot \beta^T$, where $Y$ is, for example, two Normal distributions (containing different-length observations).

I.e. $[\mathcal{N}_0, \mathcal{N}_1] \cdot [\beta_0, \beta_1]^T$.

The model logp will include two more constant terms from the Normals, but because they are constants, MCMC will essentially ignore them.

To do what you intended, you need to find a way to set up two latent variables and let the information in the observed Y propagate back to those latent variables, something like:

    import pymc3 as pm

    with pm.Model() as model:
        a = pm.Normal('a', 0, 10)
        b = pm.Normal('b', 0, 10, shape=2)

        idx = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1]

        # latent group means; the observed y0 and y1 inform them
        ymu = pm.Normal('mu', 0, 10, shape=2)
        obs0 = pm.Normal('y0', ymu[0], 1., observed=[1, 2, 1, 2, 3])
        obs1 = pm.Normal('y1', ymu[1], 1., observed=[1, 2, 1, 2, 5, 6])

        y = a + b[idx] * ymu[idx]
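The fancy indexing in the last line can be checked outside PyMC3. A small NumPy sketch (the numeric values are illustrative stand-ins for posterior draws, not output of the model above) shows that `b[idx]` and `ymu[idx]` pair each group's coefficient with that group's latent mean, one entry per observation:

```python
import numpy as np

idx = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1])

# Stand-ins for single posterior draws of the latent variables
a = 0.1
b = np.array([1.0, -1.0])
ymu = np.array([1.8, 2.8])  # e.g. roughly the sample means of y0 and y1

# b[idx] and ymu[idx] both have length 11, so each of the 11 observations
# picks up the coefficient and latent mean of its own group
y = a + b[idx] * ymu[idx]
print(y.shape)  # prints (11,)
```

The first five entries use `b[0]` and `ymu[0]`, the last six use `b[1]` and `ymu[1]`, which is exactly the grouping encoded by `idx`.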

Ah that makes sense. Thank you very much @junpenglao — much appreciated!