Difference between theano.tensor.stack and aesara.tensor.stack

Hello, I am a PyMC newbie and have recently been following Bayesian programming examples. I ran into a problem using aesara.tensor.

When I use theano.tensor in PyMC3,

import numpy as np
import pymc3 as pm
import theano.tensor as T

with pm.Model() as model:
    p1 = pm.Uniform('p', 0, 1)
    p2 = 1 - p1
    p = T.stack([p1, p2])
    print(type(p))
    print(np.shape(p))

The result is:

<class 'theano.tensor.var.TensorVariable'>
Shape.0

This causes no problem for Metropolis.

On the other hand, when I use aesara.tensor…

import numpy as np
import pymc as pm  # PyMC >= 4.0 uses aesara as its backend
import aesara.tensor as at

with pm.Model() as model:
    p1 = pm.Uniform('p', 0, 1)
    p2 = 1 - p1
    p = at.stack([p1, p2])  # Vectorize p1 and p2.
    print(type(p))
    print(np.shape(p))

The result is:

<class 'aesara.tensor.var.TensorVariable'>
TensorConstant{(1,) of 2}

So, with aesara.tensor.stack, the variable 'p' cannot be used in Metropolis:

with model:
    step1 = pm.Metropolis(vars=[p, sds, centers])
    step2 = pm.CategoricalGibbsMetropolis(vars=[assignment])
    trace = pm.sample(25000, step=[step1, step2])

ValueError: need at least one array to concatenate

Could you suggest a solution to this problem with aesara.tensor.stack?

Welcome!

The first thing I would suggest is to avoid custom steps whenever possible. pm.sample() will happily (and effectively) select steps on your behalf based on the model you build. So unless you have some unique case in which you want to override the pymc-inferred steps, I would just call pm.sample(25000).
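
For instance, something like this should be enough here (a minimal sketch, assuming the same model and variables as in your snippet above):

with model:
    # No manual step assignment: pm.sample() inspects the model and
    # picks suitable samplers for the continuous and discrete variables.
    trace = pm.sample(25000)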

That being said, you may be able to induce some shape information by specifying p1 = pm.Uniform('p', 0, 1, shape=1). @ricardoV94 may be able to suggest a conventional solution.
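
As a rough sketch of that idea (shape=1 is the only change from your original snippet, so it may still need tweaking):

with pm.Model() as model:
    p1 = pm.Uniform('p', 0, 1, shape=1)  # explicit shape so the variable carries shape information
    p2 = 1 - p1
    p = at.stack([p1, p2])  # note: stack adds a new leading axis, giving a (2, 1) tensor here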

Thanks a lot. Don’t I need to distinguish categorical variables from the other variables to run MCMC?

Nope. PyMC knows what parameters are continuous/discrete. pm.sample() will figure it all out for you.
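
For example, in a toy model like this (purely illustrative, not your model), pm.sample() should assign NUTS to the continuous parameter and a Metropolis-type step to the discrete one, and it reports the assignment when sampling starts:

import numpy as np
import pymc as pm

with pm.Model() as toy_model:
    mu = pm.Normal('mu', 0, 1)                 # continuous -> gradient-based sampler (NUTS)
    k = pm.Categorical('k', p=np.ones(3) / 3)  # discrete -> CategoricalGibbsMetropolis
    trace = pm.sample()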


Thank you so much!!!