Just testing with two arrays: y (integers) and id (0, 1, 2, 3, 4, …, 999). Nothing special. I still cannot get the same values for the parameters alpha, beta, and theta. They differ a lot.
The likelihoods are equivalent if you shift the observed values by -1, as shown in this snippet:
import pymc as pm

with pm.Model() as m:
    p = pm.Beta("p", 10, 100, size=3)
    pm.Geometric("x1", p=p, observed=[1, 2, 5])
    pm.NegativeBinomial("x2", n=1, p=p, observed=[0, 1, 4])

m.point_logps()
# {'p': 0.53, 'x1': -7.67, 'x2': -7.67}
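The same -1 shift can be checked outside PyMC with SciPy, since `scipy.stats.geom` has support starting at 1 while `scipy.stats.nbinom` with n=1 has support starting at 0 (a minimal sketch; the value p = 0.2 is an arbitrary choice for illustration):

```python
import numpy as np
from scipy import stats

k = np.array([1, 2, 5])  # Geometric-style observations (support starts at 1)
p = 0.2                  # arbitrary success probability, just for illustration

geom_logp = stats.geom.logpmf(k, p)
nb_logp = stats.nbinom.logpmf(k - 1, 1, p)  # same data shifted by -1, n=1

print(np.allclose(geom_logp, nb_logp))  # True: identical log-likelihoods
```

This holds term by term: for n=1 the negative binomial pmf reduces to p(1-p)^k on k = 0, 1, …, which is exactly the geometric pmf p(1-p)^(k-1) evaluated at k+1.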
If you are not getting the same results even after shifting, it means the model is not converging. Given the flat priors and one beta per observation, I wouldn't be too surprised if that were the case.