I figured out the answers to questions 2 and 3:
2. Iterating it in a for loop as in the code should be possible, but inside the loop I ran into a lot of issues with invalid sampling and -inf likelihoods in the binomial distribution
3. Yes, yes it is (or at least it seems so)
My main problem now is the stochasticity of K, which I made a new thread for.
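On the -inf likelihoods from point 2: the moment-matched binomial parameters used in the code below, p = 1 - sd²/mu and n = mu/p, are only valid when sd² < mu; once the mean drops below sd², p becomes nonpositive and the binomial logp is -inf. A quick numeric check (helper name hypothetical):

```python
def binom_params(mu, sd):
    """Moment-match Binomial(n, p) to Normal(mu, sd):
    mean n*p = mu, variance n*p*(1-p) = sd**2."""
    p = 1 - sd**2 / mu
    n = mu / p
    return n, p

# Valid case: sd**2 < mu, so p lies in (0, 1) and the match holds.
n, p = binom_params(mu=30.0, sd=5)
print(n * p, n * p * (1 - p))   # recovers mu = 30 and sd**2 = 25 (up to rounding)

# Invalid case: sd**2 > mu gives p < 0, which PyMC rejects with logp = -inf.
_, p_bad = binom_params(mu=20.0, sd=5)
print(p_bad)                    # negative: not a valid Binomial probability
```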
My modified code which (mostly) works:
import numpy as np
import pymc as pm
import aesara.tensor as at
m = [10, 7, 4]
l = len(m)
c_param = 0.7 # Parameter for a priori geometric distribution on max counter for each sheep
# p_c = 0.4 # Unused: overridden by the Beta prior inside the model
# Parameters for Binomial distribution approximating the normal:
p = lambda mu, sd: 1 - sd**2/mu
n = lambda mu, sd: mu / p(mu, sd)
sd = 5 # Standard deviation to be used in said distribution
# For the sampler:
n_samples = 5_000
tune = 2_500
with pm.Model() as max_counter_model:
# Amount of sheep K:
# K = pm.Geometric("K", 1/10)
K = 25 # Fixing K as a temporary solution
p_c = pm.Beta("p_c", alpha=11.20, beta=26.13) # Assuming the encounter probability to be centered around 0.3
like = pm.Binomial("like", p=p_c, n=K, observed=np.sum(m))
# Human counter:
t = pm.Geometric("t", c_param, size=(K,)) # Max counter for each sheep
c = pm.Binomial("c", p=p_c, n=at.arange(1, l+1), size=(K, l)) # Counter for each sheep at each passing
for j in range(l, 1, -1): # Excluded j=1 temporarily because it yields an error related to the binomial distribution
m_det = pm.Deterministic(f"m_det_{j}", at.sum(at.eq(c[:, j-1], t)))
like = pm.Binomial(f"like_{j}", p=p(m_det, sd), n=n(m_det, sd), observed=m[j-1])
idata = pm.sample(draws=n_samples, tune=tune)
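A possible workaround for the error that forced excluding j = 1 (a sketch, assuming the failure comes from p(m_det, sd) leaving (0, 1) when m_det is small): clip the moment-matched probability before building the Binomial. The helper below is hypothetical and shown in NumPy; in the model, at.clip would play the same role:

```python
import numpy as np

def safe_binom_params(mu, sd, eps=1e-6):
    # Clip p into (eps, 1 - eps) so Binomial(n, p) always has a finite logp.
    # eps is an assumed floor, not part of the original model.
    p = np.clip(1 - sd**2 / np.asarray(mu, dtype=float), eps, 1 - eps)
    n = mu / p
    return n, p

# With mu < sd**2 the raw p would be negative; clipping keeps it usable.
n, p = safe_binom_params(2.0, 5)
print(p)  # eps instead of -11.5
```

Note that clipping distorts the approximation whenever it actually fires (the variance is no longer sd²), so it only papers over the small-m_det regime rather than modeling it.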