Beginner question

Hi
I simply want to evaluate the likelihood of an observed binomial variable of some length, where the binomial probability p_i for each element of the multi-binomial is given by some function.
(I want to infer lambda1, given a certain “counts_total” vector.)

import numpy as np
import pymc3 as pm

counts_total=np.array([110,  75,  77,  63,  62])
print('shape counts_total: {}'.format(counts_total.shape))

with pm.Model() as model:
    lambda1 = pm.Normal('lambda1',10,10)
    fitted_ = np.arange(0, 5)
    # per-element binomial probability p_i as a function of lambda1
    p = pm.math.exp(-lambda1 * (fitted_ + 0.5)) * (pm.math.exp(0.5 * lambda1) - pm.math.exp(-0.5 * lambda1))
    # p = np.ones(5) * 0.1  # constant-p experiment (leaves lambda1 unused)
    b = pm.Binomial('b', n=1000, p=p, shape=len(fitted_), observed=counts_total[:5])
    print(b.logp())

This gives me the error:
Missing required input: lambda1

It looks as if lambda1 has not been initialized?

Not sure what you want to do here, but in short: you cannot call b.logp() directly, because the compiled log-probability function still needs values for the free parameters it depends on (here lambda1). If you want to check the model logp, you can try model.logp(model.test_point).
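
For example, something along these lines should work (a minimal sketch, assuming the model block from your first post compiles and that model, lambda1 and b are the names defined there):

point = model.test_point      # default test values of the free parameters
print(model.logp(point))      # log-probability of the whole model at that point
print(b.logp(point))          # log-probability of just the observed Binomial, now given a value for lambda1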

Thank you for the answer. The model logp is then the likelihood of the observed variable at the test values, right?

Nope, it's the joint log-probability of all your free parameters together with the observed values, i.e. the prior terms plus the likelihood, evaluated at the given point.
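
One quick way to see what goes into it (again a rough sketch, reusing the names from your model):

point = model.test_point
print(model.logp(point))                     # joint log-probability of the whole model
print(lambda1.logp(point) + b.logp(point))   # prior term for lambda1 plus the Binomial term; should give (essentially) the same number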

Hm… so this would be the whole numerator of the RHS in Bayes' theorem, right? Essentially, the number that is compared between steps to decide whether to keep or drop a proposal… (sorry, beginner, I only have this basic idea).
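
To spell out my (possibly wrong) picture in formulas, with y = counts_total:

$$
\log p(\lambda_1 \mid y) \;=\; \underbrace{\log p(y \mid \lambda_1) + \log p(\lambda_1)}_{\text{what model.logp evaluates?}} \;-\; \log p(y)
$$

i.e. the evidence term \log p(y) is a constant, so it cancels when the sampler compares two points?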