Hi guys,
I am trying to build a model for epidemic growth these days and I've run into a problem.
At first my model was based on a simple logistic growth model. Then I found a reference using a Bayesian Gamma GLM, so I tried that approach, but I'm confused about the model.
ref: p40-p42 https://www.universiteitleiden.nl/binaries/content/assets/science/mi/scripties/statscience/2018-2019/thesis_bonneville_s1914944.pdf
Following that approach, I built the model with simple non-informative priors, just as a trial, but it fails with a "Bad initial energy" error:
```python
with pm.Model() as model:
    b0 = pm.Normal('b0', mu=0, sigma=10)
    b1 = pm.HalfNormal('b1', sigma=10)
    b2 = -pm.HalfNormal('b2', sigma=10)
    sig = 2.5
    theta = b0 + b1*data['c'] + b2*data['c^2']
    mu_ = 1/theta
    y = pm.Gamma('y', mu=mu_, sigma=sig, observed=data['dc'].values)
```
I am a newbie to Bayesian modeling and PyMC, so any kind of help would be really appreciated. Thanks!
Hi,
This can stem from several causes, but the most common ones are:
- Too-wide priors: did you try more informative priors? Non-informative priors are usually a bad idea.
- Non-standardized predictors: I see you have a squared predictor; if it's not standardized, its values can get huge and create a geometry too complex for the sampler.
- Missing values, zeros, or out-of-domain data points somewhere.
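To illustrate the second point with a quick NumPy sketch (assuming `c` is something like a cumulative case count): standardize `c` first and then square it, so both predictors stay on the order of 1 instead of spanning several orders of magnitude.

```python
import numpy as np

c = np.arange(1, 101, dtype=float)   # hypothetical predictor, e.g. day index 1..100

# raw squared predictor spans four orders of magnitude
raw_sq = c**2
print(raw_sq.max())                  # 10000.0

# standardize first, then square: values stay O(1)
c_std = (c - c.mean()) / c.std()
sq_std = c_std**2
print(sq_std.max())                  # ~2.9
```

A sampler like NUTS has a much easier time when all coefficients act on predictors of comparable scale.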
You'll find more information here.
Hope this helps!
Looks like your theta can go negative whenever the magnitude of b2 is large relative to b0 and b1, regardless of the values of c and c^2. That in turn makes mu_ negative, and feeding a negative mean into pm.Gamma() produces the bad initial energy error.
One simple strategy is to wrap theta in tt.exp() so it is strictly positive. If you do that, your priors are then in log space, and HalfNormal(sigma=10) will be much too wide. It should run without error, but it won't sample efficiently.
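The positivity issue can be checked numerically without running the sampler. This is a small NumPy sketch (with made-up coefficient values; b2 is negative, as a negated HalfNormal draw would be) showing that the inverse link 1/theta produces negative Gamma means once theta crosses zero, while exp(theta) is positive for any real theta:

```python
import numpy as np

b0, b1, b2 = 0.5, 1.0, -3.0        # b2 < 0, mimicking -HalfNormal
c = np.linspace(0, 2, 50)
theta = b0 + b1*c + b2*c**2

# inverse link: theta goes negative for larger c, so 1/theta does too,
# which is an invalid mean for a Gamma likelihood
assert (theta <= 0).any()
mu_inverse = 1 / theta
assert (mu_inverse < 0).any()

# log link: strictly positive everywhere
mu_log = np.exp(theta)
assert (mu_log > 0).all()
```

In the model this corresponds to replacing `mu_ = 1/theta` with `mu_ = tt.exp(theta)` (and tightening the priors accordingly, since they now live on the log scale).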
Thanks!
I fixed the priors and theta, and now it works!
It was a big help!