Simple logistic model: Bad initial energy, log p's -Inf and Inf

Hi all!
I am a PyMC3 newbie and I think I just need an expert eye. I want to fit a basic logistic function with two or three free parameters.

Q1: Why does the model crash, and where is it non-identifiable?

  1. Generate data
import numpy as np
import matplotlib.pyplot as plt
import pymc3 as pm

def logistic(param, x):
    # Logistic curve with slope k, midpoint x0, and maximum L,
    # plus Bernoulli draws from the rescaled probabilities.
    k, x0, L = param
    r = L / (1 + np.exp(-k * (x - x0)))
    p = r / np.max(r)  # rescale so the probabilities reach 1
    y = np.random.binomial(1, p)
    return p, r, y

x = np.arange(100) 
p, r, y = logistic([0.1, 50, 1], x)

# check
plt.figure()
plt.scatter(x,y)
plt.plot(x,p)
  • L can be ignored for now (it is 1 in the generated data)
  2. Define model
def sm1(x, y):
    with pm.Model() as model:
        k = pm.TruncatedNormal('k', mu=0.5, sigma=10, lower=0, upper=20)
        x0 = pm.Normal('x0', mu=1, sigma=8)
        p = 1 / (1 + np.exp(-k * (x - x0)))  # logistic with L fixed at 1
        pm.Bernoulli('y', p=p, observed=y)
    return model
  3. Sample
m1 = sm1(x,y)
print(m1.basic_RVs)
with m1:
    tr = pm.sample(4000, tune=1000, cores=2)

Error

Multiprocess sampling (2 chains in 2 jobs)
NUTS: [x0, k]
Sampling 2 chains, 0 divergences:   0%|          | 0/10000 [00:00<?, ?draws/s]/home/ondrej/.conda/envs/python3.7/lib/python3.7/site-packages/numpy/core/fromnumeric.py:3373: RuntimeWarning: Mean of empty slice.
  out=out, **kwargs)
Sampling 2 chains, 0 divergences:   0%|          | 0/10000 [00:02<?, ?draws/s]
Bad initial energy, check any log probabilities that are inf or -inf, nan or very small:
y   -inf
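
One way to locate the offending term (a sketch using PyMC3's Model.check_test_point(), which evaluates each variable's log-probability at the initial point):

# Any -inf entry in this table is the term the sampler complains about.
print(m1.check_test_point())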

Q2: How do I scale correctly inside the model?

Here, we will also try to estimate L.

2b. New model

def sm2(x, y):
    with pm.Model() as model:
        k = pm.TruncatedNormal('k', mu=0.5, sigma=10, lower=0, upper=20)
        x0 = pm.Normal('x0', mu=1, sigma=8)
        L = pm.Normal('L', mu=0, sigma=10)
        r = L / (1 + np.exp(-k * (x - x0)))
        p = r / np.max(r)  # also tried pm.Deterministic & pm.math.maximum
        pm.Bernoulli('y', p=p, observed=y)
    return model

New error

TypeError                                 Traceback (most recent call last)
<ipython-input-160-b8795030e160> in <module>
----> 1 m1 = sm1(x,y)
      2 print(m1.basic_RVs)
      3 with m1:
      4     tr = pm.sample(4000, tune=1000, cores=2)
      5     approx = pm.fit() #performs variational approximation

<ipython-input-159-e78b382a7b11> in sm1(x, y)
      7         L = pm.Normal('L', mu=0, sigma=10)
      8         r = L / (1 + np.exp(-k*(x-x0)))
----> 9         p = r/np.max(r) #also tried with pm.Deterministic
     10         y = pm.Bernoulli('y', p=p, observed=y)
     11 

<__array_function__ internals> in amax(*args, **kwargs)

~/.conda/envs/python3.7/lib/python3.7/site-packages/numpy/core/fromnumeric.py in amax(a, axis, out, keepdims, initial, where)
   2704     """
   2705     return _wrapreduction(a, np.maximum, 'max', axis, None, out,
-> 2706                           keepdims=keepdims, initial=initial, where=where)
   2707 
   2708 

~/.conda/envs/python3.7/lib/python3.7/site-packages/numpy/core/fromnumeric.py in _wrapreduction(obj, ufunc, method, axis, dtype, out, **kwargs)
     83                 return reduction(axis=axis, dtype=dtype, out=out, **passkwargs)
     84             else:
---> 85                 return reduction(axis=axis, out=out, **passkwargs)
     86 
     87     return ufunc.reduce(obj, axis, dtype, out, **passkwargs)

TypeError: max() got an unexpected keyword argument 'out'
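
The TypeError arises because np.max is applied to a Theano tensor: NumPy forwards the reduction to r.max(out=...), and the tensor's max() method does not accept that keyword. Here is a sketch of the same scaling written with Theano's own ops; it fixes the TypeError only, and whether the scaled model is well specified is a separate question (e.g. p can fall outside [0, 1] when L is negative):

import theano.tensor as tt

def sm2_tt(x, y):
    with pm.Model() as model:
        k = pm.TruncatedNormal('k', mu=0.5, sigma=10, lower=0, upper=20)
        x0 = pm.Normal('x0', mu=1, sigma=8)
        L = pm.Normal('L', mu=0, sigma=10)
        r = L / (1 + tt.exp(-k * (x - x0)))
        p = r / tt.max(r)  # symbolic reduction instead of np.max
        pm.Bernoulli('y', p=p, observed=y)
    return model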

There is already a pm.math.invlogit you can use that should be more numerically stable. You can also pass the linear predictor directly to pm.Bernoulli('y', logit_p=...). I would also sample from the prior predictive to make sure your priors are reasonable.
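
A minimal sketch of both suggestions, reusing the names from sm1 above (sm1_logit is a hypothetical name, not from the thread):

def sm1_logit(x, y):
    with pm.Model() as model:
        k = pm.TruncatedNormal('k', mu=0.5, sigma=10, lower=0, upper=20)
        x0 = pm.Normal('x0', mu=1, sigma=8)
        # Pass the linear predictor; PyMC3 applies the sigmoid internally,
        # which avoids computing p = 0 or p = 1 exactly.
        pm.Bernoulli('y', logit_p=k * (x - x0), observed=y)
    return model

with sm1_logit(x, y):
    # Draw from the priors alone to check they generate plausible data.
    prior = pm.sample_prior_predictive(samples=500)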


Thanks @twiecki!

  1. pm.math.invlogit: I don’t follow how I can use it, since I am trying to estimate the three parameters. It computes 1 / (1 + exp(-x)), whereas I want L / (1 + exp(-k*(x-x0))). I think I am missing something obvious, right? The same question applies to pm.Bernoulli('y', logit_p=...). :neutral_face:

  2. Thanks, I have now done that, and indeed you were right: one of the priors was not specified correctly. It now works after re-parametrisation. :slight_smile:
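
(The final model is not shown in the thread. Purely as an illustration of what a re-parametrised version could look like, with prior locations that are assumptions rather than the thread's actual values:)

def sm1_reparam(x, y):
    # Illustrative only: priors centred on plausible scales for
    # x in [0, 100]; not the thread's actual re-parametrisation.
    with pm.Model() as model:
        k = pm.HalfNormal('k', sigma=1)        # slope, constrained positive
        x0 = pm.Normal('x0', mu=50, sigma=20)  # midpoint inside the data range
        pm.Bernoulli('y', logit_p=k * (x - x0), observed=y)
    return model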
