How to model non-informative priors in PyMC3

It's a bit confusing.
Task: I want to model non-informative priors.

Should I use pm.Flat or pm.Uniform?
Which one is correct, both in the mathematical sense and in the implementation?

# Considering the priors as Uniform:

mu = pm.Uniform('mu', lower=some_value, upper=some_value)
beta = pm.Uniform('beta', lower=some_value, upper=some_value)

# Considering the priors as Flat:

mu = pm.Flat('mu')
beta = pm.Flat('beta')

# Define the likelihood function.
y_obs = pm.Gumbel('y_obs', mu=mu, beta=beta, observed=annual_maxima_1)

Best regards


As far as I can tell, the probability density of Uniform is 1/width, so it integrates to one. Flat returns 0 for the log probability density, i.e. a constant density of 1 over the whole real line, which does not integrate to one. If you're after a proper probability density, I'd opt for Uniform.
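To make that concrete, here is a small numerical sketch (the bounds 0 and 10 are assumed values, not from the thread): Uniform's log-density is -log(width) and integrates to one over its support, while Flat's constant density of exp(0) = 1 integrates to the length of whatever range you cover, growing without bound as the range widens.

```python
import numpy as np

# Assumed bounds for illustration.
lower, upper = 0.0, 10.0
width = upper - lower

# Uniform: density 1/width on [lower, upper], so logp = -log(width).
uniform_logp = -np.log(width)

# Riemann-sum both densities over [lower, upper].
x = np.linspace(lower, upper, 100_001)
dx = x[1] - x[0]
uniform_integral = np.full_like(x, 1.0 / width).sum() * dx  # ~1.0: proper
flat_integral = np.exp(np.zeros_like(x)).sum() * dx         # ~width here,
# and it keeps growing as the integration range widens: Flat is improper.
print(uniform_logp, uniform_integral, flat_integral)
```

The same calculation over any wider range leaves the Uniform integral at one but inflates the Flat integral further, which is exactly what "improper" means.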

But I might have missed an important detail…


@RavinKumar In what situations should the pymc Flat and Uniform distributions be used? The documentation doesn’t suggest use-cases.

Uniform requires you to specify lower and upper bounds, whereas Flat does not. As mentioned above, Uniform is a proper distribution, while Flat is improper: its density does not integrate to 1 (over the whole real line the integral diverges).
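An improper prior can still be usable in practice because the likelihood can make the posterior proper. Here is a hedged grid-based sketch (all numbers are assumptions for illustration): with Normal data of known sigma and a flat prior on mu, the log posterior is just the log likelihood, and normalizing it on a grid works fine.

```python
import numpy as np

# Assumed setup: Normal data with known sigma, flat prior on mu.
rng = np.random.default_rng(0)
sigma = 1.0
data = rng.normal(loc=3.0, scale=sigma, size=50)

mu_grid = np.linspace(-20.0, 20.0, 40_001)
dx = mu_grid[1] - mu_grid[0]

# Flat prior: log prior = 0, so log posterior = log likelihood + const.
# Expand sum((x_i - mu)^2) via the sufficient statistics n, sum(x), sum(x^2).
n, s, s2 = data.size, data.sum(), (data ** 2).sum()
log_lik = -0.5 * (s2 - 2.0 * mu_grid * s + n * mu_grid ** 2) / sigma ** 2

post = np.exp(log_lik - log_lik.max())
post /= post.sum() * dx                 # normalizes: the posterior is proper
post_mean = (mu_grid * post).sum() * dx
print(post_mean)                        # close to the sample mean data.mean()
```

The prior itself has no finite integral, but the posterior does, and its mean lands on the sample mean, as the conjugate-analysis result predicts for a flat prior.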


Thank you ricardo! Can you provide an example use case for the two?
I understand that a Flat distribution is a special case of the Uniform distribution where the limits of the distribution are infinite. I can think of examples for Uniform, but when would we need something with infinite bounds to solve a real-world problem? Or is it used exclusively for theoretical analysis?
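One practical answer, sketched here under assumed values rather than taken from the thread: Flat is typically a statement of convenience ("let the data speak entirely"), not a real-world belief. When the data are informative, the posterior under a Flat prior is numerically indistinguishable from the posterior under a very wide proper Uniform, because the Uniform's constant -log(width) cancels on normalization.

```python
import numpy as np

# Assumed setup: Normal data with known sigma; compare a Flat prior on mu
# against a wide Uniform(-50, 50) prior on the same grid.
rng = np.random.default_rng(1)
sigma = 2.0
data = rng.normal(loc=5.0, scale=sigma, size=200)

mu_grid = np.linspace(-50.0, 50.0, 200_001)
dx = mu_grid[1] - mu_grid[0]
n, s, s2 = data.size, data.sum(), (data ** 2).sum()
log_lik = -0.5 * (s2 - 2.0 * mu_grid * s + n * mu_grid ** 2) / sigma ** 2

# Flat prior: log prior = 0.
post_flat = np.exp(log_lik - log_lik.max())
post_flat /= post_flat.sum() * dx

# Wide Uniform prior: log prior = -log(100) inside the bounds; the constant
# shifts every log-posterior value equally and cancels on normalization.
log_post_unif = log_lik - np.log(100.0)
post_unif = np.exp(log_post_unif - log_post_unif.max())
post_unif /= post_unif.sum() * dx

print(np.max(np.abs(post_flat - post_unif)))  # essentially zero
```

So Flat is mostly used as a deliberately hands-off default, and with enough data it behaves like an arbitrarily wide Uniform without forcing you to pick bounds.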