# Inferred value of Pi is never 3.1415

I am trying to infer the value of 𝛑 in the simple toy example shown below,
but I never get a value close to `3.1415`.

``````python
import warnings

import arviz as az
import matplotlib.pyplot as plt
import numpy as np
import pymc3 as pm
import scipy.stats as stats

az.style.use("arviz-darkgrid")
warnings.simplefilter(action="ignore", category=FutureWarning)

# Generate observations in the range -1 ... 1
N = 5000
x = stats.uniform.rvs(size=N, loc=-1, scale=2)
y = stats.uniform.rvs(size=N, loc=-1, scale=2)
d = np.sqrt(x ** 2 + y ** 2)
# Mark a value as 1 if the point (x, y) falls inside the unit circle
inside_observed = (d <= 1).astype(int)

with pm.Model() as model:
    pi = pm.Uniform('pi', lower=2, upper=4)

    # The Bernoulli distribution takes value 1 with probability pi/4
    inside = pm.Bernoulli('inside', p=pi / 4, observed=inside_observed)

    trace = pm.sample(10000, return_inferencedata=False)

az.plot_posterior(trace)
plt.show()
``````

Here is an example output of `az.plot_posterior()`:

I tried increasing the number of samples, but it does not help.
What should I do to get an answer correct to at least the 3rd digit after the decimal point (e.g. 3.141)?

What is the unrounded mean you get?

Here are a few examples:

``````python
trace['pi'].mean()
Out: 3.133901359395663
Out: 3.174639312890475
Out: 3.127718596098239
Out: 3.1566848089600903
``````

How many samples do you need for a plain frequentist estimate (i.e., the MLE) from the actual data to have a mean accurate to 3 digits? You would need even more than that to overcome your prior uncertainty when doing Bayesian inference. Any reason to assume you are working with the right magnitude of data?
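A quick back-of-the-envelope calculation makes the required magnitude concrete. This is a sketch, not part of the original post: the Monte Carlo estimator is `4 * k / N` with `k ~ Binomial(N, p)` and `p = pi / 4`, so its standard error is `4 * sqrt(p * (1 - p) / N)`:

```python
import numpy as np

# Standard error of the pi estimate as a function of sample size N,
# using the binomial variance of the hit count k.
p = np.pi / 4
for N in (5_000, 500_000, 50_000_000):
    se = 4 * np.sqrt(p * (1 - p) / N)
    print(f"N = {N:>10,}: standard error ~ {se:.5f}")

# Sample size needed for a standard error of about 0.0005
# (roughly 3 correct decimal digits):
target_se = 0.0005
N_needed = 16 * p * (1 - p) / target_se ** 2
print(f"N needed: ~{N_needed:,.0f}")  # about 10.8 million draws
```

At `N = 5000` the standard error is about 0.023, which matches the spread of the posterior means quoted above; getting the third decimal right takes on the order of ten million observations.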


Yes, you are right, there is not enough data.
I noticed that the mean of `trace['pi']` always converges close to `inside_observed.sum() / N * 4`:

``````python
inside_observed.sum()/N*4
Out: 3.132
trace['pi'].mean()
Out: 3.1315812296318453
``````

That makes sense.
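That agreement can be checked analytically. As a sketch (ignoring the Uniform(2, 4) truncation, which the data almost never pushes against): under a flat Beta(1, 1) prior on `p = pi / 4`, the conjugate posterior after observing `k` hits in `N` trials is Beta(k + 1, N - k + 1), so the posterior mean of pi is `4 * (k + 1) / (N + 2)`, essentially `4 * k / N`:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5000
x = rng.uniform(-1, 1, N)
y = rng.uniform(-1, 1, N)
k = int((x ** 2 + y ** 2 <= 1).sum())

# Conjugate Beta posterior mean for pi = 4 * p (truncation ignored)
posterior_mean_pi = 4 * (k + 1) / (N + 2)
mle_pi = 4 * k / N
print(posterior_mean_pi, mle_pi)
```

For any realistic `N` the two numbers are nearly identical, which is why the sampler's posterior mean tracks `inside_observed.sum() / N * 4` so closely.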