I am working with Aesara and I would like to know whether it is possible to do parameter inference with this library.

Let me explain:

I would like to know whether, given a sample of data, it is possible to recover the parameters of a distribution that best fit that data.

For example: if we have a data set that we want to fit to the PERT distribution (or any other distribution), can Aesara return the min, mode, max and lambda parameters that best describe the input data?
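For reference, PERT(min, mode, max, lambda) is just a Beta distribution rescaled to the interval [min, max], with shape parameters determined by the mode and lambda. A minimal sketch of that mapping (assuming the standard PERT parameterization; `pert_to_beta` is an illustrative name, not a library function):

```python
def pert_to_beta(low, peak, high, lmb=4.0):
    """Map PERT(min, mode, max, lambda) to the shape parameters
    of the underlying Beta distribution on [low, high]."""
    alpha = 1 + lmb * (peak - low) / (high - low)
    beta = 1 + lmb * (high - peak) / (high - low)
    return alpha, beta

# For the example below: PERT(10, 30, 40) with lambda=4
alpha, beta = pert_to_beta(10, 30, 40)
# alpha = 1 + 4*20/30 = 11/3, beta = 1 + 4*10/30 = 7/3
```

This is the relationship the log-density in the PyMC model below relies on.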

Typically, the code would look something like this:

```
# Our sample data (which may or may not follow a distribution)
sample = tfd.PERT(10., 30., 40.).sample(10_000)
# sample could also be an arbitrary data set

# With PyMC/Aesara, a hypothetical helper would return approximately
# the parameters min=10, mode=30, max=40 and lambda=4:
inference(sample, Pert)
```
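As a quick non-Bayesian baseline for a hypothetical `inference(sample, Pert)` helper, the same parameters can be recovered with a maximum-likelihood fit of the underlying Beta distribution via SciPy. This is only a sketch, assuming the usual PERT-to-Beta reparameterization; `infer_pert` is an illustrative name, not a library function:

```python
import numpy as np
from scipy.stats import beta as beta_dist

def infer_pert(sample):
    # Fit a 4-parameter Beta (shape a, shape b, loc, scale) by MLE,
    # then map the estimates back to PERT(min, mode, max, lambda).
    a, b, loc, scale = beta_dist.fit(sample)
    low, high = loc, loc + scale
    lmb = a + b - 2                            # since a + b = 2 + lambda
    peak = low + (a - 1) * (high - low) / lmb  # invert alpha = 1 + lmb*(peak-low)/(high-low)
    return low, peak, high, lmb

# Synthetic PERT(10, 30, 40, lambda=4) data, drawn via its Beta representation
rng = np.random.default_rng(0)
sample = 10 + 30 * rng.beta(11 / 3, 7 / 3, size=10_000)
low, peak, high, lmb = infer_pert(sample)  # roughly 10, 30, 40, 4
```

Unlike the PyMC approach below, this gives point estimates only, with no uncertainty around them.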

I'm not comfortable with this library yet (I'm just starting with it) and I don't have much idea how to do this. I know that this kind of thing was feasible with Theano and PyMC3, and I think it is probably also feasible with Aesara, with better performance.

Here is my code so far:

```
import numpy as np
import pymc as pm
import aesara.tensor as at
import tensorflow_probability as tfp

tfd = tfp.distributions
data = tfd.PERT(1., 50., 65.).sample(10_000).numpy()

def pert_logp_fn(observed, low, peak, high, lmb=4):
    # Log-density of the Pert distribution at `observed`, given the 3 parameters.
    # PyMC does not ship this distribution, so it is written here with Aesara ops
    # (so that find_MAP/NUTS can compute gradients through it).
    s_alpha = 1 + lmb * (peak - low) / (high - low)
    s_beta = 1 + lmb * (high - peak) / (high - low)
    x = (observed - low) / (high - low)
    # Log of the Beta normalizing constant
    log_norm = at.gammaln(s_alpha) + at.gammaln(s_beta) - at.gammaln(s_alpha + s_beta)
    # Beta log-pdf of the rescaled data, minus the log of the scale factor
    return ((s_alpha - 1) * at.log(x) + (s_beta - 1) * at.log(1 - x)
            - log_norm - at.log(high - low))

with pm.Model() as m:
    # Define Flat or more informative priors for the Pert parameters
    low = pm.Flat("low")
    peak = pm.Flat("peak")
    high = pm.Flat("high")
    llike = pm.DensityDist("llike", low, peak, high, logp=pert_logp_fn, observed=data)
    # Find the most likely posterior parameters (MAP estimate)
    map_estimate = pm.find_MAP()
    # Draw many samples from the posterior
    posterior = pm.sample().posterior
```

I hope this is clear; please don't hesitate to come back to me if it's not the case.