Hello,
I've been trying to implement a learning model in PyMC3, and I have the following problem.
I have a prior distribution of three inputs:
x = pm.Normal('x', mu=0.06, sd=0.1)
y = pm.Normal('y', mu=1, sd=0.05)
z = pm.Normal('z', mu=50, sd=5)
and I have a surrogate model f(x, y, z) of these three inputs, along with observed values of f(x, y, z):
f(x, y, z) = 3x^2 + 6xy + 5y^2 + 2yz + 5z^2 + 21
How do I apply Bayesian inference to find the joint posterior of these variables?
Thank you!
I suggest you assume your observations are generated as
Obs = f(x,y,z) + \varepsilon
where \varepsilon is Gaussian with zero mean and standard deviation \sigma (as is done in ordinary least squares). This leaves you with the following:
import pymc3 as pm

with pm.Model() as model:
    x = pm.Normal('x', mu=0.06, sigma=0.1)
    y = pm.Normal('y', mu=1, sigma=0.05)
    z = pm.Normal('z', mu=50, sigma=5)
    # Deterministic surrogate model from your post
    f = (3 * x**2 + 6 * x * y + 5 * y**2 +
         2 * y * z + 5 * z**2 + 21)
    # Noise scale of the observation model
    sigma = pm.HalfNormal('sigma', 1)
    # `data` is your array of observed f(x, y, z) values
    obs = pm.Normal('obs', mu=f, sigma=sigma, observed=data)
    # Now you draw samples from the posterior
    trace = pm.sample()

pm.plot_posterior(trace)
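If you don't have real observations yet, you can sanity-check the model by simulating `data` from hypothetical "true" parameter values. A minimal sketch (the true values, noise level, and sample size here are my assumptions, not part of your problem):

```python
import numpy as np

rng = np.random.default_rng(42)

def f(x, y, z):
    # The surrogate model from the question
    return 3 * x**2 + 6 * x * y + 5 * y**2 + 2 * y * z + 5 * z**2 + 21

# Hypothetical "true" values, chosen near the prior means for illustration
x_true, y_true, z_true = 0.06, 1.0, 50.0

# Simulate Obs = f(x, y, z) + eps with eps ~ Normal(0, 1)
data = f(x_true, y_true, z_true) + rng.normal(0.0, 1.0, size=100)
```

Passing this `data` to `observed=` above lets you check whether the posterior concentrates around the values you simulated from. Note that with a single scalar output, the three inputs will generally not be individually identifiable, so expect strong posterior correlations between them.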