Hello pymc community,
I have learned how to use PyMC mostly through linear regression examples. However, I am now dealing with a new problem, which essentially amounts to finding the posterior distribution of 3 unknown parameters.
Here I know `x1`, `x2`, `x0`, and I am hoping to find the distributions of the `C` and `F` parameters for any value of `dp` (`C`, `F`, and `dp` are unknown).
The formula that relates all these parameters is:
`dp = x1 - x2 - (x0 / F) + (C / F)`
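For example, this is how `dp` follows from the other quantities (a minimal sketch; the "true" values `C_true = 30` and `F_true = 0.6` are made up purely to illustrate the relationship):

```python
import numpy as np

# Hypothetical "true" parameter values, chosen only for illustration
C_true, F_true = 30.0, 0.6

x0 = np.random.uniform(low=5, high=61, size=30)
x1 = np.random.uniform(low=25, high=145, size=30)
x2 = np.random.uniform(low=20, high=65, size=30)

# dp follows deterministically from the formula above
dp = x1 - x2 - x0 / F_true + C_true / F_true
```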
In terms of Bayes' theorem: what is the probability of seeing any combination of `C`, `F`, and `dp`, given that we observed a combination of `x0`, `x1`, `x2`:
`P(C, F, dp | x0, x1, x2) = [ P(x0, x1, x2 | C, F, dp) × P(C, F, dp) ] / P(x0, x1, x2)`
Since I learned PyMC through linear regression examples, I tried what might be a really silly trick: I rearranged the equation to `y = x1 - x2 - (x0 / F) + (C / F) - dp` and then defined a vector of observed `y` values, all equal to 0, with a small standard deviation (e.g., 0.001). The other way around would be to fix `dp` (e.g., `dp = np.zeros(len(P_i)) + 1`) and use it in place of the observed `y` value. But that is tedious: I would have to write a for-loop over fixed `dp` values, which would take days to run while only exploring a few `dp` values.
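Concretely, the trick looks something like this minimal sketch (a stripped-down version of the full code further below, ignoring the measurement noise on `x0`, `x1`, `x2`):

```python
import numpy as np
import pymc as pm

x0 = np.random.uniform(5, 61, 30)
x1 = np.random.uniform(25, 145, 30)
x2 = np.random.uniform(20, 65, 30)

with pm.Model():
    C = pm.Uniform("C", 0, 70)
    dp = pm.Uniform("dp", 0.001, 35)
    F = 0.6

    # residual of the rearranged equation; if the equation holds,
    # it should be ~0, so "observe" zeros with a tiny noise scale
    resid = x1 - x2 - x0 / F + C / F - dp
    pm.Normal("y", mu=resid, sigma=0.001, observed=np.zeros_like(x0))
```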
My questions to the community are:
- Is this actually a problem that can be solved with Bayes' theorem? My thought is: I hope so, because the uncertainties in `x0`, `x1`, `x2` should be taken into account, and Bayes seems to be the way to go when dealing with uncertainties and credible intervals.
- Is there something wrong with my approach of formulating the problem as `P(C, F, dp | x0, x1, x2) = [ P(x0, x1, x2 | C, F, dp) × P(C, F, dp) ] / P(x0, x1, x2)`? I wrote the code below to try to find the most probable combination of `C`, `F`, `dp` for each observed `x0`, `x1`, `x2`.
- Is there a similar example (notebook) that I could use to modify my current code and find a solution to this problem?
- Do you have any suggestions on how to modify the code so that it reaches the goal of finding the combination of `C`, `F`, `dp` for each observed combination of `x0`, `x1`, `x2`?
Many thanks in advance!
#### Toy data
```python
import numpy as np
import pymc as pm

x0 = np.random.uniform(low=5, high=61, size=(30,))
x1 = np.random.uniform(low=25, high=145, size=(30,))
x2 = np.random.uniform(low=20, high=65, size=(30,))
y = np.zeros_like(x2)  # "observed" residuals, all zero

with pm.Model() as model_min_press_const_fric:
    ### measurement-error models for the observed inputs
    T_final = pm.MutableData("T_", x0)
    avg_x_noise_0 = pm.HalfNormal("avg_x_noise_0")
    x_noisy_0 = pm.Normal("x_noisy_0", mu=T_final, sigma=avg_x_noise_0, shape=T_final.shape)

    S_final = pm.MutableData("S", x1)
    avg_x_noise_1 = pm.HalfNormal("avg_x_noise_1")
    x_noisy_1 = pm.Normal("x_noisy_1", mu=S_final, sigma=avg_x_noise_1, shape=S_final.shape)

    P_i_final = pm.MutableData("P_i", x2)
    avg_x_noise_2 = pm.HalfNormal("avg_x_noise_2")
    x_noisy_2 = pm.Normal("x_noisy_2", mu=P_i_final, sigma=avg_x_noise_2, shape=P_i_final.shape)

    ##### priors on the unknowns
    C = pm.Uniform("C", 0, 70)
    F = 0.6  # pm.TruncatedNormal("F", mu=0.6, sigma=0.2, lower=0)
    dp = pm.Uniform("dp", 0.001, 35)

    #### likelihood on y
    y_p = pm.MutableData("y_p", y)
    pred_ = pm.Deterministic("pred_", x_noisy_1 - x_noisy_2 - x_noisy_0 / F + C / F - dp)
    obs_ = pm.Normal("y", mu=pred_, sigma=0.001, observed=y_p)

    trace_min_press_increase_test_2 = pm.sample(
        2000, tune=1500, chains=8, cores=8,
        init="jitter+adapt_diag", random_seed=7, target_accept=0.99,
    )
```
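For completeness, this is how I then look at the resulting posteriors (assuming ArviZ is installed; `trace_min_press_increase_test_2` is the InferenceData returned by `pm.sample` above):

```python
import arviz as az

# Posterior summaries and diagnostics for the unknowns
print(az.summary(trace_min_press_increase_test_2, var_names=["C", "dp"]))
az.plot_posterior(trace_min_press_increase_test_2, var_names=["C", "dp"])
```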