Hello. This question is strange and perhaps silly, but it would be very useful for my research. Is there any method to find the likelihood given a prior distribution and its corresponding posterior distribution (both multinomial)?
////////
I am updating the question to be more informative:
I am working on a topic related to multiple-choice response. I would like to measure the efficiency of an information source (or of a student's information search), and I believe Bayesian statistics is the right approach; however, since I only know the basics, I would like to present my reasoning here to see whether it is coherent.
Given a multiple-choice question (e.g., with 4 answers) where only one answer is correct, a student can answer it solely based on what he/she studied at home, assigning a probability to each answer, for example P(R) = (0.4, 0.3, 0.15, 0.15). The response might end there, or the student might consult a book and update those probabilities, for example P(R|B) = (0.4, 0.6, 0, 0). P(R) would be the prior, P(R|B) the posterior (both multinomial), and P(B|R) the likelihood, which I understand to be what updates my prior belief P(R) into P(R|B).
So I want to find P(B|R), and I would also like to determine what P(B|R) would make P(R|B) = (1, 0, 0, 0), that is, make the correct answer chosen with certainty.
I understand that this is a kind of inverse problem that will not have a unique solution, but perhaps there is some heuristic to approximate a solution, at least for the first challenge of finding P(B|R).
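For what it's worth, since both distributions are discrete over the same four answers, Bayes' rule can be inverted directly: posterior ∝ likelihood × prior, so the likelihood is proportional to the elementwise ratio posterior/prior and is only identifiable up to a multiplicative constant (which is exactly the non-uniqueness mentioned above). A minimal NumPy sketch of this idea, using the numbers from the question:

```python
import numpy as np

# Prior beliefs over the 4 answers, and the posterior after consulting
# the book (numbers taken from the question above).
prior = np.array([0.4, 0.3, 0.15, 0.15])
posterior = np.array([0.4, 0.6, 0.0, 0.0])

# Bayes' rule: posterior_i ∝ likelihood_i * prior_i, so the likelihood
# is recoverable only up to a multiplicative constant:
likelihood = posterior / prior          # -> [1.0, 2.0, 0.0, 0.0]

# One convenient normalization: scale the entries to sum to 1.
likelihood /= likelihood.sum()          # -> [1/3, 2/3, 0.0, 0.0]
print(likelihood)

# Sanity check: combining this likelihood with the prior and
# renormalizing reproduces the stated posterior.
recovered = likelihood * prior
recovered /= recovered.sum()
print(recovered)                        # -> [0.4, 0.6, 0.0, 0.0]
```

This also makes the second question immediate: a posterior of (1, 0, 0, 0) requires a likelihood of the form (c, 0, 0, 0), i.e., the book must assign exactly zero likelihood to every incorrect answer, since no finite ratio can carry a nonzero prior entry to an exactly-zero posterior entry.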
Ideally, I would like to use PyMC3 for this.
I don’t know of any method other than researching the literature/textbooks.
This is what I see in SAS proc MCMC:
" The PARMS statement lists the names of the parameters in the model and specifies optional initial values for these parameters. These parameters are referred to as the model parameters. You can specify multiple PARMS statements. Each PARMS statement defines a block of parameters, and the blocked Metropolis algorithm updates the parameters in each block simultaneously."
So it looks like in SAS it matters how you group the parameters. I was wondering if it works the same way in PyMC. It seems like a function of how it was programmed.
Perhaps it would be useful for you to say more about what you are trying to do.
What a surprising answer, thank you very much. Could you please tell me what text to look for?
Yes! Thank you very much for your response.
It is my fault for not being more explicit. I have updated the question above with the details of the multiple-choice setup, the prior P(R), the posterior P(R|B), and the likelihood P(B|R) that I am trying to recover.
Thank you for your help!
This book may have some useful case studies about multiple-choice scenarios: pymc-resources/BCM at main · pymc-devs/pymc-resources · GitHub
Thank you!