Combining joint probabilities
20-08-2019
Question
I am trying to work out the expression for a probability distribution (related to bioinformatics), and am having trouble combining the information about a random variable from two different sources. Essentially, here is the scenario: There are 3 discrete random variables X, A & B. X depends on A and B. A and B are related only through X, i.e. A and B are independent given X. Now, I have derived the expressions for: P(X, A) and P(X, B). I need to calculate P(X, A, B) - this is not a straightforward application of the chain rule.
I can derive P(X | A) from the first expression since P(A) is available. However, since B is never observed independently of A, P(B) is not readily available: at best I can approximate it by marginalizing over A, but the expression P(A, B) does not have a closed form, so the summation is tricky.
Any thoughts on how P(X, A, B) can be derived, without discarding information? Many thanks in advance.
Amit
Solution
What you're dealing with here is a tree-structured graphical model: an undirected chain A - X - B, in which A is conditionally independent of B given X, while X depends (I assume directly) on both A and B. I'm a little unclear about what form your probability distributions are specified in, but you could look at belief propagation, which computes exact marginals on tree-structured models like this one.
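To make the belief-propagation suggestion concrete, here is a minimal sum-product sketch on the chain A - X - B. All table sizes and numbers are illustrative (not from the question); the joint is assumed to factorize over the tree as f1(A, X) * f2(X, B), with f1 = P(A, X) and f2 = P(B | X).

```python
# Minimal sum-product (belief propagation) sketch on the chain A - X - B.
# Factor tables are illustrative; the model is built so that A and B are
# conditionally independent given X, matching the question's assumption.
import numpy as np

nA, nX, nB = 3, 2, 2
rng = np.random.default_rng(2)

# Ground-truth model used to build the factors.
pX = np.array([0.25, 0.75])
pA_given_X = rng.dirichlet(np.ones(nA), size=nX)   # P(A|X), one row per x
pB_given_X = rng.dirichlet(np.ones(nB), size=nX)   # P(B|X), one row per x

f1 = (pX[:, None] * pA_given_X).T   # f1[a, x] = P(A=a, X=x)
f2 = pB_given_X                     # f2[x, b] = P(B=b | X=x)

# Messages into X from each factor (sum out the leaf variable):
m1 = f1.sum(axis=0)                 # message f1 -> X, equals P(X)
m2 = f2.sum(axis=1)                 # message f2 -> X, all ones here
belief_X = m1 * m2
belief_X /= belief_X.sum()          # normalized belief = marginal P(X)

print(np.allclose(belief_X, pX))    # True
```

On a tree, these messages recover the exact marginals; here the belief at X matches the P(X) the model was built from.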
OTHER TIPS
Ok, it has been a long time since I've done joint probabilities, so take this with a big grain of salt, but the first place I would start looking, given that A and B are conditionally independent given X, is an expression like:
P(X, A, B) = P(X, A) * P(X, B) / P(X)
Again, this is just to give you an idea to explore as it has been a very long time since I did this type of work!
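Under the question's conditional-independence assumption (A and B independent given X), the joint can be assembled as P(X, A, B) = P(X, A) * P(X, B) / P(X), with P(X) obtained by marginalizing either pairwise joint. A numeric sketch with made-up example tables:

```python
# Hypothetical numeric sketch: combine the two given joints P(X,A) and P(X,B)
# into P(X,A,B), relying on A being independent of B given X.
# All numbers below are illustrative, not from the question.
import numpy as np

nX, nA, nB = 2, 3, 2
rng = np.random.default_rng(0)

# Build a ground-truth joint that satisfies the assumption, so we can check.
pX = np.array([0.4, 0.6])
pA_given_X = rng.dirichlet(np.ones(nA), size=nX)   # P(A|X)
pB_given_X = rng.dirichlet(np.ones(nB), size=nX)   # P(B|X)
joint = pX[:, None, None] * pA_given_X[:, :, None] * pB_given_X[:, None, :]

# What the question actually has available:
pXA = joint.sum(axis=2)          # P(X, A)
pXB = joint.sum(axis=1)          # P(X, B)
pX_marg = pXA.sum(axis=1)        # P(X), by marginalizing A out of P(X, A)

# Proposed combination: P(X,A,B) = P(X,A) * P(X,B) / P(X)
reconstructed = pXA[:, :, None] * pXB[:, None, :] / pX_marg[:, None, None]

print(np.allclose(reconstructed, joint))  # → True
```

Note that no separate P(B) is needed: P(X) is enough, and it falls out of P(X, A) directly.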
Your question is very unclear in terms of what you observe and what the unknowns are. It seems like the only fact that you state clearly is that A and B are independent given X. That is,
Assumption: P(A,B|X)=P(A|X)P(B|X)
Hence: P(A,B,X) = P(A,B|X)P(X) = P(A|X)P(B|X)P(X) = [P(A,X)/P(X)][P(B,X)/P(X)]P(X) = P(A,X)P(B,X)/P(X)
So the joint can be assembled from the two pairwise joints you already have, divided by P(X), which you can get by marginalizing either one.
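A quick sketch of why the assumption matters: the identity P(A,B,X) = P(A,X)P(B,X)/P(X) is exact when A and B are independent given X, and generally fails otherwise. The tables below are illustrative.

```python
# Sketch: the pairwise-joint reconstruction is exact under conditional
# independence and generally wrong without it. Example numbers only.
import numpy as np

rng = np.random.default_rng(1)

def reconstruct(joint):
    """Rebuild P(X,A,B) from its pairwise marginals via the factorization."""
    pXA = joint.sum(axis=2)
    pXB = joint.sum(axis=1)
    pX = joint.sum(axis=(1, 2))
    return pXA[:, :, None] * pXB[:, None, :] / pX[:, None, None]

# Case 1: joint built so that A is independent of B given X -> exact.
pX = np.array([0.3, 0.7])
pA = rng.dirichlet(np.ones(3), size=2)   # P(A|X), one row per x
pB = rng.dirichlet(np.ones(2), size=2)   # P(B|X), one row per x
ci_joint = pX[:, None, None] * pA[:, :, None] * pB[:, None, :]
print(np.allclose(reconstruct(ci_joint), ci_joint))   # True

# Case 2: arbitrary joint (no conditional independence) -> differs.
arb_joint = rng.dirichlet(np.ones(12)).reshape(2, 3, 2)
print(np.allclose(reconstruct(arb_joint), arb_joint))
```

So whichever factorization you read the identity from, the conditional-independence assumption is what licenses it.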