Question

In the book Probabilistic Graphical Models: Principles and Techniques, Daphne Koller and Nir Friedman introduce the noisy-OR canonical model for CPDs (in the independence-of-causal-influence family of aggregators) and go on to say that this can be extended to noisy-MAX. However, the details of how to do this are omitted.

The notes for the OpenMarkov project (http://www.openmarkov.org/) also mention a noisy-MAX model, but again the details of how it is implemented are absent.

The noisy-OR model takes a number of binary variables $X_i$, for $i = 1, 2, \dots, n$. It transforms them into an intermediate set of binary random variables $Z_i$ such that $Z_i = 1$ with probability $\lambda_i$ when $X_i = 1$, and $Z_i = 0$ otherwise. It also adds a binary variable $Z_0$, called the leak, which is 1 with probability $\lambda_0$ (and 0 otherwise). The resulting (binary) output is then the OR over all the $Z_j$, $j = 0, 1, 2, \dots, n$.
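For concreteness, here is a minimal Python sketch of that generative description reduced to its closed form, $P(Y = 1 \mid x) = 1 - (1 - \lambda_0)\prod_{i : x_i = 1}(1 - \lambda_i)$ (the output is 0 only when every $Z_j$ is 0); the function name and interface are just illustrative.

```python
def noisy_or_prob(x, lambdas, leak):
    """P(Y = 1 | X = x) for the noisy-OR model described above.

    x       : sequence of 0/1 parent values X_1..X_n
    lambdas : lambda_1..lambda_n, where lambda_i = P(Z_i = 1 | X_i = 1)
    leak    : lambda_0 = P(Z_0 = 1)

    Y = 0 only if every Z_j = 0, which happens with probability
    (1 - leak) * product over active parents of (1 - lambda_i).
    """
    p_all_zero = 1.0 - leak
    for xi, lam in zip(x, lambdas):
        if xi == 1:
            p_all_zero *= (1.0 - lam)
    return 1.0 - p_all_zero


# Example: two causes, first one active, leak 0.05
print(noisy_or_prob([1, 0], lambdas=[0.8, 0.6], leak=0.05))  # 1 - 0.95 * 0.2 = 0.81
```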

It is unclear how one should extend this to MAX, since directly replacing OR with MAX yields exactly the same result as OR: the OR of a set of 1s and 0s is equal to the MAX of that set.
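For what it's worth, a quick brute-force check of that claim (illustrative only):

```python
from itertools import product

# For binary Z_j, OR and MAX always agree, so a naive substitution changes nothing.
for z in product([0, 1], repeat=3):
    assert max(z) == int(any(z))
print("OR and MAX coincide on all binary tuples")
```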

What is the methodology/algorithm/formula used for calculating the noisy-MAX?

Answers should also state whether the inputs and output can be generalised to non-binary (or even numerical/continuous) types.

