

Background information

As we think about drawing conclusions about the physical world, we come to realize that everything we do rests on what we already know about it. Conclusions about hypotheses are therefore based on our general background knowledge. To emphasize the dependence of probability on the state of background information, which we designate by $I$, we make it explicit by writing $P(E\,\vert\,I)$ rather than simply $P(E)$. (Note that, in general, $P(A \,\vert\, I_1) \neq P(A \,\vert\, I_2)$ if $I_1$ and $I_2$ are different states of information.) For example, Eq. (4) should be more precisely written as

\begin{displaymath}
P(A\cap B\,\vert\,I) = P(A\,\vert\,B\cap I) \,
P(B\,\vert\,I) = P(B\,\vert\,A\cap I) \, P(A\,\vert\,I)\,, \qquad (16)
\end{displaymath}

or alternatively as
\begin{displaymath}
P(A, B\,\vert\,I) = P(A\,\vert\,B, I) \, P(B\,\vert\,I) =
P(B\,\vert\,A, I) \, P(A\,\vert\,I) \,. \qquad (17)
\end{displaymath}

We have explicitly included $I$ in the conditional as a reminder that any probability relation is valid only within a given state of background information.
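
To make the role of $I$ concrete, the following sketch (not part of the original text) checks Eq. (17) numerically for two hypothetical states of information, $I_1$ and $I_2$, about the composition of a box from which two balls are drawn without replacement. The box compositions, the event labels and the helper function probabilities are illustrative assumptions, chosen only to show that the same event receives different probabilities under different $I$ while both factorizations of the product rule agree.

\begin{verbatim}
# Minimal numerical sketch (illustrative assumptions, not from the text):
# two states of information, I1 and I2, about a box of 5 balls from which
# two balls are drawn without replacement.
#   A = "first ball drawn is white",  B = "second ball drawn is white".
from fractions import Fraction

def probabilities(n_white, n_total):
    """P(A|I), P(B|I), P(A|B,I), P(B|A,I), P(A,B|I) for a given composition."""
    p_A = Fraction(n_white, n_total)                  # P(A|I)
    p_B_given_A = Fraction(n_white - 1, n_total - 1)  # P(B|A,I)
    p_AB = p_B_given_A * p_A                          # product rule, Eq. (17)
    p_B = p_A             # draws are exchangeable, so P(B|I) = P(A|I)
    p_A_given_B = p_AB / p_B
    return p_A, p_B, p_A_given_B, p_B_given_A, p_AB

# I1: box assumed to hold 3 white balls; I2: box assumed to hold 1 white ball.
for label, n_white in [("I1", 3), ("I2", 1)]:
    p_A, p_B, p_A_given_B, p_B_given_A, p_AB = probabilities(n_white, 5)
    # Both factorizations of Eq. (17) give the same joint probability.
    assert p_AB == p_A_given_B * p_B == p_B_given_A * p_A
    print(label, ": P(A|I) =", p_A, " P(A,B|I) =", p_AB)
# P(A|I1) = 3/5 while P(A|I2) = 1/5: the same event has different
# probabilities under different states of information.
\end{verbatim}

Exact fractions are used so that the two factorizations of Eq. (17) can be compared by equality rather than within a numerical tolerance.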


