Next: Unfolding an experimental distribution Up: Bayesian unfolding Previous: Problem and typical solutions   Contents

Bayes' theorem stated in terms of causes and effects

Let us state Bayes' theorem in terms of several independent causes ($C_i$, $i=1,2,\ldots,n_C$) which can produce one effect ($E$). For example, if we consider deep-inelastic scattering events, the effect $E$ can be the observation of an event in a cell of the measured quantities $\{\Delta Q^2_{\mathrm{meas}}, \Delta x_{\mathrm{meas}}\}$. The causes $C_i$ are then all the possible cells of the true values $\{\Delta Q^2_{\mathrm{true}}, \Delta x_{\mathrm{true}}\}_i$. Let us assume we know the initial probability of the causes, $P(C_i)$, and the conditional probability that the $i$-th cause will produce the effect, $P(E\,\vert\,C_i)$. Bayes' formula is then

$\displaystyle P(C_i\,\vert\,E) = \frac{P(E\,\vert\,C_i)\, P(C_i)} {\sum_{l=1}^{n_C} P(E\,\vert\,C_l)\, P(C_l)}\, .$ (7.1)

$P(C_i\,\vert\,E)$ depends on the initial probability of the causes. If one has no better prejudice concerning $P(C_i)$, the inference can be started from a uniform distribution.
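As a minimal numerical sketch of Eq. (7.1), consider three hypothetical true-value cells (causes) with made-up conditional probabilities $P(E\,\vert\,C_i)$ and a uniform prior over the causes:

```python
import numpy as np

# Hypothetical illustration of Eq. (7.1): three true-value cells (causes)
# and invented probabilities that each cause produces the observed effect E.
p_E_given_C = np.array([0.10, 0.60, 0.30])  # P(E|C_i)

# With no better prejudice concerning P(C_i), start from a uniform prior.
p_C = np.full(3, 1.0 / 3.0)                 # P(C_i)

# Bayes' theorem: posterior proportional to likelihood times prior,
# normalised by the sum over all causes (the denominator of Eq. 7.1).
posterior = p_E_given_C * p_C
posterior /= posterior.sum()                # P(C_i|E)

print(posterior)  # with a uniform prior: [0.1 0.6 0.3]
```

With a uniform prior the posterior is simply the likelihood rescaled to unit sum, which is why the inference can safely be started from it when nothing better is known.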

The final distribution also depends on $P(E\,\vert\,C_i)$. These probabilities must be calculated analytically or estimated with Monte Carlo methods. One has to keep in mind that, in contrast to $P(C_i)$, these probabilities are not updated by the observations. So if there are ambiguities concerning the choice of $P(E\,\vert\,C_i)$, one has to try them all in order to evaluate their systematic effects on the results.
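The Monte Carlo estimation of $P(E\,\vert\,C_i)$ can be sketched as follows. Everything here (number of cells, Gaussian smearing, its width) is an invented toy model, not the detector simulation of any particular experiment: events are generated in known true cells and one counts how often each ends up in a given measured cell.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Monte Carlo: generate events with a known true cell (the cause C_i),
# smear them with a hypothetical Gaussian resolution, and record the
# measured cell (the effect E) in which each event is reconstructed.
n_cells = 5
n_mc = 100_000
true_cells = rng.integers(0, n_cells, size=n_mc)
smearing = rng.normal(0.0, 0.7, size=n_mc)
meas_cells = np.clip(np.rint(true_cells + smearing), 0, n_cells - 1).astype(int)

# Estimate P(E|C_i) as the fraction of events generated in cause cell i
# that are measured in effect cell E (rows: causes, columns: effects).
counts = np.zeros((n_cells, n_cells))
np.add.at(counts, (true_cells, meas_cells), 1)
p_E_given_C = counts / counts.sum(axis=1, keepdims=True)

# Each row sums to one: every generated event lands in some measured cell.
print(p_E_given_C.sum(axis=1))
```

Since these probabilities are not updated by the data, the systematic check mentioned above amounts to repeating the unfolding with each plausible version of this matrix (e.g. different resolution models) and comparing the results.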


Giulio D'Agostini 2003-05-15