Next: Joint inference and marginalization Up: Uncertainties from systematic effects Previous: Uncertainties from systematic effects


Reweighting of conditional inferences

The values of the influence variables and their uncertainties contribute to our background knowledge $I$ about the experimental measurements. Using $I_0$ to represent our very general background knowledge, the posterior pdf is then $p(\mu \,|\, \mathbf{d}, \mathbf{h}, I_0)$, where the dependence on all possible values of $\mathbf{h}$ has been made explicit. The inference that takes into account the uncertain vector $\mathbf{h}$ is obtained from the rules of probability (see Tab. 1) by integrating the joint probability over the uninteresting influence variables:

$\displaystyle p(\mu \,|\, \mathbf{d}, I_0) = \int p(\mu, \mathbf{h} \,|\, \mathbf{d}, I_0) \,\mathrm{d}\mathbf{h}$   (72)

$\displaystyle \hphantom{p(\mu \,|\, \mathbf{d}, I_0)} = \int p(\mu \,|\, \mathbf{d}, \mathbf{h}, I_0) \, p(\mathbf{h} \,|\, I_0) \,\mathrm{d}\mathbf{h} \,.$   (73)
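When the integral in Eq. (73) has no closed form, it can be approximated by Monte Carlo: draw samples of $\mathbf{h}$ from $p(\mathbf{h} \,|\, I_0)$ and average the conditional posteriors. A minimal sketch in Python, using an illustrative one-dimensional Gaussian model (the values $d = 5$, $\sigma = 1$, $\sigma_h = 0.5$ are assumptions, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model (assumed, not from the text): Gaussian conditional
# posterior p(mu | d, h, I0) for an additive influence variable h.
d, sigma = 5.0, 1.0

def conditional_posterior(mu, h):
    return np.exp(-(mu - (d - h))**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

# Eq. (73) by Monte Carlo: average the conditional posterior over
# samples of h drawn from p(h | I0), here a Gaussian with sigma_h = 0.5.
h_samples = rng.normal(0.0, 0.5, size=50_000)
mu_grid = np.linspace(0.0, 10.0, 201)
marginal = conditional_posterior(mu_grid[:, None], h_samples[None, :]).mean(axis=1)

# The marginal is still a normalized pdf in mu, broadened by the
# uncertainty about h.
print(np.trapz(marginal, mu_grid))  # close to 1.0
```

The same averaging works for any conditional posterior, not only Gaussian ones, which is what makes the marginalization recipe of Eq. (73) so general.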

As a simple but important case, let us consider a single influence variable: an additive instrumental offset $z$, expected to be zero because the instrument has been calibrated as well as feasible, with a residual calibration uncertainty $\sigma_z$. Modelling our uncertainty about $z$ as a Gaussian distribution with standard deviation $\sigma_z$, the posterior for $\mu$ is
$\displaystyle p(\mu \,|\, d, I_0) = \int_{-\infty}^{+\infty} p(\mu \,|\, d, z, \sigma, I_0) \, p(z \,|\, \sigma_z, I_0) \,\mathrm{d}z$   (74)

$\displaystyle \hphantom{p(\mu \,|\, d, I_0)} = \int_{-\infty}^{+\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left[-\frac{(\mu-(d-z))^2}{2\,\sigma^2}\right] \times \frac{1}{\sqrt{2\pi}\,\sigma_z}\exp\!\left[-\frac{z^2}{2\,\sigma_z^2}\right] \mathrm{d}z$   (75)

$\displaystyle \hphantom{p(\mu \,|\, d, I_0)} = \frac{1}{\sqrt{2\pi}\,\sqrt{\sigma^2+\sigma_z^2}}\exp\!\left[-\frac{(\mu-d)^2}{2\,(\sigma^2+\sigma_z^2)}\right] \,.$   (76)

The result is that the posterior is again a Gaussian, centred at $d$, whose net variance is the sum of the variance in the measurement and the variance in the influence variable: $\sigma^2 + \sigma_z^2$.
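The closed form of Eq. (76) can be checked by performing the convolution in Eq. (75) numerically. A small sketch (the values of $d$, $\sigma$ and $\sigma_z$ are illustrative assumptions):

```python
import numpy as np

# Illustrative values (assumptions, not from the text).
d, sigma, sigma_z = 3.0, 0.4, 0.3

def gauss(x, mean, std):
    return np.exp(-(x - mean)**2 / (2 * std**2)) / (np.sqrt(2 * np.pi) * std)

# Eq. (75): integrate the conditional posterior times the Gaussian
# prior on the offset z, on a grid covering the prior's support.
z = np.linspace(-6 * sigma_z, 6 * sigma_z, 4001)

def marginal(mu):
    return np.trapz(gauss(mu, d - z, sigma) * gauss(z, 0.0, sigma_z), z)

# Eq. (76): Gaussian centred at d with variance sigma^2 + sigma_z^2.
s_tot = np.hypot(sigma, sigma_z)
for mu in (2.0, 2.5, 3.0, 3.7):
    assert abs(marginal(mu) - gauss(mu, d, s_tot)) < 1e-6
print("quadrature agrees with the closed form of Eq. (76)")
```

The agreement illustrates the well-known fact that the convolution of two Gaussians is a Gaussian whose variance is the sum of the individual variances.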


Giulio D'Agostini 2003-05-13