
Inferring $n_s$ and $\lambda _s$

The histograms of Fig. 11 show examples of the probability distributions of $n_s$ for $\lambda _b=4$ and three different hypotheses for $p_b$.
Figure 11: Inference about $n_s$ (histograms) and $p_s$ (continuous lines) for $n=12$ and $x=9$, assuming $\lambda_b=4$ and three values of $p_b$: 0.75, 0.25 and 0.95 (top down).
These distributions quantify how much we believe that $n_s$ out of the $n$ observed objects belong to the signal. [By the way, the number $n_b$ of background objects present in the data can be inferred as the complement of $n_s$, since the two numbers are linearly dependent ($n_b = n - n_s$). It follows that $f(n_b\,\vert\,x,\,n,\,\lambda_b,\,p_b) =
f(n_s\!=\!n-n_b\,\vert\,x,\,n,\,\lambda_b,\,p_b)$.]
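As a minimal numerical sketch of this complementarity (in Python; the pmf values below are purely illustrative, not the numbers behind Fig. 11):

import numpy as np

n = 4
# Hypothetical pmf f(n_s | x, n, lambda_b, p_b) for n_s = 0..n (illustrative values only)
p_ns = np.array([0.05, 0.15, 0.30, 0.35, 0.15])

# Since n_b = n - n_s, the pmf of n_b is the same array read in reverse order:
p_nb = p_ns[::-1]

for n_b, p in enumerate(p_nb):
    print(f"P(n_b = {n_b}) = {p:.2f}")   # e.g. P(n_b = 0) = P(n_s = 4) = 0.15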

A different question is to infer the Poisson parameter $\lambda _s$ of the signal. Using once more Bayes' theorem we get, under the hypothesis of $n_s$ signal objects:

$f(\lambda_s\,\vert\,n_s) \propto f(n_s\,\vert\,{\cal P}_{\lambda_s})\,f_0(\lambda_s)$ (45)

Assuming a uniform prior for $\lambda _s$, the normalization integral is $\int_0^\infty e^{-\lambda_s}\,\lambda_s^{n_s}\,\mbox{d}\lambda_s = n_s!$, and we get (see e.g. Ref. [2]):

$f(\lambda_s\,\vert\,n_s) = \frac{e^{-\lambda_s}\,\lambda_s^{n_s}}{n_s!}$ (46)

with expected value and variance both equal to $n_s+1$ and mode equal to $n_s$ (the expected value lies to the right of the mode because the distribution is skewed to the right). Figure 12 shows these pdf's for $n_s$ ranging from 0 to 12, assuming a uniform prior for $\lambda _s$.
Figure 12: Inference of $\lambda _s$ for $n_s$ ranging from 0 to 12 (left to right curves).
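The quoted summaries can be cross-checked numerically: Eq. (46) is a Gamma pdf with shape $n_s+1$ and unit scale, so a few lines of Python (a sketch using scipy.stats) suffice:

from scipy.stats import gamma

for n_s in (0, 3, 9):
    post = gamma(a=n_s + 1)             # Eq. (46): Gamma(shape = n_s + 1, scale = 1)
    mean, var = post.stats(moments="mv")
    # mean = variance = n_s + 1, while the mode sits at n_s
    print(f"n_s = {n_s}: mean = {float(mean):.0f}, variance = {float(var):.0f}, mode = {n_s}")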

As for the pdf of $\lambda _s$ that takes into account all possible values of $n_s$, each weighted by its probability, probability theory gives [remembering that $f(\lambda_s\,\vert\,n_s,\,x,\,n,\,\lambda_b,\,p_b)$ is indeed equal to $f(\lambda_s\,\vert\,n_s)$, because $n_s$ depends directly only on $\lambda _s$, and vice versa]:

$f(\lambda_s\,\vert\,x,\,n,\,\lambda_b,\,p_b) \propto \sum_{n_s} f(\lambda_s\,\vert\,n_s)\,f(n_s\,\vert\,x,\,n,\,\lambda_b,\,p_b)$ (47)

i.e. the pdf of $\lambda _s$ is the weighted average of the several $n_s$-dependent pdf's.
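In code, Eq. (47) is just a finite mixture of the Gamma posteriors of Eq. (46). Here is a sketch (the weights below are a uniform placeholder; in the actual analysis they are the probabilities $f(n_s\,\vert\,x,\,n,\,\lambda_b,\,p_b)$ shown in Fig. 11):

import numpy as np
from scipy.stats import gamma

n = 12
# Placeholder weights f(n_s | x, n, lambda_b, p_b), n_s = 0..n (uniform, for illustration)
w = np.full(n + 1, 1.0 / (n + 1))

lam = np.linspace(0.0, 30.0, 601)       # grid of lambda_s values

# Eq. (47): weighted sum of the n_s-dependent pdf's f(lambda_s | n_s)
pdf = sum(w_ns * gamma.pdf(lam, a=n_s + 1) for n_s, w_ns in enumerate(w))

# Normalization check (each component pdf and the weights are normalized)
print(pdf.sum() * (lam[1] - lam[0]))    # ~ 1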

The results for the example we are considering in this section are given in the plots of Fig. 11.


Giulio D'Agostini 2004-12-13