

Continuous variables: probability and density function

Moving from discrete to continuous variables, we meet the usual problems connected with infinite possibilities, similar to those found in Zeno's ``Achilles and the tortoise'' paradox. In both cases the answer is given by infinitesimal calculus, though a few comments are in order.

After this short introduction, here is a list of definitions, properties and notations:

Cumulative distribution function:

$\displaystyle F(x) = P(X \leq x) = \int_{-\infty}^{x} f(x^\prime) \, \mathrm{d} x^\prime \, ,$ (4.26)

or

$\displaystyle f(x) = \frac{\mathrm{d}F(x)}{\mathrm{d}x}\,.$ (4.27)

Properties of $ f(x)$ and $ F(x)$: $ f(x) \ge 0$; $ \int_{-\infty}^{+\infty} f(x)\,\mathrm{d}x = 1$; $ F(x)$ is non-decreasing, with $ F(-\infty)=0$ and $ F(+\infty)=1$.
Expectation value:

E$\displaystyle [X]$ $\displaystyle =$ $\displaystyle \int_{-\infty}^{+\infty}x\, f(x)\,\mathrm{d}x,$ (4.28)
E$\displaystyle [g(X)]$ $\displaystyle =$ $\displaystyle \int_{-\infty}^{+\infty}g(x)\, f(x)\,\mathrm{d}x.$ (4.29)
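As a numerical illustration of Eqs. (4.28)-(4.29), the integrals defining E$[X]$ and E$[g(X)]$ can be approximated by a midpoint sum; the following sketch (the density, grid limits and step size are arbitrary illustrative choices) recovers the first two moments of a standard normal density:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Gaussian density, used here only as a test case for the integrals
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def expectation(g, f, lo=-10.0, hi=10.0, n=200_000):
    # E[g(X)] = integral of g(x) f(x) dx, approximated on a fine midpoint grid
    dx = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * dx) * f(lo + (i + 0.5) * dx) for i in range(n)) * dx

mean = expectation(lambda x: x, normal_pdf)        # ~0 for N(0,1)
second = expectation(lambda x: x * x, normal_pdf)  # ~1 (second moment of N(0,1))
print(mean, second)
```

The truncation of the integration range at $\pm 10$ is harmless here because the Gaussian tails beyond that point are negligible.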

Uniform distribution:

$ X \sim {\cal K}(a,b)$:
$\displaystyle f(x\,\vert\,{\cal K}(a,b))$ $\displaystyle =$ $\displaystyle \frac{1}{b-a} \hspace{0.6cm}(a\le x \le b),$ (4.30)
$\displaystyle F(x\,\vert\,{\cal K}(a,b))$ $\displaystyle =$ $\displaystyle \frac{x-a}{b-a} \hspace{0.6cm}(a\le x \le b)\,.$ (4.31)

Expectation value and standard deviation:
$\displaystyle \mu$ $\displaystyle =$ $\displaystyle \frac{a+b}{2},$ (4.32)
$\displaystyle \sigma$ $\displaystyle =$ $\displaystyle \frac{b-a}{\sqrt{12}}\,.$ (4.33)
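Equations (4.32)-(4.33) are easy to check by simulation; in the following sketch the values of $a$, $b$, the seed and the sample size are arbitrary illustrative choices:

```python
import math
import random

# Empirical check of mu = (a+b)/2 and sigma = (b-a)/sqrt(12) for X ~ K(a, b)
random.seed(42)
a, b = 2.0, 5.0
n = 200_000
xs = [random.uniform(a, b) for _ in range(n)]

mean = sum(xs) / n
std = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
print(mean, std)  # ~ (a+b)/2 = 3.5 and ~ (b-a)/sqrt(12) ~= 0.866
```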

Normal (Gaussian) distribution:

$ X\sim {\cal N}(\mu,\sigma)$:

$\displaystyle f(x\,\vert\,{\cal N}(\mu,\sigma)) = \frac{1}{\sqrt{2\,\pi}\,\sigma}\, \exp\left[-\frac{(x-\mu)^2}{2\,\sigma^2}\right] \hspace{0.6cm} \left\{ \begin{array}{l} -\infty < \mu < +\infty \\ 0 < \sigma < \infty \\ -\infty < x < +\infty \end{array}\right.\,,$ (4.34)

where $ \mu$ and $ \sigma$ (both real) are the expectation value and standard deviation, respectively.
Standard normal distribution:

the particular normal distribution of mean 0 and standard deviation 1, usually indicated by $ Z$:

$\displaystyle Z\sim {\cal N}(0,1)\,.$ (4.35)
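The standard normal is useful because any $ {\cal N}(\mu,\sigma)$ density can be expressed through it via the standardized variable $z = (x-\mu)/\sigma$, namely $f(x\,\vert\,{\cal N}(\mu,\sigma)) = \varphi\big((x-\mu)/\sigma\big)/\sigma$, with $\varphi$ the density of $Z$. A minimal numerical check (the values of $\mu$, $\sigma$ and $x$ are arbitrary):

```python
import math

def phi(z):
    # standard normal density, Eq. (4.34) with mu = 0, sigma = 1
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def normal_pdf_via_z(x, mu, sigma):
    # density of N(mu, sigma) written through the standardized variable
    return phi((x - mu) / sigma) / sigma

# direct evaluation of Eq. (4.34) for comparison
mu, sigma, x = 1.5, 0.7, 2.3
direct = math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

ok = abs(normal_pdf_via_z(x, mu, sigma) - direct) < 1e-12
print(ok)
```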

Exponential distribution:

$ T \sim {\cal E}(\tau)$:

$\displaystyle f(t\,\vert\,{\cal E}(\tau))$ $\displaystyle =$ $\displaystyle \frac{1}{\tau}\, e^{-t/\tau} \hspace{1.3cm} \left\{ \begin{array}{l} 0 < \tau < \infty, \\ 0 \le t < \infty \end{array}\right.$ (4.36)
$\displaystyle F(t\,\vert\,{\cal E}(\tau))$ $\displaystyle =$ $\displaystyle 1-e^{-t/\tau}.$ (4.37)

We use the symbol $ t$ instead of $ x$ because this distribution will be applied to the time domain.
Survival probability:

$\displaystyle P(T>t) = 1- F(t\,\vert\,{\cal E}(\tau)) = e^{-t/\tau}.$ (4.38)

Expectation value and standard deviation:
$\displaystyle \mu$ $\displaystyle =$ $\displaystyle \tau$ (4.39)
$\displaystyle \sigma$ $\displaystyle =$ $\displaystyle \tau.$ (4.40)

The real parameter $ \tau$ has the physical meaning of lifetime.
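Equations (4.38)-(4.40) can be verified by Monte Carlo; in the following sketch the value of $\tau$, the seed and the sample size are arbitrary illustrative choices:

```python
import math
import random

# Empirical check for T ~ E(tau): mean ~ tau, standard deviation ~ tau,
# and survival probability P(T > t) ~ exp(-t/tau), evaluated at t = tau
random.seed(1)
tau = 2.0
n = 200_000
ts = [random.expovariate(1.0 / tau) for _ in range(n)]

mean = sum(ts) / n
std = math.sqrt(sum((t - mean) ** 2 for t in ts) / n)
survival = sum(t > tau for t in ts) / n  # empirical P(T > tau), ~ e^-1
print(mean, std, survival)
```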
Poisson $ \leftrightarrow$ Exponential:

If $ X$ (= ``number of counts during the time $ \Delta t$'') is Poisson distributed then $ T$ (= ``interval of time to wait -- starting from any instant -- before the first count is recorded'') is exponentially distributed:

$\displaystyle X \sim f(x\,\vert\,{\cal P}_\lambda)$ $\displaystyle \Longleftrightarrow$ $\displaystyle T \sim f(t\,\vert\,{\cal E}(\tau))$ (4.41)
  $\displaystyle \left(\tau = \frac{\Delta t}{\lambda}\right)$ $\displaystyle .$ (4.42)
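This correspondence can be seen by simulation: generating events separated by exponential waiting times with lifetime $\tau$ and counting how many fall in a window $\Delta t$, the mean count approaches $\lambda = \Delta t/\tau$. In the sketch below $\tau$, $\Delta t$, the seed and the number of trials are arbitrary illustrative choices:

```python
import random

# Events with exponential waiting times (lifetime tau) form a Poisson
# process, so the number of counts in a window Delta_t has mean
# lambda = Delta_t / tau
random.seed(7)
tau, delta_t, trials = 0.5, 3.0, 100_000

counts = []
for _ in range(trials):
    t, k = 0.0, 0
    while True:
        t += random.expovariate(1.0 / tau)  # waiting time to next count
        if t > delta_t:
            break
        k += 1
    counts.append(k)

mean_count = sum(counts) / trials
print(mean_count)  # ~ lambda = delta_t / tau = 6
```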


Giulio D'Agostini 2003-05-15