Normalization factor and other moments of interest

The normalization factor $N_f$ is obtained by integrating this expression over $p$, once $f_0(p)$ has been chosen. As in the previous section, we opt for a Beta$(r_0, s_0)$, taking advantage not only of the flexibility of this distribution to model our `prior judgment' about $p$, but also of its mathematical convenience. In fact, with this choice the term of Eq. ([*]) depending on $p$ becomes $p^{r_0-1+n_I}\cdot (1-p)^{s_0-1+(n_s-n_I)}$, whose integral over $p$ from 0 to 1 yields again a Beta function, namely B$(r_0+n_I,\,s_0+n_s-n_I)$, thus getting
$$
N_f \;=\; \sum_{n_{P_I}=0}^{n_P}\,\sum_{n_I=0}^{n_s}
\left[\binom{n_I}{n_{P_I}}\cdot\binom{n_s-n_I}{n_P-n_{P_I}}\cdot\binom{n_s}{n_I}\cdot
\frac{\Gamma(n_{P_I}+r_1)\cdot \Gamma(n_I-n_{P_I}+s_1)}{\Gamma(r_1 + n_I+s_1)}\cdot
\frac{\Gamma(n_P-n_{P_I}+r_2)\cdot \Gamma(n_s-n_I-n_P+n_{P_I}+s_2)}{\Gamma(n_s-n_I+s_2+r_2)}\cdot
\frac{\Gamma(r_0+n_I)\cdot\Gamma(s_0+n_s-n_I)}{\Gamma(r_0+s_0+n_s)}\right] \qquad \mbox{(99)}
$$
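The double sum of Eq. (99) can also be checked numerically. The appendix scripts are in R, so the following Python sketch is only an illustrative translation of ours (function names and the flat Beta$(1,1)$ default priors are our own choices); each ratio of Gamma functions is evaluated in log space via `lgamma` to avoid overflow for moderately large $n_s$:

```python
from math import comb, exp, lgamma

def lgamma_ratio(a, b, c):
    # log of Gamma(a) * Gamma(b) / Gamma(c)
    return lgamma(a) + lgamma(b) - lgamma(c)

def norm_factor(nP, ns, r0=1, s0=1, r1=1, s1=1, r2=1, s2=1):
    # Double sum of Eq. (99).  The Gamma-function ratios are computed
    # in log space and exponentiated only at the end.
    Nf = 0.0
    for nPI in range(nP + 1):                     # n_PI = 0 .. n_P
        for nI in range(nPI, ns - nP + nPI + 1):  # n_I with non-vanishing binomials
            lt = (lgamma_ratio(nPI + r1, nI - nPI + s1, r1 + nI + s1)
                  + lgamma_ratio(nP - nPI + r2, ns - nI - nP + nPI + s2,
                                 ns - nI + s2 + r2)
                  + lgamma_ratio(r0 + nI, s0 + ns - nI, r0 + s0 + ns))
            Nf += (comb(nI, nPI) * comb(ns - nI, nP - nPI)
                   * comb(ns, nI) * exp(lt))
    return Nf
```

With flat priors on $p$, $\pi_1$ and $\pi_2$, summing `norm_factor` over $n_P=0,\ldots,n_s$ should return 1, a useful sanity check of the index bookkeeping.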

Similarly, we can evaluate the expected values of $p$ and of $p^2$, from which the variance follows as $\sigma^2(p)=$E$(p^2)-$E$^2(p)$. For example, since E$(p)$ is given by
$$
\mbox{E}(p) \;=\; \int_0^1 p\cdot f(p\,\vert\,I)\,\mbox{d}p\,,
$$

the term of the integrand depending on $p$ becomes $p\cdot p^{r_0-1+n_I}\cdot (1-p)^{s_0-1+(n_s-n_I)}$, which increases the power of $p$ by one, thus yielding
$$
\mbox{E}(p) \;=\; \frac{1}{N_f}\cdot\sum_{n_{P_I}=0}^{n_P}\,\sum_{n_I=0}^{n_s}
\left[\binom{n_I}{n_{P_I}}\cdot\binom{n_s-n_I}{n_P-n_{P_I}}\cdot\binom{n_s}{n_I}\cdot
\frac{\Gamma(n_{P_I}+r_1)\cdot \Gamma(n_I-n_{P_I}+s_1)}{\Gamma(r_1 + n_I+s_1)}\cdot
\frac{\Gamma(n_P-n_{P_I}+r_2)\cdot \Gamma(n_s-n_I-n_P+n_{P_I}+s_2)}{\Gamma(n_s-n_I+s_2+r_2)}\cdot
\frac{\Gamma(r_0+n_I\mbox{\boldmath$+1$})\cdot\Gamma(s_0+n_s-n_I)}{\Gamma(r_0+s_0+n_s\mbox{\boldmath$+1$})}\right]\,, \qquad \mbox{(100)}
$$

while E$(p^2)$ is obtained by replacing `$+1$' with `$+2$'. A script to evaluate the expected value and standard deviation of $p$ is provided in Appendix B.13.
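The `$+k$' shift makes it natural to code a single routine returning any raw moment E$(p^k)$. A minimal Python sketch (again only an illustration of the logic of the R script of Appendix B.13; names and flat default priors are our own assumptions):

```python
from math import comb, exp, lgamma

def moment_sum(k, nP, ns, r0=1, s0=1, r1=1, s1=1, r2=1, s2=1):
    # Double sum of Eq. (100) before division by N_f: identical to
    # Eq. (99) except that r0 + n_I and r0 + s0 + n_s are shifted by k
    # (k = 0 reproduces N_f, k = 1 gives the numerator of E(p),
    # k = 2 that of E(p^2), and so on).
    total = 0.0
    for nPI in range(nP + 1):
        for nI in range(nPI, ns - nP + nPI + 1):
            lt = (lgamma(nPI + r1) + lgamma(nI - nPI + s1)
                  - lgamma(r1 + nI + s1)
                  + lgamma(nP - nPI + r2)
                  + lgamma(ns - nI - nP + nPI + s2)
                  - lgamma(ns - nI + s2 + r2)
                  + lgamma(r0 + nI + k) + lgamma(s0 + ns - nI)
                  - lgamma(r0 + s0 + ns + k))
            total += (comb(nI, nPI) * comb(ns - nI, nP - nPI)
                      * comb(ns, nI) * exp(lt))
    return total

def p_mean_sigma(nP, ns, **priors):
    # E(p) and sigma(p) from the first two raw moments
    Nf = moment_sum(0, nP, ns, **priors)
    m1 = moment_sum(1, nP, ns, **priors) / Nf
    m2 = moment_sum(2, nP, ns, **priors) / Nf
    return m1, (m2 - m1 ** 2) ** 0.5
```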

The expression can be further extended with `$+3$' and `$+4$', thus getting E$(p^3)$ and E$(p^4)$, from which skewness and kurtosis can be evaluated. Finally, making use of the so-called Pearson Distribution System implemented in R [14], $f(p)$ can be obtained with a quite high degree of accuracy, unless the distribution is squeezed towards 0 or 1, as e.g. in Fig. [*]. A script to evaluate mean, variance, skewness and kurtosis, and from them $f(p)$ by the Pearson Distribution System, is shown in Appendix B.14.
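The step from the four raw moments to the standardized skewness and kurtosis fed to the Pearson system is just the usual conversion to central moments. A small self-contained sketch (the function name is our own; the sanity check exploits the fact that the raw moments of a Beta distribution have a simple closed form):

```python
from math import sqrt

def skew_kurt(m1, m2, m3, m4):
    # standardized skewness and kurtosis from the first four raw moments
    var = m2 - m1 ** 2
    mu3 = m3 - 3 * m1 * m2 + 2 * m1 ** 3
    mu4 = m4 - 4 * m1 * m3 + 6 * m1 ** 2 * m2 - 3 * m1 ** 4
    return mu3 / var ** 1.5, mu4 / var ** 2

# Sanity check on a Beta(a, b), whose raw moments are
# m_k = prod_{j=0}^{k-1} (a + j) / (a + b + j):
a, b = 2.0, 5.0
m = [1.0]
for j in range(4):
    m.append(m[-1] * (a + j) / (a + b + j))
skew, kurt = skew_kurt(m[1], m[2], m[3], m[4])
# for Beta(2, 5): skew = 4 / (3 * sqrt(5)), excess kurtosis = -0.12
```

The computed values can be compared with the closed-form Beta skewness $2(b-a)\sqrt{a+b+1}/\big((a+b+2)\sqrt{ab}\big)$, which for Beta$(2,5)$ gives $4/(3\sqrt{5})\approx 0.596$.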