Probabilistic combination achieved by Monte Carlo sampling using JAGS and rjags

The case just analyzed is so simple that, even without a closed-form solution, it is enough to plot the unnormalized pdf (7) to understand what is going on and to extract, at least approximately, the mean value and the standard deviation. The problem becomes more serious when we want to perform a multidimensional inference, also taking into account the correlations between the quantities of interest, as for example in the fit model of Fig. 6, taken from Ref. [20].
Figure: Graphical model for a non trivial fit with errors on both axes [20].

Nowadays the most general way to handle problems of this kind is to sample the unnormalized posterior distribution by a Markov Chain Monte Carlo (MCMC), using a suitable algorithm (see Ref. [21] for an introduction; given the obvious interest of the subject in many fields, much more can be found searching the web, and in particular particle physicists might be interested in BAT [22], the Bayesian Analysis Toolkit). Perhaps (said by a non-expert) the most powerful MCMC algorithm is the so-called Metropolis algorithm (with its variants), but for the kind of problems in which we are interested in this paper the most convenient one is the so-called Gibbs sampler, although it has some limitations on the conditional distributions it can handle (see [21] and [25] for details).

Instead of writing our own code, which would anyway be rather easy for our simple problem, we are going to use the program JAGS [24], born as an open-source, multi-platform clone of BUGS. JAGS does not come with a graphical interface, so it is convenient to use it within a more general framework like R [27] via the package rjags [28] (those who are familiar with Python might want to use pyjags [29]).
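To give an idea of what a JAGS model description looks like, here is a minimal sketch (our own illustrative choices of variable names and vague priors, not the model of Ref. [20]) for inferring the mean mu of Gaussian-distributed observations x[i]; note that JAGS parameterizes the normal by the precision tau = 1/sigma^2 rather than by the standard deviation:

```
model {
  for (i in 1:N) {
    x[i] ~ dnorm(mu, tau)      # likelihood, normal with precision tau
  }
  mu  ~ dnorm(0, 1.0E-6)       # vague prior on the mean
  tau ~ dgamma(0.001, 0.001)   # vague prior on the precision
  sigma <- 1/sqrt(tau)         # derived standard deviation
}
```

Such a model description is then compiled and sampled from within R through rjags functions such as jags.model() and coda.samples(), which return the MCMC chains for the monitored variables.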


