The first idea that might come to mind is to apply the well-known
weighted average of the individual values, using the inverses of the
variances as weights.
But, before doing so, it is important
to understand the assumptions behind it, something
that goes back to none other than Gauss, and for which
we refer to Refs. [29,44].
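As a minimal numerical sketch of this standard combination rule (the measurement values below are invented for illustration), the combined central value is the inverse-variance weighted mean, and the combined variance is the inverse of the sum of the weights:

```python
import numpy as np

def weighted_average(values, sigmas):
    """Inverse-variance weighted average of independent measurements.

    Returns the combined central value and its standard deviation.
    """
    values = np.asarray(values, dtype=float)
    weights = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    combined = np.sum(weights * values) / np.sum(weights)
    combined_sigma = 1.0 / np.sqrt(np.sum(weights))
    return combined, combined_sigma

# Two measurements of the same quantity: 10.0 +/- 1.0 and 12.0 +/- 2.0
x, s = weighted_average([10.0, 12.0], [1.0, 2.0])
print(x, s)  # the more precise measurement dominates the result
```

Note how the first measurement, four times more precise in variance, pulls the combined value close to itself (10.4 +/- 0.89).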
The basic idea of Gauss was to get two numbers (let us say
a `central value' and a standard deviation;
indeed Gauss used, instead of the standard deviation,
what he called `degree of precision' and
`degree of accuracy' [44], but
this is an irrelevant detail) such that
they contain the same information as the individual values.
In practice the rule of combination had to satisfy what
is currently known as statistical sufficiency.
Now it is not obvious at all that the weighted average,
built from the individual central values and standard deviations,
satisfies sufficiency
(see e.g. the puzzle proposed in the Appendix of Ref. [44]).
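In the special case of Gaussian likelihoods with a flat prior, the pair (weighted mean, combined standard deviation) does carry the full information: the posterior obtained by multiplying the individual Gaussian likelihoods is itself Gaussian with exactly those parameters. A minimal numerical check (measurement values invented for illustration):

```python
import numpy as np

# Individual measurements, each modelled as a Gaussian likelihood
xs     = np.array([10.0, 12.0, 9.5])
sigmas = np.array([1.0, 2.0, 0.5])

# Posterior on a grid: product of the individual Gaussian likelihoods
# (flat prior), then normalized numerically.
mu = np.linspace(5.0, 15.0, 20001)
log_post = sum(-0.5 * ((x - mu) / s) ** 2 for x, s in zip(xs, sigmas))
post = np.exp(log_post - log_post.max())
post /= post.sum()

grid_mean = np.sum(mu * post)
grid_sd   = np.sqrt(np.sum((mu - grid_mean) ** 2 * post))

# Inverse-variance weighted average, for comparison
w = 1.0 / sigmas ** 2
wavg = np.sum(w * xs) / np.sum(w)
wsd  = 1.0 / np.sqrt(np.sum(w))

print(grid_mean, wavg)  # agree to grid precision
print(grid_sd, wsd)
```

The agreement holds here because the Gaussian model makes the weighted average sufficient; for other sampling models this is not guaranteed, which is the point of the puzzle cited above.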
Therefore, instead of trying to apply the weighted average
as a `prescription', let us see what comes out of consistently
applying the rules of probability to a suitable model,
restarting from that of the figure discussed above.
It is clear that if we consider meaningful a
combined value of the ratio for all the
instances of the measurements, it means we assume
that the ratio does not depend on some further quantity
on which the individual rates, instead, could depend.
This implies that the values of
the rates are strongly correlated with each other.
Therefore the graphical model
of interest would be the one shown at the top of the figure.
A trivial case is when both rates, and therefore their ratio, are assumed
to be constant, although unknown, yielding
the graphical model shown in the bottom diagram of
the same figure, whose related joint pdf, evaluated
by the best-suited chain rule, is an extension of
Eqs. ( )-( ).
[Eqs. (142)-(144): joint pdf of the model, not rendered in this version.]
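Although the displayed equations did not survive extraction, the chain-rule factorization described in the text can be sketched; the following form is a plausible reconstruction under assumed notation (counts $x_1^{(i)}$, $x_2^{(i)}$ observed over exposures $T_1^{(i)}$, $T_2^{(i)}$, constant rates $r_1$, $r_2$ with priors $f_0$, and ratio $\rho = r_1/r_2$), not the paper's verbatim equations:

```latex
f(x_1^{(1)},\ldots,x_1^{(n)},\,x_2^{(1)},\ldots,x_2^{(n)},\,\rho,\,r_1,\,r_2)
  = \left[\prod_{i=1}^{n} f\!\big(x_1^{(i)}\,\big|\,r_1\,T_1^{(i)}\big)\right]
    \left[\prod_{i=1}^{n} f\!\big(x_2^{(i)}\,\big|\,r_2\,T_2^{(i)}\big)\right]
    \delta\!\left(\rho-\frac{r_1}{r_2}\right) f_0(r_1)\,f_0(r_2)
```

Here the Dirac delta encodes the deterministic link between the ratio and the two rates, as in the graphical model with constant rates.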