The most general problem would
be to evaluate the joint conditional probability
of the uncertain (`unobserved') quantities,
conditioned on the `observed' (`known'/`assumed'/`postulated') ones,
that is, in this case (see Appendix A),
f(p, \pi_1, \pi_2, n_I, n_{NI}, n_{P_I}, n_{P_{NI}} \,|\, n_P, n_s, r_0, s_0, r_1, s_1, r_2, s_2)     (79)
although in practice we are indeed interested in
f(p \,|\, n_P, n_s, \ldots), and perhaps
in f(\pi_1 \,|\, n_P, n_s, \ldots)
and f(\pi_2 \,|\, n_P, n_s, \ldots).
This is done by marginalizing Eq. (79),
i.e. summing (or integrating, depending on their nature)
over the variables in which we are not interested
(see Appendix A).
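As a generic illustration of this operation (our sketch, not code from the paper): for discrete variables, marginalizing a joint probability table amounts to summing over the axes of the unwanted variables.

```python
import numpy as np

# Toy joint distribution over three discrete variables (a, b, c),
# stored as a normalized 3-D array: axis 0 -> a, axis 1 -> b, axis 2 -> c.
rng = np.random.default_rng(0)
joint = rng.random((4, 5, 6))
joint /= joint.sum()  # normalize so it is a proper probability table

# Marginal of 'a': sum over the axes of b and c,
# i.e. the variables we are not interested in.
p_a = joint.sum(axis=(1, 2))

# The marginal is itself a normalized distribution over a.
print(p_a)
```

For continuous variables the sums become integrals, but the principle is the same.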
As commented in the same appendix, Eq. (79)
is obtained, apart from a normalization factor,
from
f(n_P, n_{P_I}, n_{P_{NI}}, \pi_1, \pi_2, n_I, n_{NI}, p \,|\, n_s, r_0, s_0, r_1, s_1, r_2, s_2)     (80)
and the latter from a properly chosen chain rule.
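A plausible form of this chain rule, reconstructed here from the JAGS statements listed below (each factor corresponds to one instruction, ordered by ascending the graph from children to parents; the exact factorization should be checked against the original figure):

```latex
f(n_P, n_{P_I}, n_{P_{NI}}, \pi_1, \pi_2, n_I, n_{NI}, p \,|\, n_s, \ldots)
  = f(n_P \,|\, n_{P_I}, n_{P_{NI}})
    \cdot f(n_{P_I} \,|\, \pi_1, n_I)
    \cdot f(n_{P_{NI}} \,|\, \pi_2, n_{NI})
    \cdot f(\pi_1)\, f(\pi_2)
    \cdot f(n_I \,|\, p, n_s)
    \cdot f(n_{NI} \,|\, n_I, n_s)
    \cdot f(p)
```

Note that f(n_P | n_{P_I}, n_{P_{NI}}) and f(n_{NI} | n_I, n_s) are deterministic factors (delta functions), since n_P = n_{P_I} + n_{P_{NI}} and n_{NI} = n_s - n_I.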
The steps used to build up Eq. (80) by
the proper chain rule
are exactly the instructions given to JAGS to set up
the model, if we start from the bottom of
the diagram of Fig.
and ascend through the parents (see Sec. ):
model {
nP ~ dsum(nP.I, nP.NI)
nP.I ~ dbin(pi1, n.I)
nP.NI ~ dbin(pi2, n.NI)
pi1 ~ dbeta(r1, s1)
pi2 ~ dbeta(r2, s2)
n.I ~ dbin(p, ns)
n.NI <- ns - n.I
p ~ dbeta(r0, s0)
}
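The generative structure encoded by this model can be mimicked by forward (ancestral) sampling, drawing each node given its parents in the same order. The following Python sketch is our illustration of that correspondence, with purely hypothetical numeric values for ns and the Beta parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical values for the known/postulated quantities (not from the paper).
ns = 1000            # number of sampled individuals
r0, s0 = 1, 1        # flat Beta prior on p
r1, s1 = 9, 1        # Beta prior on pi1 -- illustrative values
r2, s2 = 1, 9        # Beta prior on pi2 -- illustrative values

# Ascend the graph: each node is drawn given its parents,
# mirroring the JAGS statements one by one.
p    = rng.beta(r0, s0)        # p   ~ dbeta(r0, s0)
nI   = rng.binomial(ns, p)     # n.I ~ dbin(p, ns)
nNI  = ns - nI                 # n.NI <- ns - n.I   (deterministic node)
pi1  = rng.beta(r1, s1)        # pi1 ~ dbeta(r1, s1)
pi2  = rng.beta(r2, s2)        # pi2 ~ dbeta(r2, s2)
nPI  = rng.binomial(nI, pi1)   # nP.I  ~ dbin(pi1, n.I)
nPNI = rng.binomial(nNI, pi2)  # nP.NI ~ dbin(pi2, n.NI)
nP   = nPI + nPNI              # nP ~ dsum(nP.I, nP.NI)

print(p, nI, nP)
```

In JAGS the inferential direction is reversed (nP is observed and the parents are inferred), but the model description is exactly this generative chain.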
The differences with respect to the JAGS model of
Sec. are
- the last instruction there, `fP <- nP/ns', is here
irrelevant;
- we have to add a prior on p, because
all unobserved nodes having no parents need a prior
(for practical convenience,
as we have seen in Sec. ,
we shall use a Beta distribution, as indicated in the code);
- the sequence of the statements has been changed, but this has
been done only in order to stress the analogy with
the chain rule constructed by ascending the graphical model
of Fig. (let us recall that the order is irrelevant
for JAGS, which organizes all statements at the stage
of compilation).
Hereafter we proceed using, very conveniently,
JAGS,
showing in Sec. the steps needed from
writing down the chain rule to the exact evaluation of
f(p \,|\, n_P, n_s, \ldots) after marginalization.
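To give a rough idea of what such an exact evaluation by marginalization involves, here is a deliberately simplified sketch (our illustration, not the paper's procedure: we fix π1 and π2 at point values instead of integrating over their Beta priors, take a flat prior on p, and use small hypothetical numbers). The posterior of p is obtained by summing the joint over n_I and n_{P_I} on a grid of p values:

```python
from math import comb

def binom_pmf(k, n, q):
    """Binomial probability P(k | n, q)."""
    if k < 0 or k > n:
        return 0.0
    return comb(n, k) * q**k * (1.0 - q)**(n - k)

# Hypothetical data and fixed test performances (a simplification:
# the full treatment would integrate pi1, pi2 over their Beta priors).
ns, nP = 50, 12          # sampled individuals and observed positives
pi1, pi2 = 0.95, 0.05    # sensitivity and false-positive probability

def like(p):
    """Likelihood f(nP | p, ns): marginalize over n_I and nP.I."""
    tot = 0.0
    for nI in range(ns + 1):
        w = binom_pmf(nI, ns, p)          # f(n_I | p, n_s)
        if w == 0.0:
            continue
        s = 0.0
        for k in range(nP + 1):           # k = nP.I, hence nP.NI = nP - k
            s += binom_pmf(k, nI, pi1) * binom_pmf(nP - k, ns - nI, pi2)
        tot += w * s
    return tot

# Flat prior on p (Beta(1,1), hence constant and omitted);
# evaluate on a grid and normalize to get the posterior.
grid = [i / 200 for i in range(201)]
post = [like(p) for p in grid]
norm = sum(post)
post = [x / norm for x in post]

p_mode = grid[post.index(max(post))]
print(p_mode)
```

The mode lands near (nP/ns - pi2)/(pi1 - pi2), as one would expect from the expected fraction of positives E[f_P] = p·π1 + (1-p)·π2.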