

Recovering standard methods and short-cuts to Bayesian reasoning

Before moving on to applications, it is necessary to answer an important question: ``Should one proceed by applying Bayes' theorem in every situation?'' The answer is no; the alternative is essentially implicit in ([*]) and can be paraphrased with the example of the dog and the hunter.
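To make the short-cut explicit, here is a minimal sketch of the inferential form of Bayes' theorem assumed throughout (the notation $f_\circ(\mu)$ for the prior and $f(x\,|\,\mu)$ for the likelihood is used here only for illustration and may differ slightly from that of the formula referred to above):
\begin{displaymath}
f(\mu\,|\,x) \;=\; \frac{f(x\,|\,\mu)\,f_\circ(\mu)}
{\int f(x\,|\,\mu')\,f_\circ(\mu')\,\mbox{d}\mu'}
\;\;\propto\;\; f(x\,|\,\mu)\,f_\circ(\mu)\,.
\end{displaymath}
When $f_\circ(\mu)$ is practically constant over the region in which $f(x\,|\,\mu)$ is appreciably different from zero, the posterior is simply the likelihood renormalized over $\mu$: this is the condition under which the explicit use of the theorem can be bypassed.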

We have already used this example in Section [*], when discussing the arbitrariness of the probability inversion performed unconsciously by (most of) those who use the scheme of confidence intervals. The same example will also be used in Section [*], when discussing the reason why Bayesian estimators appear to be distorted (a topic discussed in more detail in Section [*]). This analogy is very important and, in many practical applications, it allows us to bypass the explicit use of Bayes' theorem when the priors do not sizably influence the result (for the case of a normal model the demonstration can be seen in Section [*]).
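The content of that demonstration can be sketched as follows, under the stated assumptions: a single observation $x$ modelled as a Gaussian of known standard deviation $\sigma$ around the true value $\mu$, and a prior on $\mu$ that is practically flat where the likelihood is non-negligible. Then
\begin{displaymath}
f(x\,|\,\mu) = \frac{1}{\sqrt{2\pi}\,\sigma}
\exp\!\left[-\frac{(x-\mu)^2}{2\sigma^2}\right]
\quad\Longrightarrow\quad
f(\mu\,|\,x) \approx \frac{1}{\sqrt{2\pi}\,\sigma}
\exp\!\left[-\frac{(\mu-x)^2}{2\sigma^2}\right].
\end{displaymath}
In the analogy, the dog ($x$) is found within a distance $\sigma$ of the hunter ($\mu$) in about 68% of the cases; under the above conditions the probability inversion is harmless, and one may state with the same confidence that the hunter is within $\sigma$ of the dog, i.e. $\mu = x \pm \sigma$.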

Figure: Relation between Bayesian inference and standard data analysis methods. The top-down flow shows the successive limiting conditions. For an understanding of the relation between the `normal' $\chi^2$ and the Pearson $\chi^2$, Ref. [24] is recommended.
[Figure file: dago77.eps]
Figure [*] shows how it is possible to recover standard methods from a Bayesian perspective. One sees that the crucial link is the Maximum Likelihood Principle, which in this approach is just a special case (see Section [*]). Then, when additional simplifying conditions hold, the various forms of least squares are recovered. In conclusion:
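The first two steps of this flow can be written out explicitly (again only a sketch, with generic symbols not taken from the figure: data $\underline{x}$, parameters $\underline{\theta}$, and, for the least-squares step, independent Gaussian errors $\sigma_i$ on observations $y_i$ with expectations $\mu_i(\underline{\theta})$):
\begin{eqnarray*}
f(\underline{\theta}\,|\,\underline{x}) & \propto &
f(\underline{x}\,|\,\underline{\theta})\,f_\circ(\underline{\theta})
\;\;\propto\;\; f(\underline{x}\,|\,\underline{\theta})
\hspace{1.2cm} \mbox{if } f_\circ(\underline{\theta}) \approx \mbox{const.}, \\
-2\ln f(\underline{x}\,|\,\underline{\theta}) & = &
\sum_i \frac{\left[y_i - \mu_i(\underline{\theta})\right]^2}{\sigma_i^2}
+ \mbox{const.}
\hspace{1.2cm} \mbox{(independent Gaussian errors)}.
\end{eqnarray*}
Under the first condition the mode of the posterior coincides with the maximum likelihood estimate; under the second, maximizing the likelihood is equivalent to minimizing the $\chi^2$, i.e. to least squares.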

