So far, we have considered mainly likelihood-dominated situations,
in which the prior pdf can be included in the normalization
constant.
But one should be careful about uncritically using uniform priors as a
`prescription', or as a rule, even though the rule might be associated with
the names of famous persons.
For instance, having conducted interviews
to infer the proportion $p$ of a population
that supports a party, it is not reasonable to assume
a prior for $p$ that is uniform
between 0 and 1.
Similarly, having to infer the rate $r$
of a Poisson process (such that
$\lambda = r\,T$, where $T$ is the
measuring time) related, for example, to proton decay,
cosmic-ray events or gravitational-wave signals,
we do not believe, strictly,
that $r$ is uniform between zero and infinity. Besides
natural physical cut-offs (for example, a very large
proton-decay rate
would prevent Life, or even stars, from existing),
a prior uniform in $r$ implies believing more strongly in high orders of magnitude
of $r$ (see Astone and D'Agostini 1999 for details).
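As a rough illustration (the cut-off $r_{\max}$ is introduced here only for the sake of the argument), a prior uniform in $r$ over $[0, r_{\max}]$ assigns probability
\[
P(r > r_{\max}/10) \,=\, \frac{r_{\max} - r_{\max}/10}{r_{\max}} \,=\, 0.9
\]
to the highest decade alone, i.e. almost all of our initial belief would be concentrated in the largest order of magnitude allowed by the cut-off.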
In many cases
(for example, the above-mentioned searches for rare phenomena) our
uncertainty could mean indifference over several orders of magnitude in
the rate $r$. This indifference can be parametrized roughly with a
prior uniform in $\ln r$,
yielding a prior $f(r) \propto 1/r$
(the same prior is obtainable using invariance arguments, as
will be shown in a while).
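One way to see this is a standard change of variables, sketched here for clarity: taking a pdf constant in $y = \ln r$ over the range of interest,
\[
f(r) \,=\, f(y)\left|\frac{\mathrm{d}y}{\mathrm{d}r}\right| \,\propto\, \frac{1}{r}\,,
\]
so that equal intervals of $\ln r$, i.e. equal ratios of $r$, carry equal probability.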
As the reader might imagine, the choice of priors is a highly debated issue, even among Bayesians. We do not pretend to give definitive statements here, but would just like to touch on some important issues concerning priors.