### Maximum-entropy priors

Another principle-based approach to assigning priors is based on the Maximum Entropy principle (Jaynes 1957a, also 1983, 1998, Tribus 1969, von der Linden 1995, Sivia 1997, and Fröhner 2000). The basic idea is to choose the prior that maximizes the Shannon-Jaynes information entropy,

$$ H = -\sum_i p_i \ln p_i \qquad (105) $$

subject to whatever is assumed to be known about the distribution. The larger $H$ is, the greater is our ignorance about the uncertain value of interest. The value $H = 0$ is obtained for a distribution that concentrates all the probability into a single value. In the case of no constraint other than normalization ($\sum_i p_i = 1$), $H$ is maximized by the uniform distribution, $p_i = 1/n$, which is easily proved using Lagrange multipliers. For example, if the variable is an integer between 0 and 10, a uniform distribution gives $H = \ln 11 \approx 2.40$. Any binomial distribution with $n = 10$ gives a smaller value, with a maximum of $H \approx 1.88$ for $p = 0.5$ and a limit of $H = 0$ for $p \to 0$ or $p \to 1$, where $p$ is now the parameter of the binomial that gives the probability of success at each trial.
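The discrete example above can be checked numerically. A minimal sketch in plain Python (standard library only; the helper name `entropy` is ours, not from the source), comparing the uniform distribution on the integers 0 to 10 with binomials of $n = 10$ for a few values of $p$:

```python
from math import comb, log

def entropy(probs):
    """Shannon entropy H = -sum_i p_i ln p_i, with 0 ln 0 taken as 0."""
    return -sum(p * log(p) for p in probs if p > 0)

# Uniform distribution over the 11 integers 0..10: H = ln 11 ~ 2.40
uniform = [1 / 11] * 11
print(f"uniform:          H = {entropy(uniform):.3f}")

# Binomial(n=10, p) for several p: H is always below ln 11,
# largest near p = 0.5 and vanishing as p -> 0 or p -> 1
for p in (0.5, 0.2, 0.05, 0.001):
    probs = [comb(10, k) * p**k * (1 - p)**(10 - k) for k in range(11)]
    print(f"binomial p={p:<5}: H = {entropy(probs):.3f}")
```

Running it reproduces the numbers quoted in the text: the uniform gives $\ln 11$, the symmetric binomial about 1.88, and the entropy collapses towards zero as the distribution concentrates.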

Two famous cases of maximum-entropy priors for continuous variables arise when the only information about the distribution is either the expected value, or the expected value together with the variance. Indeed, these are special cases of general constraints on the moments $\mathrm{E}[x^r]$ of the distribution (see Tab. 1). For $r = 0$ and $1$, $\mathrm{E}[x^r]$ is equal to unity and to the expected value, respectively. The first and second moments together provide the variance (see Tab. 1 and Sect. 5.6). Let us sum up what the assumed knowledge of the various moments provides [see e.g. (Sivia 1997, Dose 2002)].

Normalization alone provides a uniform distribution over the interval $[a, b]$ in which the variable is defined:

$$ p(x) = \frac{1}{b - a} \qquad (106) $$

This is the extension to continuous variables of the discrete case we saw above.

Adding to the normalization constraint the knowledge of the expectation value $\mathrm{E}[x] = \mu$, plus the requirement that all non-negative values of $x$ are allowed, an exponential distribution is obtained:

$$ p(x) = \frac{1}{\mu}\, e^{-x/\mu} \qquad (107) $$
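As a numerical check (our own, under the assumption $\mu = 1$), one can compare the exponential density against another non-negative density with the same mean, here a half-normal with scale chosen so that its mean is also $\mu$. The exponential, whose differential entropy is $1 + \ln\mu$, should come out on top:

```python
from math import exp, log, pi, sqrt

def diff_entropy(pdf, a=0.0, b=40.0, n=400_000):
    """Differential entropy -int p(x) ln p(x) dx by midpoint Riemann sum."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        p = pdf(x)
        if p > 0:
            total -= p * log(p) * h
    return total

mu = 1.0
expon = lambda x: exp(-x / mu) / mu   # mean mu; H = 1 + ln mu

# Half-normal with the same mean: its mean is sigma * sqrt(2/pi),
# so set sigma = mu * sqrt(pi/2)
sigma = mu * sqrt(pi / 2)
halfnorm = lambda x: sqrt(2 / pi) / sigma * exp(-x**2 / (2 * sigma**2))

print(diff_entropy(expon))     # ~1.0  (= 1 + ln mu)
print(diff_entropy(halfnorm))  # smaller, despite the same mean
```

The integration interval $[0, 40]$ truncates both tails at a negligible level for $\mu = 1$; the exponential attains the larger entropy, consistent with it being the maximum-entropy distribution under these constraints.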