Mathematical entropy and a criticism of a usage of maximum-entropy distributions
Robin Willink
Industrial Research Ltd
The 'principle of maximum entropy' is sometimes used for assigning prior distributions to unknown parameters in Bayesian analyses. Similarly, it has been advocated for choosing a probability distribution to represent individual unknowns, such as systematic deviations in measurement problems (Evaluation of measurement data - Supplement 1 to the "Guide to the expression of uncertainty in measurement" - Propagation of distributions using a Monte Carlo method, JCGM 101:2008, available from http://www.bipm.org/en/publications/guides/gum.html). We show that supporting claims such as 'the maximum-entropy distribution is minimally committal' are indefensible in this context. We examine the origin of the idea of entropy in communication theory and conclude that the idea that entropy measures 'information' has meaning only for sequences of categorical random variables. So entropy cannot legitimately be associated with 'information' about an individual parameter.
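As background to the abstract (not part of its argument), the entropy at issue is the Shannon entropy of a categorical distribution, H(p) = -Σ p_i log2 p_i, which the maximum-entropy principle maximizes subject to constraints; with no constraints beyond normalization, the uniform distribution attains the maximum. A minimal sketch, with illustrative distributions chosen here for demonstration:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i) of a categorical
    distribution p, in bits. Terms with p_i = 0 contribute nothing."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Over n categories, the uniform distribution maximizes entropy.
uniform = [0.25, 0.25, 0.25, 0.25]   # 4 equally likely categories
skewed = [0.7, 0.1, 0.1, 0.1]        # same support, less 'spread out'

print(shannon_entropy(uniform))  # 2.0 bits (= log2(4))
print(shannon_entropy(skewed))   # strictly less than 2.0 bits
```

This quantity was introduced by Shannon for long sequences of symbols from a source; the abstract's point is that this sequence-based interpretation does not transfer to a single unknown parameter.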