This paper critically reexamines the Jaynes interpretation of finite maximum-entropy distributions as limits of the ‘most likely’ relative-frequency distributions consistent with any given prior information. It is shown that while this interpretation is valid in a number of cases, it fails to hold for many types of prior information. Given this limitation, a weaker form of the Jaynes interpretation is proposed which is shown to hold for a wider class of prior information.
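For readers unfamiliar with the interpretation at issue, the standard combinatorial sketch behind it (a textbook argument, not reproduced from this paper; the symbols $N$, $f_i$, $g$, and $\mu$ below are illustrative) runs as follows. Among $N$ independent trials over outcomes $x_1, \dots, x_m$, the number of sequences realizing the relative frequencies $f_i = N_i / N$ is

$$ W(N_1, \dots, N_m) = \frac{N!}{N_1! \, \cdots \, N_m!}, $$

and Stirling's approximation gives

$$ \frac{1}{N} \log W \;\to\; H(f) = -\sum_{i=1}^{m} f_i \log f_i \qquad (N \to \infty). $$

Hence the most likely frequency vector consistent with a constraint such as $\sum_i f_i \, g(x_i) = \mu$ asymptotically maximizes $H$ subject to that constraint, i.e. it tends to the maximum-entropy distribution. The paper's contribution is to delimit the types of prior information for which this limiting argument actually goes through.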