This paper relates generalized measures of information to expected likelihood functions (ELFs) derived from Bayes' equation. It then demonstrates that Jaynes' formalism may be extended to formulate a class of minimally-prejudiced models of which those derived from Shannon's measure are but a limiting and special case. The rôle of probable inference and of information-minimizing models in design is commented on.
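The abstract's claim that Shannon's measure arises as a limiting case can be made concrete with one of the generalized measures cited below; the following is a sketch drawn from Rényi (1961, reference 16), not a reproduction of the paper's own extension of Jaynes' formalism. In LaTeX notation, the Rényi entropy of order α is

\[
H_\alpha(p) \;=\; \frac{1}{1-\alpha}\,\log_2 \sum_{i=1}^{n} p_i^{\alpha},
\qquad \alpha > 0,\; \alpha \neq 1,
\]

and an application of l'Hôpital's rule at α = 1 recovers Shannon's measure as the limiting case:

\[
\lim_{\alpha \to 1} H_\alpha(p) \;=\; -\sum_{i=1}^{n} p_i \log_2 p_i \;=\; H(p).
\]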
References
1. Aczél J., Forte B., Ng C. T., 1974, “Why the Shannon and Hartley entropies are natural”, Advances in Applied Probability, 6, 131–146.
2. Batty M., 1974, “A theory of Markovian design machines”, Environment and Planning B, 1, 125–146.
3. Batty M., March L., 1975, “The method of residues in urban modelling”, The Transport Group Publication Series, Department of Civil Engineering, University of Waterloo, Ontario, Canada.
4. Cox R. T., 1961, The Algebra of Probable Inference (The Johns Hopkins Press, Baltimore).
5. Daróczy Z., 1970, “Generalized information functions”, Information and Control, 16, 36–51.
6. Evans R. A., 1969, “The principle of minimum information”, IEEE Transactions on Reliability, 18, 87–90.
7. Feibleman J. K., 1970, An Introduction to the Philosophy of Charles S. Peirce (MIT Press, Cambridge, Mass.).
8. Havrda J., Charvát F., 1967, “Quantification methods of classification processes: Concepts of structural α-entropy”, Kybernetika, 3, 30–34.
9. Hobson A., Cheng B. K., 1973, “A comparison of the Shannon and Kullback information measures”, Journal of Statistical Physics, 7, 301–310.
10. Jaynes E. T., 1957, “Information theory and statistical mechanics” and “Information theory and statistical mechanics II”, Physical Review, 106, 620–630 and 108, 171–190.
11. Jaynes E. T., 1968, “Prior probabilities”, IEEE Transactions on Systems Science and Cybernetics, 4, 227–241.
12. Kullback S., 1959, Information Theory and Statistics (John Wiley, New York).
13. March L., 1975, “The logic of design and the question of value”, in The Architecture of Form (Cambridge University Press, London). In the press.
14. March L., Batty M., 1975a, “Information-minimizing formalism and the derivation of nonparametric forms for population and transportation models”, Technical Report 6-S, Department of Systems Design, University of Waterloo, Ontario, Canada.
15. March L., Batty M., 1975b, “Measures of information and the derivation of statistical models”, Technical Report 7-S, Department of Systems Design, University of Waterloo, Ontario, Canada.
16. Rényi A., 1961, “On measures of entropy and information”, Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, 1, 547–561.
17. Shannon C. E., 1948, “A mathematical theory of communication”, Bell System Technical Journal, 27, 379–423, 623–656.
18. Simon H. A., 1969, The Sciences of the Artificial (MIT Press, Cambridge, Mass.).
19. Taneja I. J., 1974, “A joint characterization of directed divergence, inaccuracy, and their generalizations”, Journal of Statistical Physics, 11, 169–176.
20. Tribus M., 1969, Rational Descriptions, Decisions and Designs (Pergamon Press, New York).
21. Tribus M., Rossi R., 1973, “On the Kullback information measure as a basis for information theory: Comments on a proposal by Hobson and Cheng”, Journal of Statistical Physics, 9, 331–338.