Although Shannon's information theory is alive and well in a number of fields, after an initial fad in psychology during the 1950s and 1960s it is no longer much of a factor, beyond the word bit, in psychological theory. The author discusses what seem to him (and others) to be the root causes of an actual incompatibility between information theory and the psychological phenomena to which it has been applied.