Entropy 2010, 12(4), 720-771; doi:10.3390/e12040720
Article

Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology

Donald Laming

Department of Experimental Psychology, University of Cambridge, Cambridge CB2 3EB, UK
Received: 11 February 2010; in revised form: 10 March 2010 / Accepted: 1 April 2010 / Published: 7 April 2010
(This article belongs to the Special Issue Complexity of Human Language and Cognition)
Abstract: This paper presents, first, a formal exploration of the relationships between information (statistically defined), statistical hypothesis testing, the use of hypothesis testing in reverse as an investigative tool, channel capacity in a communication system, uncertainty, the concept of entropy in thermodynamics, and Bayes’ theorem. This exercise brings out the close mathematical interrelationships between different applications of these ideas in diverse areas of psychology. Subsequent illustrative examples are grouped under (a) the human operator as an ideal communications channel, (b) the human operator as a purely physical system, and (c) Bayes’ theorem as an algorithm for combining information from different sources. Some tentative conclusions are drawn about the usefulness of information theory within these different categories. (a) The idea of the human operator as an ideal communications channel has long been abandoned, though it provides some lessons that still need to be absorbed today. (b) Treating the human operator as a purely physical system provides a platform for the quantitative exploration of many aspects of human performance by analogy with the analysis of other physical systems. (c) The use of Bayes’ theorem to calculate the effects of prior probabilities and stimulus frequencies on human performance is probably misconceived, but it is difficult to obtain results precise enough to resolve this question.
Keywords: Bayes’ Theorem; category judgment; channel capacity; choice-reaction times; entropy; Hick’s Law; information theory; signal detection; uncertainty; Weber’s Law
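The two statistical quantities the abstract turns on — the information (entropy) of a stimulus ensemble, as used in Hick's Law, and the combination of prior probabilities with observed evidence via Bayes' theorem — can be sketched numerically. The functions and the illustrative numbers below are my own, not taken from the paper:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p * log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bayes_posterior(priors, likelihoods):
    """Posterior probabilities: prior x likelihood, normalised (Bayes' theorem)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# A set of n equiprobable stimuli carries log2(n) bits of information,
# the quantity that choice-reaction times were once claimed to track.
print(entropy([0.25] * 4))  # -> 2.0 bits for 4 equally likely stimuli

# Combining a prior with evidence (illustrative numbers only):
priors = [0.8, 0.2]          # e.g. signal-absent vs. signal-present
likelihoods = [0.3, 0.9]     # likelihood of the observation under each hypothesis
print(bayes_posterior(priors, likelihoods))  # -> roughly [0.571, 0.429]
```

The posterior here shifts weight toward the signal-present hypothesis because the observation is three times more likely under it, despite the lower prior — the kind of calculation the paper examines as a (probably misconceived) model of human performance.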


Cite This Article

MDPI and ACS Style

Laming, D. Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology. Entropy 2010, 12, 720-771.

AMA Style

Laming D. Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology. Entropy. 2010; 12(4):720-771.

Chicago/Turabian Style

Laming, Donald. 2010. "Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology." Entropy 12, no. 4: 720-771.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.