Open Access Article
Entropy 2010, 12(4), 720-771; doi:10.3390/e12040720

Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology

Department of Experimental Psychology, University of Cambridge, Cambridge, CB2 3EB, UK
Received: 11 February 2010 / Revised: 10 March 2010 / Accepted: 1 April 2010 / Published: 7 April 2010
(This article belongs to the Special Issue Complexity of Human Language and Cognition)

Abstract

This paper presents, first, a formal exploration of the relationships between information (statistically defined), statistical hypothesis testing, the use of hypothesis testing in reverse as an investigative tool, channel capacity in a communication system, uncertainty, the concept of entropy in thermodynamics, and Bayes’ theorem. This exercise brings out the close mathematical interrelationships between different applications of these ideas in diverse areas of psychology. Subsequent illustrative examples are grouped under (a) the human operator as an ideal communications channel, (b) the human operator as a purely physical system, and (c) Bayes’ theorem as an algorithm for combining information from different sources. Some tentative conclusions are drawn about the usefulness of information theory within these different categories. (a) The idea of the human operator as an ideal communications channel has long been abandoned, though it provides some lessons that still need to be absorbed today. (b) Treating the human operator as a purely physical system provides a platform for the quantitative exploration of many aspects of human performance by analogy with the analysis of other physical systems. (c) The use of Bayes’ theorem to calculate the effects of prior probabilities and stimulus frequencies on human performance is probably misconceived, but it is difficult to obtain results precise enough to resolve this question.
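For orientation, the standard textbook forms of the quantities the abstract refers to are given below. These expressions are supplied here only as a reference sketch and are not quoted from the paper: the self-information of an outcome, the entropy of a discrete source, and Bayes' theorem for combining a prior with evidence.

% Self-information of an outcome x with probability p(x), in bits
% Entropy of a discrete source X (the expected self-information)
% Bayes' theorem: posterior probability of hypothesis H given data D
\[
I(x) = -\log_2 p(x), \qquad
H(X) = -\sum_{i} p(x_i)\,\log_2 p(x_i), \qquad
P(H \mid D) = \frac{P(D \mid H)\,P(H)}{P(D)}.
\]

Point (c) of the abstract questions whether calculations of this last kind, applied to prior probabilities and stimulus frequencies, correctly describe human performance.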
Keywords: Bayes’ Theorem; category judgment; channel capacity; choice-reaction times; entropy; Hick’s Law; information theory; signal detection; uncertainty; Weber’s Law
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Share & Cite This Article

MDPI and ACS Style

Laming, D. Statistical Information and Uncertainty: A Critique of Applications in Experimental Psychology. Entropy 2010, 12, 720-771.
