Open Access Article

Semantic Entropy in Language Comprehension

Department of Language Science & Technology, Saarland University, 66123 Saarbrücken, Germany
* Author to whom correspondence should be addressed.
Entropy 2019, 21(12), 1159; https://doi.org/10.3390/e21121159
Received: 30 October 2019 / Revised: 20 November 2019 / Accepted: 25 November 2019 / Published: 27 November 2019
(This article belongs to the Special Issue Information Theory and Language)
Language is processed on a more or less word-by-word basis, and the processing difficulty induced by each word is affected by our prior linguistic experience as well as our general knowledge about the world. Surprisal and entropy reduction have been independently proposed as linking theories between word processing difficulty and probabilistic language models. Extant models, however, are typically limited to capturing linguistic experience and hence cannot account for the influence of world knowledge. A recent comprehension model by Venhuizen, Crocker, and Brouwer (2019, Discourse Processes) improves upon this situation by instantiating a comprehension-centric metric of surprisal that integrates linguistic experience and world knowledge at the level of interpretation and combines them in determining online expectations. Here, we extend this work by deriving a comprehension-centric metric of entropy reduction from this model. In contrast to previous work, which has found that surprisal and entropy reduction are not easily dissociated, we do find a clear dissociation in our model. While both surprisal and entropy reduction derive from the same cognitive process—the word-by-word updating of the unfolding interpretation—they reflect different aspects of this process: state-by-state expectation (surprisal) versus end-state confirmation (entropy reduction).
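
The distinction the abstract draws between surprisal and entropy reduction can be made concrete with a small worked example. The Python sketch below uses a toy next-word distribution that is purely illustrative; it is not the paper's model, which derives both metrics at the level of interpretation rather than over word strings:

import math

# Toy next-word distribution P(w | "the cat sat on the ...").
# These probabilities are illustrative assumptions, not values from
# the paper's model.
prior = {"mat": 0.6, "sofa": 0.3, "moon": 0.1}

def surprisal(p):
    """Surprisal of an outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Suppose "mat" is observed and (in this toy) resolves all remaining
# uncertainty, so the posterior collapses to a point mass with zero entropy.
posterior = {"mat": 1.0}

print(f"surprisal('mat')  = {surprisal(prior['mat']):.3f} bits")              # ~0.737
print(f"entropy reduction = {entropy(prior) - entropy(posterior):.3f} bits")  # ~1.295

Even in this toy setting the two quantities come apart: surprisal depends only on the probability of the word actually observed, whereas entropy reduction depends on the shape of the entire prior distribution.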
Keywords: natural language; entropy; neural networks

MDPI and ACS Style

Venhuizen, N.J.; Crocker, M.W.; Brouwer, H. Semantic Entropy in Language Comprehension. Entropy 2019, 21, 1159.
