Article

Humans Outperform Machines at the Bilingual Shannon Game

Marjan Ghazvininejad and Kevin Knight
Information Sciences Institute, University of Southern California, 4676 Admiralty Way #1001, Marina Del Rey, CA 90292, USA
* Author to whom correspondence should be addressed.
Both authors contributed equally to this work.
Academic Editors: Kevin H. Knuth and Raúl Alcaraz Martínez
Entropy 2017, 19(1), 15; https://doi.org/10.3390/e19010015
Received: 3 October 2016 / Revised: 8 December 2016 / Accepted: 27 December 2016 / Published: 30 December 2016
(This article belongs to the Section Information Theory, Probability and Statistics)
Abstract: We provide an upper bound for the amount of information a human translator adds to an original text, i.e., how many bits of information we need to store a translation, given the original. We do this by creating a Bilingual Shannon Game that elicits character guesses from human subjects, then developing models to estimate the entropy of those guess sequences.
Keywords: compression; multilingual; translation
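For readers unfamiliar with the mechanics, the classic Shannon game converts a text into the sequence of guess ranks a subject needed for each character; because a deterministic guesser makes that mapping invertible, any entropy estimate for the guess-rank sequence upper-bounds the per-character entropy of the text. The Python sketch below shows only the simplest unigram version of that bound, not the models developed in the paper; the function name and sample data are illustrative.

from collections import Counter
from math import log2

def shannon_game_upper_bound(guess_ranks):
    # guess_ranks: one integer per character, giving how many guesses the
    # subject needed before naming the correct character (1 = first try).
    # The unigram entropy of these ranks upper-bounds the per-character
    # entropy of the underlying text.
    counts = Counter(guess_ranks)
    total = len(guess_ranks)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Illustrative (made-up) guess-rank data, not data from the paper.
ranks = [1, 1, 1, 2, 1, 3, 1, 1, 2, 1, 1, 5, 1, 1, 2]
print(f"Unigram upper bound: {shannon_game_upper_bound(ranks):.2f} bits/character")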
