Open Access Letter
Entropy 2017, 19(5), 198; doi:10.3390/e19050198

Discovery of Kolmogorov Scaling in the Natural Language

M.H.P.M. van Putten, Department of Physics and Astronomy, Sejong University, Seoul 143-747, Korea
Academic Editors: Maxim Raginsky and Raúl Alcaraz Martínez
Received: 16 February 2017 / Revised: 25 April 2017 / Accepted: 26 April 2017 / Published: 2 May 2017
(This article belongs to the Special Issue Information Theory in Machine Learning and Data Science)

Abstract

We consider the rate R and variance σ² of Shannon information in snippets of text based on word frequencies in the natural language. We empirically identify Kolmogorov's scaling law σ² ∝ k^(−1.66 ± 0.12) (95% c.l.) as a function of k = 1/N, where N is the word count of a snippet. This result highlights a potential association of information flow in snippets, analogous to the energy cascade over turbulent eddies in fluids at high Reynolds numbers. We propose R and σ² as robust utility functions for objective ranking of concordances in efficient search for maximal information seamlessly across different languages, and as a starting point for artificial attention.
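
The scaling law can be probed numerically. Below is a minimal Python sketch, not the paper's code: it assumes that the per-word Shannon information is I(w) = −log₂ p(w) with p(w) estimated from corpus word frequencies, that R is the mean of I(w) over a snippet of N words, and that σ² is the variance of that rate across non-overlapping snippets; the exponent is then read off a log-log fit of σ² against k = 1/N. The corpus file name and the snippet sizes are placeholders.

```python
# Illustrative sketch (assumed definitions, not the paper's procedure):
# estimate the rate R and variance sigma^2 of Shannon information over
# snippets of N words, then fit sigma^2 against k = 1/N on a log-log scale.
import math
from collections import Counter

import numpy as np


def word_information(tokens):
    """Per-word Shannon information I(w) = -log2 p(w), with p(w) from corpus word frequencies."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: -math.log2(c / total) for w, c in counts.items()}


def snippet_rate_and_variance(tokens, info, n):
    """Mean rate R (bits/word) and variance sigma^2 over non-overlapping snippets of n words."""
    rates = [
        sum(info[w] for w in tokens[i:i + n]) / n
        for i in range(0, len(tokens) - n + 1, n)
    ]
    rates = np.asarray(rates)
    return rates.mean(), rates.var()


# Hypothetical corpus file; any sufficiently long plain-text corpus will do.
tokens = open("corpus.txt", encoding="utf-8").read().lower().split()
info = word_information(tokens)

ks, variances = [], []
for n in (8, 16, 32, 64, 128, 256):   # placeholder snippet sizes N
    R, var = snippet_rate_and_variance(tokens, info, n)
    ks.append(1.0 / n)                # k = 1/N as in the abstract
    variances.append(var)

# Least-squares slope of log(sigma^2) vs log(k) estimates the scaling exponent.
slope, _ = np.polyfit(np.log(ks), np.log(variances), 1)
print(f"fitted exponent of sigma^2 ~ k^alpha: alpha = {slope:.2f}")
```

Because the definitions of R and σ² above are assumptions, the fitted exponent from such a sketch need not reproduce the reported −1.66 ± 0.12; it only illustrates how a power-law fit against k = 1/N would be set up.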
Keywords: Shannon information; concordances; ranking; search; attention

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

van Putten, M.H.P.M. Discovery of Kolmogorov Scaling in the Natural Language. Entropy 2017, 19, 198.
