Open Access Article
Entropy 2016, 18(4), 110; doi:10.3390/e18040110

Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families

Matilde Marcolli
Department of Mathematics, California Institute of Technology, Pasadena, CA 91125, USA
Academic Editors: Frédéric Barbaresco, Frank Nielsen and Kevin H. Knuth
Received: 14 January 2016 / Revised: 13 March 2016 / Accepted: 18 March 2016 / Published: 7 April 2016
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)

Abstract

We present a simple computational approach to assigning a measure of complexity and information/entropy to families of natural languages, based on syntactic parameters and the theory of error-correcting codes. We associate to each language a binary string of syntactic parameters, and to a language family a binary code whose code words are the binary strings associated to the individual languages. We then evaluate the code parameters (rate and relative minimum distance) and the position of these parameters with respect to the asymptotic bound of error-correcting codes and the Gilbert–Varshamov bound. These bounds are related, respectively, to the Kolmogorov complexity and the Shannon entropy of the code, and this gives us a computationally simple way to obtain estimates of the complexity and information not of individual languages but of language families. From the linguistic point of view, this notion of complexity is related to the degree of variability of syntactic parameters across languages belonging to the same (historical) family.
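As a quick illustration of the coding-theoretic quantities the abstract refers to, the following is a minimal sketch, not taken from the paper: it computes the rate R = log2(|C|)/n and relative minimum distance delta = d/n of a binary code given as equal-length bit strings, and compares the point (delta, R) with the Gilbert–Varshamov curve R = 1 - H2(delta), where H2 is the binary Shannon entropy. The four five-bit "languages" are hypothetical toy data, not actual syntactic parameter values.

from itertools import combinations
from math import log2

def binary_entropy(p):
    # Binary Shannon entropy H2(p), with the convention H2(0) = H2(1) = 0.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def code_parameters(codewords):
    # Rate R = log2(|C|)/n and relative minimum distance delta = d/n
    # for a (generally nonlinear) binary code given as bit strings.
    n = len(codewords[0])
    # Minimum Hamming distance over all pairs of distinct code words.
    d = min(sum(a != b for a, b in zip(u, v))
            for u, v in combinations(codewords, 2))
    return log2(len(codewords)) / n, d / n

# Hypothetical toy "family": four languages, five syntactic parameters each.
family = ["00000", "00011", "01100", "10101"]
R, delta = code_parameters(family)
gv = 1.0 - binary_entropy(delta)  # Gilbert-Varshamov curve at this delta
print(f"R = {R:.3f}, delta = {delta:.3f}, GV bound R_GV(delta) = {gv:.3f}")

In this toy example the point (delta, R) = (0.4, 0.4) lies above the Gilbert–Varshamov curve; in the paper, it is the position of a family's code relative to this curve and to the asymptotic bound that carries the entropy and complexity information.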
Keywords: syntax; principles and parameters; error-correcting codes; asymptotic bound; Kolmogorov complexity; Gilbert–Varshamov bound; Shannon entropy
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Marcolli, M. Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families. Entropy 2016, 18, 110.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
