Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families
Abstract
We present a simple computational approach to assigning a measure of complexity and information/entropy to families of natural languages, based on syntactic parameters and the theory of error-correcting codes. We associate to each language a binary string of syntactic parameters, and to a language family a binary code whose code words are the binary strings associated to the individual languages. We then evaluate the code parameters (rate and relative minimum distance) and the position of these parameters with respect to the asymptotic bound of error-correcting codes and the Gilbert–Varshamov bound. These bounds are related, respectively, to the Kolmogorov complexity and the Shannon entropy of the code, which gives us a computationally simple way to obtain estimates of the complexity and information not of individual languages but of language families. From the linguistic point of view, this notion of complexity is related to the degree of variability of syntactic parameters across languages belonging to the same (historical) family.
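The quantities the abstract refers to can be sketched in a few lines of code. The following is a minimal illustration, not the author's implementation: it treats a toy "language family" of hypothetical binary parameter strings (the data is invented for illustration) as a binary code, computes its rate R = log2(|C|)/n and relative minimum distance δ = d/n, and checks the position of the point (R, δ) relative to the Gilbert–Varshamov curve R = 1 − H2(δ).

```python
import math
from itertools import combinations

def code_parameters(codewords):
    """Rate and relative minimum distance of a binary code given as bit strings."""
    n = len(codewords[0])
    assert all(len(w) == n for w in codewords)
    # Rate R = log2(|C|) / n
    rate = math.log2(len(codewords)) / n
    # Minimum Hamming distance over all pairs of code words
    d = min(sum(a != b for a, b in zip(u, v))
            for u, v in combinations(codewords, 2))
    return rate, d / n

def H2(x):
    """Binary entropy function H2(x) = -x log2 x - (1-x) log2 (1-x)."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

# Hypothetical toy family: 4 "languages", 8 binary syntactic parameters each
family = ["10110010", "10100110", "00110011", "11100010"]
R, delta = code_parameters(family)
# The point (delta, R) lies above the Gilbert-Varshamov curve when R > 1 - H2(delta)
above_gv = R > 1 - H2(delta)
```

For a real application the strings would come from a syntactic-parameter database (the paper uses parameter data for historical language families), and the interest is in where the resulting (δ, R) point falls relative to the GV and asymptotic bounds.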
Marcolli, M. Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families. Entropy 2016, 18, 110.