
Towards Demystifying Shannon Entropy, Lossless Compression and Approaches to Statistical Machine Learning

by Hector Zenil 1,2,3
1 Information Dynamics Lab, Karolinska Institute, 171 76 Stockholm, Sweden
2 Oxford Immune Algorithmics, Reading RG3 1EU, UK
3 Algorithmic Nature Group, LABORES, 75005 Paris, France
Presented at Morphological, Natural, Analog and Other Unconventional Forms of Computing for Cognition and Intelligence (MORCOM), Berkeley, CA, USA, 2–6 June 2019.
Proceedings 2020, 47(1), 24; https://doi.org/10.3390/proceedings2020047024
Published: 19 June 2020
(This article belongs to the Proceedings of IS4SI 2019 Summit)
Abstract: Current approaches in science, including most machine and deep learning methods, rely at their core on traditional statistics and information theory. Yet these theories are known to miss fundamental properties of data and the world related to recursive and computable phenomena, and they are ill-equipped for high-level functions such as inference, abstraction, modelling and causation, making them fragile and easily deceived. How, then, have some of these approaches been (apparently) applied so successfully? We explore recent attempts to adopt more powerful, albeit more difficult, methods based on the theories of computability and algorithmic probability, which may eventually capture these higher-level elements of human intelligence. We propose that a fundamental question in science, namely how to accelerate the adoption of proven mathematical tools, is answered by shortening the adoption cycle and abandoning old practices in favour of new ones. Randomness is a case in point: science continues to rely on purely statistical tools to disentangle randomness from meaning, privileging regression and correlation, even though mathematics has made important advances in characterising randomness that have yet to be incorporated into scientific theory and practice.
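The abstract's central claim, that purely statistical measures are blind to computable structure, can be illustrated concretely. The following Python sketch is an illustrative example of mine, not taken from the paper: the Thue–Morse sequence is a well-known deterministic, computable sequence (its generator is a one-liner, so its algorithmic complexity is tiny), yet its bit-level Shannon entropy is maximal and thus indistinguishable from genuinely random bits. A general-purpose lossless compressor (zlib here, an LZ-family relative of the LZW named in the keywords) acts as a crude upper bound on algorithmic complexity and does detect the structure that entropy misses.

```python
import math
import os
import zlib

def thue_morse_bits(n):
    """First n bits of the Thue-Morse sequence: t(k) = parity of popcount(k).
    Fully deterministic and computable, hence of tiny Kolmogorov complexity."""
    return [bin(k).count("1") & 1 for k in range(n)]

def shannon_entropy(bits):
    """Empirical Shannon entropy (bits per symbol) from single-bit frequencies."""
    p1 = sum(bits) / len(bits)
    if p1 in (0.0, 1.0):
        return 0.0
    p0 = 1.0 - p1
    return -(p0 * math.log2(p0) + p1 * math.log2(p1))

def pack(bits):
    """Pack a bit list into bytes so the compressor sees raw binary data."""
    return bytes(
        sum(b << (7 - i) for i, b in enumerate(bits[j:j + 8]))
        for j in range(0, len(bits) - 7, 8)
    )

n = 2 ** 16
tm = thue_morse_bits(n)
# Genuinely (cryptographically) random bits for comparison.
rnd = [(byte >> i) & 1 for byte in os.urandom(n // 8) for i in range(8)]

print("Shannon entropy, Thue-Morse:", shannon_entropy(tm))    # ~1.0 bit/symbol
print("Shannon entropy, random:    ", shannon_entropy(rnd))   # ~1.0 bit/symbol
print("zlib bytes, Thue-Morse:", len(zlib.compress(pack(tm), 9)))
print("zlib bytes, random:    ", len(zlib.compress(pack(rnd), 9)))
```

On a typical run both sequences report an entropy of about 1.0 bit per symbol, while zlib shrinks the Thue–Morse bytes to a small fraction of their length and leaves the random bytes essentially uncompressed. This is the gap the paper points to: compression-based estimates, though themselves limited, can see computable regularity that entropy-style statistics cannot.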
Keywords: Shannon entropy; machine learning; algorithmic complexity; Kolmogorov complexity; feasibility; LZW; causality vs. correlation; Algorithmic Information Dynamics
MDPI and ACS Style

Zenil, H. Towards Demystifying Shannon Entropy, Lossless Compression and Approaches to Statistical Machine Learning. Proceedings 2020, 47, 24.

