Open Access Article
Entropy 2019, 21(1), 60; https://doi.org/10.3390/e21010060

From Learning to Consciousness: An Example Using Expected Float Entropy Minimisation

Mathematical Institute, University of Oxford, Oxford OX2 6GG, UK
Received: 12 November 2018 / Revised: 4 January 2019 / Accepted: 4 January 2019 / Published: 13 January 2019
(This article belongs to the Section Complexity)

Abstract

Over recent decades, several mathematical theories of consciousness have been put forward, including Karl Friston's Free Energy Principle and Giulio Tononi's Integrated Information Theory. In this article, we further investigate a theory based on Expected Float Entropy (EFE) minimisation, which has been under development since 2012. EFE involves a version of Shannon entropy parameterised by relationships. It turns out that, for systems with bias due to learning, certain choices of the relationship parameters are isolated, since they give much lower EFE values than others; hence, the system defines relationships. It is proposed that, in the context of all these relationships, a brain state acquires meaning in the form of the relational content of the associated experience. EFE minimisation is itself an association learning process, and its effectiveness as such is tested in this article. The theory and results are consistent with the proposition that there is a close connection between association learning processes and the emergence of consciousness. Such a theory may explain how the brain defines the content of consciousness up to relationship isomorphism.
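The abstract describes EFE as a version of Shannon entropy parameterised by relationships, and notes that learning-induced bias lowers entropy-based scores for certain parameter choices. As a minimal illustration of the base quantity only, the sketch below computes plain Shannon entropy of an observed state sequence; this is not Mason's EFE (the relationship parameters and the expectation over typical data are defined in the paper itself), and the example states and function name are illustrative assumptions.

```python
import math
from collections import Counter

def shannon_entropy(states):
    """Shannon entropy H = -sum_x p(x) * log2(p(x)) of an observed
    sequence of system states, using empirical frequencies for p."""
    counts = Counter(states)
    n = len(states)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A system biased by learning concentrates probability on fewer states
# and so scores lower entropy than an unbiased (uniform) system.
biased = ["a"] * 6 + ["b"] * 2          # p(a)=0.75, p(b)=0.25
uniform = ["a", "b", "c", "d"] * 2      # four equiprobable states
print(shannon_entropy(biased))   # ≈ 0.811 bits
print(shannon_entropy(uniform))  # = 2.0 bits
```

The gap between the two values is the kind of separation the paper exploits: with bias present, some parameter choices are isolated by markedly lower (expected) entropy scores.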
Keywords: float entropy; consciousness and relationships; typical data; structures implied by neural networks
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Share & Cite This Article

Mason, J.W.D. From Learning to Consciousness: An Example Using Expected Float Entropy Minimisation. Entropy 2019, 21, 60.

Entropy, EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.