Special Issue "Entropy in Human Brain Networks"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: closed (31 July 2015)

Special Issue Editor

Guest Editor
Prof. Dr. Wassim M. Haddad

School of Aerospace Engineering, Georgia Institute of Technology, Atlanta, GA, USA
Website | E-Mail
Interests: nonlinear systems; large-scale systems; hierarchical control; robust and adaptive control; impulsive and hybrid systems; system thermodynamics; network systems; systems biology; clinical pharmacology; mathematical neuroscience

Special Issue Information

Dear Colleagues,

An important area of science where a dynamical systems framework of thermodynamics can prove invaluable is neuroscience. Advances in neuroscience have been closely linked to mathematical modeling, beginning with the integrate-and-fire model of Lapicque and proceeding through the modeling of the action potential by Hodgkin and Huxley to the current era of mathematical neuroscience. Neuroscience has always had models to interpret experimental results from a high-level complex systems perspective; however, expressing these models with dynamic equations rather than words fosters precision, completeness, and self-consistency. Nonlinear dynamical systems theory, and in particular system thermodynamics, is ideally suited for rigorously describing the behavior of large-scale networks of neurons.

Merging the two universalisms of thermodynamics and dynamical systems theory with neuroscience can provide the theoretical foundation for understanding the network properties of the brain by rigorously addressing large-scale interconnected biological neuronal network models that govern the neuroelectronic behavior of biological excitatory and inhibitory neuronal networks. As in thermodynamics, neuroscience is a theory of large-scale systems wherein graph theory can be a very useful tool for capturing the connectivity properties of system interconnections: neurons are represented by nodes, synapses by edges or arcs, and synaptic efficacy by edge weights, giving rise to a weighted adjacency matrix that governs the underlying directed graph network topology. However, unlike thermodynamics, wherein energy spontaneously flows from a state of higher temperature to a state of lower temperature, neuron membrane potential variations occur due to ion species exchanges, which evolve from regions of higher concentration to regions of lower concentration. This evolution, moreover, does not occur spontaneously, but rather requires the opening and closing of specific gates within specific ion channels. The purpose of this Special Issue is to use a dynamical systems framework merged with thermodynamic state notions (i.e., entropy, energy, free energy, chemical potential, etc.) to provide a fundamental understanding of the network properties of the brain.
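The graph-theoretic description above can be made concrete with a small numerical sketch. The network, weights, and neuron count below are purely hypothetical illustrations (not taken from any paper in this issue): a directed neuronal network is encoded as a weighted adjacency matrix, with positive weights standing in for excitatory synapses and negative weights for inhibitory ones.

```python
import numpy as np

# Hypothetical 4-neuron directed network: entry W[i, j] is the synaptic
# efficacy of the connection from neuron j to neuron i (0 = no synapse).
# Positive weights model excitatory synapses, negative weights inhibitory.
W = np.array([
    [0.0,  0.8,  0.0, -0.3],
    [0.5,  0.0,  0.6,  0.0],
    [0.0, -0.4,  0.0,  0.7],
    [0.2,  0.0,  0.0,  0.0],
])

# Edges of the underlying directed graph as (source, target) pairs.
targets, sources = np.nonzero(W)
edges = list(zip(sources.tolist(), targets.tolist()))

# In- and out-degree of each node: basic connectivity properties
# readable directly off the adjacency matrix.
in_degree = (W != 0).sum(axis=1)
out_degree = (W != 0).sum(axis=0)

print(edges)
print(in_degree, out_degree)
```

The same matrix supports the dynamical analyses the editorial alludes to, e.g., as the coupling term in a network of neuronal state equations.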

Prof. Dr. Wassim M. Haddad
Guest Editor

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1400 CHF (Swiss Francs).


Keywords

  • biological neural networks
  • spiking and firing rate models
  • dissipative nonlinear dynamical systems
  • fractal variability and chaos
  • interacting solitons
  • theorems addressing stability in nonlinear networks
  • cyclo-dissipative systems
  • computational neuroscience
  • spatio-temporal networks
  • bistability and multistability theory
  • mathematical neuroscience
  • network systems
  • cellular control systems
  • uncertain dynamical networks
  • complex dynamical systems
  • stochastic networks and ergodic systems
  • system thermodynamics
  • large-scale interconnected systems
  • synchronization of biological networks
  • arrow of time and the conscious brain
  • brain network dynamics
  • cortical field theory
  • neurodynamics of consciousness
  • neurodynamics of attention
  • cortical modeling
  • neural coding
  • theoretical neuroscience
  • neurophysiology of consciousness
  • thermodynamics of the human brain

Published Papers (9 papers)


Research

Jump to: Review

Open Access Article Single to Two Cluster State Transition of Primary Motor Cortex 4-posterior (MI-4p) Activities in Humans
Entropy 2015, 17(11), 7596-7607; doi:10.3390/e17117596
Received: 11 September 2015 / Accepted: 30 October 2015 / Published: 3 November 2015
Abstract
The human primary motor cortex has dual representation of the digits, namely, area 4 anterior (MI-4a) and area 4 posterior (MI-4p). We have previously demonstrated that activation of these two functional subunits can be identified independently by functional magnetic resonance imaging (fMRI) using independent component-cross correlation-sequential epoch (ICS) analysis. Subsequent studies in patients with hemiparesis due to subcortical lesions and monoparesis due to peripheral nerve injury demonstrated that MI-4p represents the initiation area of activation, whereas MI-4a is the secondarily activated motor cortex requiring a “long-loop” feedback input from secondary motor systems, likely the cerebellum. A dynamic model of hand motion based on the limit cycle oscillator predicts that a specific pattern of entrainment of neural firing may occur under appropriate periodic stimuli. Under normal conditions, such entrainment introduces a single phase cluster. Under pathological conditions where the entrainment stimuli have insufficient strength, the phase cluster splits into two clusters. Observable physiological phenomena of this shift from a single cluster to two clusters are a doubling of the firing rate of output neurons, or a decay in the group firing density of the system due to damping of odd harmonic components. While the former is not testable in humans, the latter can be tested by appropriately designed fMRI experiments, whose quantitative index is believed to reflect the group behavior of functionally localized neurons, e.g., the firing density in the dynamic theory. Accordingly, we performed dynamic analysis of MI-4p activation in normal volunteers and paretic patients. The results clearly indicated that MI-4p exhibits a transition from a single to a two phase-cluster state, which coincided with loss of MI-4a activation. The study demonstrated that motor dysfunction (hemiparesis) in patients with a subcortical infarct is not simply due to afferent fiber disruption. Maintaining proper afferent signals from MI-4p requires proper functionality of MI-4a and, hence, appropriate feedback signals from the secondary motor system.
(This article belongs to the Special Issue Entropy in Human Brain Networks)
Open Access Article Dynamical Change of Signal Complexity in the Brain During Inhibitory Control Processes
Entropy 2015, 17(10), 6834-6853; doi:10.3390/e17106834
Received: 15 May 2015 / Revised: 21 September 2015 / Accepted: 6 October 2015 / Published: 9 October 2015
Abstract
The ability to inhibit impulses and withdraw certain responses is essential for human survival in a fast-changing environment. These processes happen quickly, in a complex manner, and require the brain to adapt rapidly in order to inhibit the impulsive response. The present study employs multiscale entropy (MSE) to analyze electroencephalography (EEG) signals acquired alongside a behavioral stop-signal task, quantifying the complexity (indicating adaptability and efficiency) of neural systems in order to investigate the dynamical change of complexity in the brain during inhibitory control. We found that the complexity of EEG signals was higher for successful than for unsuccessful inhibition in the peri-stimulus stage, but not in the pre-stimulus time window. In addition, we found that the dynamical change in the brain from the pre-stimulus to the peri-stimulus stage for inhibitory control is a process of decreasing complexity. We demonstrated with both sensor-level and source-level MSE that this loss of complexity is temporally slower and spatially restricted for successful inhibition, and temporally quicker and spatially extensive for unsuccessful inhibition.
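Multiscale entropy, as used in the study above, is a two-step procedure: coarse-grain the signal at each scale by non-overlapping averaging, then compute the sample entropy of each coarse-grained series. The following is a minimal sketch of the generic MSE algorithm (in the style of Costa et al.), not the authors' actual analysis pipeline; the parameter values m = 2 and r = 0.15 are common defaults, not taken from the paper.

```python
import numpy as np

def sample_entropy(x, m=2, tol=None):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points (Chebyshev distance <= tol) also
    match for m + 1 points. Self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    if tol is None:
        tol = 0.15 * x.std()
    n = len(x)

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length + 1)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance to every later template.
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= tol))
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, scales=range(1, 6), m=2, r=0.15):
    """Sample entropy of coarse-grained versions of x. The tolerance is
    fixed from the original signal's standard deviation, as is standard."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    curve = []
    for s in scales:
        n = len(x) // s
        # Non-overlapping averages of s consecutive samples.
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        curve.append(sample_entropy(coarse, m=m, tol=tol))
    return curve

# White noise loses complexity under coarse-graining, so its MSE curve decays.
rng = np.random.default_rng(0)
mse = multiscale_entropy(rng.standard_normal(1000))
```

Complex physiological signals, by contrast, tend to maintain high sample entropy across scales, which is what makes the MSE curve a complexity index.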
Open Access Article Differentiating Interictal and Ictal States in Childhood Absence Epilepsy through Permutation Rényi Entropy
Entropy 2015, 17(7), 4627-4643; doi:10.3390/e17074627
Received: 31 March 2015 / Revised: 18 June 2015 / Accepted: 25 June 2015 / Published: 2 July 2015
Cited by 1
Abstract
Permutation entropy (PE) has been widely exploited to measure the complexity of the electroencephalogram (EEG), especially when complexity is linked to diagnostic information embedded in the EEG. Recently, the authors proposed a PE-based spatial-temporal analysis of the EEG recordings of absence epilepsy patients. The goal here is to improve the ability of PE to discriminate interictal states from ictal states in absence seizure EEG. For this purpose, a parametric definition of permutation entropy is introduced to the field of epileptic EEG analysis: the permutation Rényi entropy (PEr). PEr has been extensively tested against PE by tuning the involved parameters (order, delay time, and alpha). The achieved results demonstrate that PEr outperforms PE, as there is a statistically significant, wider gap between the PEr levels during the interictal states and the PEr levels observed in the ictal states compared to PE. PEr also outperformed PE as the input to a classifier aimed at discriminating interictal from ictal states.
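The permutation Rényi entropy can be sketched directly from its definition: estimate the distribution of ordinal patterns of a chosen order and delay, then apply the Rényi entropy of order alpha to that distribution (alpha → 1 recovers ordinary permutation entropy). This is a generic illustration, not the authors' implementation; the default parameter values are hypothetical.

```python
import numpy as np
from collections import Counter
from math import factorial, log

def permutation_renyi_entropy(x, order=3, delay=1, alpha=2.0):
    """Rényi entropy of the ordinal-pattern distribution of x,
    normalized by log(order!) so the result lies in [0, 1]."""
    x = np.asarray(x, dtype=float)
    n_windows = len(x) - (order - 1) * delay
    # Map each delay-embedded window to the permutation that sorts it.
    patterns = Counter(
        tuple(np.argsort(x[i:i + (order - 1) * delay + 1:delay]))
        for i in range(n_windows)
    )
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    if abs(alpha - 1.0) < 1e-12:
        h = float(-np.sum(p * np.log(p)))             # Shannon limit: ordinary PE
    else:
        h = float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))
    return h / log(factorial(order))

# A monotone ramp has a single ordinal pattern, so its entropy is zero;
# white noise visits all order! patterns nearly uniformly, so it is near one.
rng = np.random.default_rng(0)
pe_ramp = permutation_renyi_entropy(np.arange(100.0))
pe_noise = permutation_renyi_entropy(rng.standard_normal(2000))
```

Tuning alpha reweights rare versus frequent ordinal patterns, which is the extra degree of freedom the paper exploits to widen the interictal-ictal gap.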
Open Access Article Symbolic Entropy of the Amplitude rather than the Instantaneous Frequency of EEG Varies in Dementia
Entropy 2015, 17(2), 560-579; doi:10.3390/e17020560
Received: 17 October 2014 / Accepted: 26 January 2015 / Published: 29 January 2015
Abstract
The dynamics of human electroencephalography (EEG) have been shown to be related to cognitive activities. This study separately assessed two EEG components, amplitude and rhythm, aiming to capture their individual contributions to cognitive functions. We extracted the local peaks of EEGs under rest or photic stimulation and calculated the symbolic dynamics of their voltages (amplitude) and interpeak intervals (instantaneous frequency) individually. The sample consisted of 89 geriatric outpatients in three groups: 38 fresh cases of vascular dementia (VD), 22 fresh cases of Alzheimer’s disease (AD), and 29 controls. Both sample entropy and the number of forbidden words revealed significantly less regular symbolic dynamics in the whole EEG tracings of the VD group than of the AD and control groups. We found consistent results between groups with the symbolic dynamics of the local-peak voltage sequence rather than the interpeak interval sequence. Photic stimulation amplified the differences between groups. These results suggest that the EEG dynamics that relate to either cognitive functions or the underlying pathologies of dementia are embedded within the dynamics of the amount of, but not the interval between, each synchronized firing of adjacent cerebral neurons.
Open Access Article The Effects of Spontaneous Random Activity on Information Transmission in an Auditory Brain Stem Neuron Model
Entropy 2014, 16(12), 6654-6666; doi:10.3390/e16126654
Received: 30 April 2014 / Revised: 8 December 2014 / Accepted: 15 December 2014 / Published: 19 December 2014
Abstract
This paper presents the effects of spontaneous random activity on information transmission in an auditory brain stem neuron model. In computer simulations, the supra-threshold synaptic current stimuli ascending from auditory nerve fibers (ANFs) were modeled by a filtered inhomogeneous Poisson process modulated by sinusoidal functions at frequencies of 220–3520 Hz, in keeping with the human speech spectrum. Stochastic sodium and stochastic high- and low-threshold potassium channels were incorporated into a single-compartment model of the soma in spherical bushy neurons, so as to realize threshold fluctuations and variations in spike firing times. The results show that the information rates estimated from the entropy of inter-spike intervals of spike trains tend toward a convex function of the spontaneous rates as the intensity of the sinusoidal functions decreases. Furthermore, the results show that this convexity in the spontaneous rates tends to disappear as the frequency of the sinusoidal function increases, such that the phase-locked response can no longer be observed. It is concluded that this sort of stochastic resonance (SR) phenomenon, which depends on the spontaneous rates with supra-threshold stimuli, can better enhance information transmission at smaller intensities of the sinusoidal functions within the human speech spectrum, much as regular SR can enhance weak signals.
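The stimulus model described above, an inhomogeneous Poisson process with a sinusoidally modulated rate, can be simulated with the generic Lewis-Shedler thinning method. This is an illustrative sketch, not the paper's simulation; the mean rate, modulation depth, and 220 Hz frequency below are hypothetical values chosen only to fall within the stated speech-spectrum range.

```python
import numpy as np

def sinusoidal_poisson_spikes(rate_mean, rate_amp, freq_hz, duration_s, rng):
    """Spike times of an inhomogeneous Poisson process whose rate is
    lambda(t) = rate_mean + rate_amp * sin(2*pi*freq_hz*t), generated
    by thinning a homogeneous process at the peak rate."""
    lam_max = rate_mean + rate_amp
    # Candidate events from a homogeneous Poisson process at the peak rate.
    n_candidates = rng.poisson(lam_max * duration_s)
    t = np.sort(rng.uniform(0.0, duration_s, n_candidates))
    # Accept each candidate with probability lambda(t) / lam_max.
    lam_t = rate_mean + rate_amp * np.sin(2 * np.pi * freq_hz * t)
    keep = rng.uniform(0.0, lam_max, len(t)) < lam_t
    return t[keep]

rng = np.random.default_rng(1)
spikes = sinusoidal_poisson_spikes(100.0, 80.0, 220.0, 1.0, rng)
```

Requiring rate_amp <= rate_mean keeps the instantaneous rate non-negative; the expected spike count over one second equals the mean rate.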
Open Access Article Fractal Structure and Entropy Production within the Central Nervous System
Entropy 2014, 16(8), 4497-4520; doi:10.3390/e16084497
Received: 21 May 2014 / Revised: 8 July 2014 / Accepted: 28 July 2014 / Published: 12 August 2014
Cited by 8
Abstract
Our goal is to explore the relationship between two traditionally unrelated concepts, fractal structure and entropy production, evaluating both within the central nervous system (CNS). Fractals are temporal or spatial structures with self-similarity across scales of measurement, whereas entropy production represents the necessary exportation of entropy to our environment that comes with metabolism and life. Fractals may be measured by their fractal dimension, and human entropy production may be estimated by oxygen and glucose metabolism. In this paper, we observe fractal structures ubiquitously present in the CNS and explore a hypothetical and unexplored link between fractal structure and entropy production, as measured by oxygen and glucose metabolism. Rapid increases in both fractal structure and metabolism occur with childhood and adolescent growth, followed by slow decreases during aging. Concomitant increases and decreases in fractal structure and metabolism occur with cancer and with Alzheimer’s disease and multiple sclerosis, respectively. In addition to fractals being related to entropy production, we hypothesize that fractal structures emerge spontaneously because a fractal is more efficient at dissipating energy gradients, thus maximizing entropy production. Experimental evaluation and further understanding of limitations and necessary conditions are indicated to address the broad scientific and clinical implications of this work.
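The fractal dimension invoked above is commonly estimated by box counting: cover the normalized point set with boxes of side eps, count the occupied boxes N(eps), and take the slope of log N(eps) versus log(1/eps). The sketch below is a generic estimator, not the paper's method; the test data is a uniformly filled square, whose box-counting dimension should come out near 2.

```python
import numpy as np

def box_counting_dimension(points, sizes):
    """Estimate the box-counting (fractal) dimension of a 2-D point set
    by regressing log N(eps) against log(1/eps)."""
    points = np.asarray(points, dtype=float)
    # Normalize each coordinate into [0, 1] so box sizes are comparable.
    points = (points - points.min(axis=0)) / np.ptp(points, axis=0)
    counts = []
    for eps in sizes:
        # Occupied boxes = distinct integer grid cells at resolution eps.
        boxes = np.unique(np.floor(points / eps).astype(int), axis=0)
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
square = rng.uniform(size=(20000, 2))
dim = box_counting_dimension(square, [1 / 4, 1 / 8, 1 / 16, 1 / 32])
```

For genuinely fractal structures such as dendritic arbors, the estimate falls strictly between the topological and embedding dimensions.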
Open Access Article Human Brain Networks: Spiking Neuron Models, Multistability, Synchronization, Thermodynamics, Maximum Entropy Production, and Anesthetic Cascade Mechanisms
Entropy 2014, 16(7), 3939-4003; doi:10.3390/e16073939
Received: 6 May 2014 / Revised: 19 June 2014 / Accepted: 3 July 2014 / Published: 17 July 2014
Cited by 3
Abstract
Advances in neuroscience have been closely linked to mathematical modeling beginning with the integrate-and-fire model of Lapicque and proceeding through the modeling of the action potential by Hodgkin and Huxley to the current era. The fundamental building block of the central nervous system, the neuron, may be thought of as a dynamic element that is “excitable”, and can generate a pulse or spike whenever the electrochemical potential across the cell membrane of the neuron exceeds a threshold. A key application of nonlinear dynamical systems theory to the neurosciences is to study phenomena of the central nervous system that exhibit nearly discontinuous transitions between macroscopic states. A very challenging and clinically important problem exhibiting this phenomenon is the induction of general anesthesia. In any specific patient, the transition from consciousness to unconsciousness as the concentration of anesthetic drugs increases is very sharp, resembling a thermodynamic phase transition. This paper focuses on multistability theory for continuous and discontinuous dynamical systems having a set of multiple isolated equilibria and/or a continuum of equilibria. Multistability is the property whereby the solutions of a dynamical system can alternate between two or more mutually exclusive Lyapunov stable and convergent equilibrium states under asymptotically slowly changing inputs or system parameters. In this paper, we extend the theory of multistability to continuous, discontinuous, and stochastic nonlinear dynamical systems. In particular, Lyapunov-based tests for multistability and synchronization of dynamical systems with continuously differentiable and absolutely continuous flows are established. The results are then applied to excitatory and inhibitory biological neuronal networks to explain the underlying mechanism of action for anesthesia and consciousness from a multistable dynamical system perspective, thereby providing a theoretical foundation for general anesthesia using the network properties of the brain. Finally, we present some key emergent properties from the fields of thermodynamics and electromagnetic field theory to qualitatively explain the underlying neuronal mechanisms of action for anesthesia and consciousness.
Open Access Article Searching for Conservation Laws in Brain Dynamics—BOLD Flux and Source Imaging
Entropy 2014, 16(7), 3689-3709; doi:10.3390/e16073689
Received: 1 May 2014 / Revised: 26 June 2014 / Accepted: 27 June 2014 / Published: 3 July 2014
Abstract
Blood-oxygen-level-dependent (BOLD) imaging is the most important noninvasive tool to map human brain function. It relies on local blood-flow changes controlled by neurovascular coupling effects, usually in response to some cognitive or perceptual task. In this contribution, we ask whether the spatiotemporal dynamics of the BOLD signal can be modeled by a conservation law. In analogy to the description of physical laws, which often can be derived from some underlying conservation law, identification of conservation laws in the brain could lead to new models for the functional organization of the brain. Our model is independent of the nature of the conservation law, but we discuss possible hints and motivations for conservation laws. For example, globally limited blood supply and local competition between brain regions for blood might restrict the large-scale BOLD signal in ways that could be observable. One proposed selective pressure for the evolution of such conservation laws is the closed volume of the skull, which limits the expansion of brain tissue through increases in blood volume. These ideas are demonstrated on a mental motor imagery fMRI experiment, in which functional brain activation was mapped in a group of volunteers imagining themselves swimming. In order to search for local conservation laws during this complex cognitive process, we derived maps of quantities resulting from spatial interaction of the BOLD amplitudes. Specifically, we mapped fluxes and sources of the BOLD signal, terms that would appear in a description by a continuity equation. While we cannot present final answers with this particular analysis of this particular experiment, some results seem non-trivial. For example, we found that during the task the group BOLD flux covered more widespread regions than those identified by conventional BOLD mapping and was always increasing. It is our hope that these results motivate more work towards the search for conservation laws in neuroimaging experiments, or at least towards imaging procedures based on spatial interactions of signals. The payoff could be new models for the dynamics of the healthy brain or more sensitive clinical imaging approaches, respectively.

Review

Jump to: Research

Open Access Review Applying Information Theory to Neuronal Networks: From Theory to Experiments
Entropy 2014, 16(11), 5721-5737; doi:10.3390/e16115721
Received: 3 June 2014 / Revised: 27 July 2014 / Accepted: 28 October 2014 / Published: 3 November 2014
Cited by 2
Abstract
Information theory is being increasingly used to analyze complex, self-organizing processes on networks, predominantly in analytical and numerical studies. Perhaps one of the most paradigmatic complex systems is a network of neurons, in which cognition arises from the information storage, transfer, and processing among individual neurons. In this article, we review experimental techniques suitable for validating information-theoretical predictions in simple neural networks, as well as for generating new hypotheses. Specifically, we focus on techniques that may be used to measure both network (microcircuit) anatomy and neuronal activity simultaneously. This is needed to study the role of the network structure in the emergent collective dynamics, which is one of the reasons to study the characteristics of information processing. We discuss in detail two suitable techniques, namely calcium imaging and the application of multi-electrode arrays to simple neural networks in culture, and discuss their advantages and limitations in a manner accessible to non-experts. In particular, we show that each technique induces a qualitatively different type of error on the measured mutual information. The ultimate goal of this work is to bridge the gap between theorists and experimentalists in their shared goal of understanding the behavior of networks of neurons.
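The mutual information discussed in the review can be estimated, in the simplest case, with a plug-in histogram estimator: bin the two signals jointly, normalize the counts into a joint distribution, and sum p(x,y) log2 of p(x,y)/(p(x)p(y)). This generic sketch (with the estimator's well-known upward bias for independent data) is not the estimator used by the reviewed techniques; the bin count and sample data are hypothetical.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in (histogram) estimate of I(X; Y) in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (rows)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (columns)
    nz = pxy > 0                          # skip zero-probability cells
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])))

rng = np.random.default_rng(0)
x = rng.standard_normal(50000)
noise = rng.standard_normal(50000)
mi_dependent = mutual_information(x, x + 0.5 * noise)   # well above zero
mi_independent = mutual_information(x, noise)           # near zero, small positive bias
```

The residual positive value for independent inputs illustrates the finite-sample bias that measurement errors, of the kind the review analyzes, further distort.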

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel.: +41 61 683 77 34
Fax: +41 61 302 89 18