Proceeding Paper

Effects of Neuronal Noise on Neural Communication †

by Deniz Gençağa 1,* and Sevgi Şengül Ayan 2,*

1 Department of Electrical and Electronics Engineering, Antalya Bilim University, Antalya 07190, Turkey
2 Department of Industrial Engineering, Antalya Bilim University, Antalya 07190, Turkey
* Authors to whom correspondence should be addressed.
Presented at the 39th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Garching, Germany, 30 June–5 July 2019.
Proceedings 2019, 33(1), 2; https://doi.org/10.3390/proceedings2019033002
Published: 19 November 2019

Abstract

In this work, we propose an approach to better understand the effects of neuronal noise on neural communication systems. Here, we extend the fundamental Hodgkin-Huxley (HH) model by adding synaptic couplings to represent the statistical dependencies among different neurons under the effect of additional noise. We estimate directional information-theoretic quantities, such as the Transfer Entropy (TE), to infer the couplings between neurons under different noise levels. Based on our computational simulations, we demonstrate that these nonlinear systems can behave beyond our predictions and that TE is an ideal tool for extracting such dependencies from data.

1. Introduction

Mathematical models and analysis have long been powerful tools for answering important questions in biology, and the work of Hodgkin and Huxley on nerve conduction is one of the best examples [1]. In 1952, after many years of theoretical and experimental work by physiologists, HH proposed a mathematical model to explain the action potential generation of neurons; conductance-based models of this type have since been defined for many different electrically excitable cells [2,3,4]. Despite the rapid growth in the number of analyses of the communication between neurons, the effect of noise has generally been overlooked in the literature. Recently, neuronal noise effects have started to be incorporated into models, motivated by a phenomenon called "stochastic resonance" [5]. The communication between neurons is maintained by electrical signals called "action potentials" (APs). If an action potential generated in response to a stimulus exceeds a certain threshold value, the signal is referred to as a "spike". Because the existence of a spike is determined by a threshold, an additional noise component can easily push the value of an AP above or below that threshold and thus change the neural spike-train code. Therefore, noise is not merely a nuisance factor; it is capable of changing the meaning of the "neuronal code". For this reason, to better understand how such changes can occur in a very complex system like our brain, we must first understand the underlying working principles of neuronal noise, which sets the framework of our investigation.
Here, we utilize information theory to better understand the effects of neuronal noise on the overall communication. To this end, we generalize the HH model so that noise can be added to the system in addition to the coupling among the neurons. In the literature, the effect of coupling among different neurons has been explored by using TE [6]; however, to the best of our knowledge, the effects of noise on these interactions have not been fully considered yet.
On the other hand, certain types of models have been suggested to include noise in the HH model [7] without any coupling between the neurons. Here, we approach this complicated modeling problem by using a simplified version that includes two neurons, coupling between them, and additional noise terms, and we propose utilizing information theory to analyze the relationships in neural communication.
In the literature, information-theoretic quantities such as entropy, Mutual Information (MI) and Transfer Entropy (TE) have been successfully utilized to analyze the statistical dependencies and relationships between random variables of highly complex systems [8]. Among these, MI is a symmetric quantity reporting the dependency between two variables, whereas TE is an asymmetric quantity that can be used to infer the direction of the interaction (distinguishing affecting from affected variables) [9]. All of the above quantities are calculated from observational data by inferring probability distributions. Despite the wide variety of distribution estimation techniques, the whole procedure still suffers from adverse effects, such as bias. The most common techniques in probability distribution estimation involve histograms [10], Parzen windows [11] and adaptive methods [12]. Histogram estimation is widely used due to its computational simplicity. For estimates from data to be reliable, reporting the statistical significance of each estimate [13] constitutes an important part of the methods.
In this work, we propose utilizing TE to investigate the directional relationships between the coupled neurons of an HH model under noisy conditions. To this end, we extend the traditional HH model and analyze the effect of noise on the directional relationships between the coupled neurons. As a first approach to modeling noisy neuronal interaction, we demonstrate the effect for several levels of noise power in our simulations. Based on these simulations, we observe that the original interactions are preserved despite many changes in the structure of the neuronal code. Our future work will generalize this model to N neurons and the effect of noise on their interactions.

2. Materials and Methods

2.1. The Hodgkin-Huxley Model

In this study we use the Hodgkin-Huxley model, which mimics the spiking behavior of neurons recorded from the squid giant axon. It is the first mathematical model describing action potential generation and one of the major breakthroughs of computational neuroscience [1]. Hodgkin and Huxley published the model in 1952 and were later awarded the Nobel Prize for this work; Hodgkin-Huxley-type models have since been defined for many different electrically excitable cells, such as cardiomyocytes [2], pancreatic beta cells [3] and hormone-secreting pituitary cells [4]. They observed that cell membranes behave much like electrical circuits. The basic circuit element is the phospholipid bilayer of the cell, which behaves like a capacitor that accumulates ionic charge as the electrical potential across the membrane changes. Moreover, the ionic permeabilities of the membrane are analogous to resistors in a circuit, and the electrochemical driving forces are analogous to batteries driving the ionic currents. Na+, K+, Ca2+ and Cl− ions are responsible for almost all of the electrical activity in the body. Thus, the electrical behavior of cells is based upon the transfer and storage of ions, and Hodgkin and Huxley observed that K+ and Na+ ions are the main contributors in the squid giant axon.
The mathematical description of the Hodgkin-Huxley model starts with the membrane potential V, whose dynamics follow from the conservation of electric charge:
C_m \frac{dV}{dt} = -I_{ion} + I_{app}    (1)
where C_m is the membrane capacitance, I_{app} is the applied current and I_{ion} represents the sum of the individual ionic currents, each modeled according to Ohm's law:
I_{ion} = g_K (V - V_K) + g_{Na} (V - V_{Na}) + g_L (V - V_L).    (2)
Here g_K, g_Na and g_L are conductances, and V_Na, V_K, V_L are the reversal potentials associated with the currents. Hodgkin and Huxley observed that the conductances are also voltage dependent. They found that g_K depends on four activation gates, defined as g_K = ḡ_K n^4, whereas g_Na depends on three activation gates and one inactivation gate, modeled as g_Na = ḡ_Na m^3 h. In the HH model, the ionic currents are defined as:
I_{Na} = \bar{g}_{Na} m^3 h (V - V_{Na})    (3)
I_K = \bar{g}_K n^4 (V - V_K)    (4)
I_L = \bar{g}_L (V - V_L)    (5)
with Na+ activation variable m, Na+ inactivation variable h, and K+ activation variable n. Here the overbar ḡ denotes maximal conductances. The activation and inactivation dynamics of the channels evolve according to the differential equations below:
\frac{dm}{dt} = \frac{m_\infty(V_i) - m}{\tau_m(V_i)}    (6)
\frac{dh}{dt} = \frac{h_\infty(V_i) - h}{\tau_h(V_i)}    (7)
\frac{dn}{dt} = \frac{n_\infty(V_i) - n}{\tau_n(V_i)}    (8)
The steady-state activation and inactivation functions, together with the time constants, are defined as below; the transition rates \alpha_x and \beta_x are given in Table 1.
x_\infty(V_i) = \frac{\alpha_x(V_i)}{\alpha_x(V_i) + \beta_x(V_i)}    (9)
\tau_x(V_i) = \frac{1}{\alpha_x(V_i) + \beta_x(V_i)}, \quad x = m, h, n    (10)
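To make the model concrete, the sketch below integrates Equations (1)–(10) with a simple Euler scheme, using the rates and parameter values of Table 1. It is a minimal Python illustration under our own naming choices (hh_rates, hh_step); it is not the code used for the reported simulations.

import numpy as np

def hh_rates(V):
    # Transition rates (ms^-1) from Table 1 (standard squid-axon HH kinetics).
    am = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(V + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(V + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    an = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(V + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def hh_step(V, m, h, n, I_drive, dt=0.1):
    # One Euler step of Equations (1)-(8); parameter values from Table 1.
    g_na, g_k, g_l = 120.0, 36.0, 0.3     # maximal conductances (mS/cm^2)
    v_na, v_k, v_l = 50.0, -77.0, -54.4   # reversal potentials (mV)
    c_m = 1.0                             # membrane capacitance (uF/cm^2)
    am, bm, ah, bh, an, bn = hh_rates(V)
    I_ion = (g_na * m**3 * h * (V - v_na)
             + g_k * n**4 * (V - v_k)
             + g_l * (V - v_l))
    V_new = V + dt * (I_drive - I_ion) / c_m
    # alpha*(1-x) - beta*x is the rate form of Equations (6)-(10).
    m_new = m + dt * (am * (1.0 - m) - bm * m)
    h_new = h + dt * (ah * (1.0 - h) - bh * h)
    n_new = n + dt * (an * (1.0 - n) - bn * n)
    return V_new, m_new, h_new, n_new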

2.2. Information Theoretic Quantities

In information theory, the Shannon entropy is defined as the average uncertainty about which particular state 'x', out of a set of possible states 'X', the system occupies, where p(x) denotes the probability of that state. It is also used to quantify the amount of information needed to describe a dataset. The Shannon entropy is given by the following formula:
H(X) = -\sum_{x \in X} p(x) \log p(x)    (11)
Mutual information (MI) is another fundamental information-theoretic quantity, used to quantify the information shared between two datasets. Given two datasets denoted by X and Y, the MI can be written as follows:
MI(X, Y) = \sum_{x \in X} \sum_{y \in Y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}    (12)
The MI is a symmetric quantity and can be rewritten as a sum and difference of Shannon entropies:
MI(X, Y) = H(X) + H(Y) - H(X, Y)    (13)
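As an illustration, the quantities in (11)–(13) can be estimated from data with histograms; the sketch below is a minimal Python version, where the bin count and function names are our own choices.

import numpy as np

def entropy_hist(x, bins=32):
    # Shannon entropy (11) in bits, from a histogram estimate of p(x).
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def mi_hist(x, y, bins=32):
    # Mutual information via the entropy identity (13):
    # MI(X, Y) = H(X) + H(Y) - H(X, Y).
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = counts / counts.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    h = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return h(px) + h(py) - h(pxy)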
where H(X, Y) is the joint Shannon entropy. If there is a directional dependency between the variables, such as a cause-and-effect relationship, a symmetric measure cannot unveil this information from data. In the literature, TE was proposed to analyze the directional dependencies between two Markov processes. To quantify the directional effect of a variable X on Y, the TE compares the conditional distribution of Y given the past samples of both processes with the conditional distribution of Y given only its own past values [14]. Thus, the asymmetry of TE helps us distinguish the two directions of information flow. The TE definitions in both directions (between variables X and Y) are given by the following equations:
TE_{X \to Y} = T\left(Y_{i+1} \mid Y_i^{(k)}, X_i^{(l)}\right) = \sum_{y_{i+1}, y_i^{(k)}, x_i^{(l)}} p\left(y_{i+1}, y_i^{(k)}, x_i^{(l)}\right) \log_2 \frac{p\left(y_{i+1} \mid y_i^{(k)}, x_i^{(l)}\right)}{p\left(y_{i+1} \mid y_i^{(k)}\right)}    (14)

TE_{Y \to X} = T\left(X_{i+1} \mid X_i^{(k)}, Y_i^{(l)}\right) = \sum_{x_{i+1}, x_i^{(k)}, y_i^{(l)}} p\left(x_{i+1}, x_i^{(k)}, y_i^{(l)}\right) \log_2 \frac{p\left(x_{i+1} \mid x_i^{(k)}, y_i^{(l)}\right)}{p\left(x_{i+1} \mid x_i^{(k)}\right)}    (15)
where x_i^{(k)} = (x_i, \ldots, x_{i-k+1}) and y_i^{(l)} = (y_i, \ldots, y_{i-l+1}) are past states, and the processes are assumed to be Markov of finite order, so that the next sample of each process depends only on its k (respectively l) previous values. In the literature, k and l are also known as the embedding dimensions.
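For the simplest embedding, k = l = 1, Equation (14) reduces to a sum over triplets (y_{i+1}, y_i, x_i) and can be estimated from a three-dimensional histogram. The sketch below illustrates this under our own binning choices; it is not necessarily the exact estimator used for the reported results.

import numpy as np

def transfer_entropy(x, y, bins=16):
    # TE_{X -> Y} from Equation (14) with embedding dimensions k = l = 1.
    triples = np.column_stack([y[1:], y[:-1], x[:-1]])
    counts, _ = np.histogramdd(triples, bins=bins)
    p_yyx = counts / counts.sum()      # p(y_{i+1}, y_i, x_i)
    p_yy = p_yyx.sum(axis=2)           # p(y_{i+1}, y_i)
    p_yx = p_yyx.sum(axis=0)           # p(y_i, x_i)
    p_y = p_yyx.sum(axis=(0, 2))       # p(y_i)
    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                p = p_yyx[a, b, c]
                if p > 0:
                    # p(y+|y,x)/p(y+|y) = p(y+,y,x) p(y) / (p(y,x) p(y+,y))
                    te += p * np.log2(p * p_y[b] / (p_yx[b, c] * p_yy[a, b]))
    return te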
All of the above quantities involve the estimation of probability distributions from the observed data. Among the many approaches in the literature, we utilize the histogram-based method to estimate the distributions in (14) and (15), due to its computational simplicity. In order to assess the statistical significance of the TE estimates, surrogate data testing is applied and the p-values are reported.
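A minimal sketch of such a surrogate test is given below, reusing the illustrative transfer_entropy function above: shuffling the source series destroys its temporal structure while preserving its amplitude distribution, so the TE values of the shuffled surrogates form a null distribution for the observed estimate.

import numpy as np

def te_significance(x, y, n_surr=100, bins=16, seed=0):
    # p-value: fraction of shuffled-source surrogates whose TE reaches
    # the TE observed on the original data.
    rng = np.random.default_rng(seed)
    te_obs = transfer_entropy(x, y, bins=bins)
    te_null = np.array([transfer_entropy(rng.permutation(x), y, bins=bins)
                        for _ in range(n_surr)])
    p_value = np.mean(te_null >= te_obs)
    return te_obs, p_value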

2.3. The Proposed Method

In this paper we focus on a system of two coupled HH neurons with synaptic coupling from neuron 1 to neuron 2, to which normally distributed current noise is added, based on the action potential generation of the squid axon. Each neuron involves a fast sodium current I_{i,Na}, a delayed-rectifier potassium current I_{i,K} and a leak current I_{i,L}, measured in μA/cm² for i = 1, 2. The differential equations for the rate of change of the voltages of these two neurons are given as follows:
C_m \frac{dV_1}{dt} = I_{1,app} - I_{1,Na} - I_{1,K} - I_{1,L} + N(0, \sigma)    (16)
C_m \frac{dV_2}{dt} = I_{2,app} - I_{2,Na} - I_{2,K} - I_{2,L} + k (V_2 - V_1) + N(0, \sigma)    (17)
where V_1 is the membrane voltage of the 1st neuron and V_2 is the membrane voltage of the 2nd neuron. Here, N(0, σ) denotes the noise, drawn from a normal distribution with zero mean and standard deviation σ. The synaptic coupling is defined simply through the voltage difference as I_syn = k(V_2 − V_1), where k is the synaptic coupling strength. When k is between 0 and 0.25, spiking activity occurs with a unique stable limit-cycle solution. Beyond k = 0.25, the system returns to a stable steady state and the spiking activity disappears. All other dynamics are the same as described in Section 2.1.
First, we estimate the TE between V_2 and V_1 in the noiseless case of (16) and (17). Secondly, we include the noise components in (16) and (17) and estimate the TE between V_2 and V_1 again. This comparison demonstrates the effects of noise on the information flow between the neurons. At first glance at Equations (16) and (17), we can conclude that the direction of the information flow in the noiseless case must be from V_1 to V_2. However, when the noise is added, it is tedious to reach the same conclusion, as the added noise is capable of creating additional spikes and destroying existing ones. The simulation results in the next section demonstrate these findings and provide promising evidence for generalizing our model to more complex neuronal interactions under noise.
The model is implemented in the XPPAUT software [15] using the Euler method (dt = 0.1 ms).
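For readers without XPPAUT, the same Euler scheme can be sketched in Python by reusing the illustrative hh_step function from the sketch in Section 2.1; the initial conditions and the per-step noise injection below are our own assumptions.

import numpy as np

def simulate_pair(T=1000.0, dt=0.1, k=0.1, sigma=1.0, I_app=8.0, seed=0):
    # Euler integration of the coupled system (16)-(17): neuron 1 drives
    # neuron 2 through k*(V2 - V1); both neurons receive independent
    # N(0, sigma) current noise at every step.
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    V = np.array([-65.0, -65.0])
    m = np.array([0.05, 0.05])
    h = np.array([0.60, 0.60])
    n = np.array([0.32, 0.32])
    V1, V2 = np.empty(steps), np.empty(steps)
    for i in range(steps):
        noise = rng.normal(0.0, sigma, size=2)
        drive = np.array([I_app + noise[0],
                          I_app + k * (V[1] - V[0]) + noise[1]])
        for j in range(2):
            V[j], m[j], h[j], n[j] = hh_step(V[j], m[j], h[j], n[j],
                                             drive[j], dt)
        V1[i], V2[i] = V[0], V[1]
    return V1, V2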

3. Results

Information Flow Changes with Coupling Strength and Noise Level

Here, we first studied a system of two HH neurons globally coupled through a synapse by varying the coupling strength k, without any noise effect. The phase dynamics of our system for two different coupling strengths k are plotted in Figure 1, where we define two different coupling patterns. In Figure 1a, neuron 2 fires once after neuron 1 fires twice, which we call 2-to-1 coupling (k = 0.1). For a larger coupling coefficient (k = 0.2), the neurons show a different synchronous firing pattern, as in Figure 1b: this time, each firing of neuron 2 follows that of neuron 1, which we call 1-to-1 coupling.
To better understand the effects of noise on our network, we use zero-mean Gaussian random variables with standard deviation σ. When we incorporate this noise with different variances into our model, as in (16) and (17), we observe a change in the synchronization of the neurons. Moreover, the obvious patterns disappear entirely for larger noise amounts, as shown in Figure 2 for each coupled network. Noise can change the synchronization of neurons by inducing or deleting spikes in the network. Since noise plays such an important role in changing the dynamics of the network, we need a mechanism to uncover these changed patterns under the noise effect in order to explain the behavior of the neuronal network. Therefore, we utilize TE to extract this pattern from the observed voltage data.
To better understand the information flow between the neurons, we estimate the TE between the action potentials of the two neurons in (16) and (17), as described in Section 2.2, for increasing noise intensity. The results are plotted in Figure 3 for both the 2-to-1 and the 1-to-1 coupled systems. As expected, for the network without noise (σ = 0) the transfer entropy value is highest for both systems. This verifies that Neuron 1 causes the changes in Neuron 2.
To see the effects of noise, we explore the TE between the neurons for increasing values of the σ parameter. From Figure 3, we note that the TE values decrease with increasing noise intensity. Although this can be expected, it is of utmost importance to emphasize the noiseless case: according to Figure 3 and Table 2, without a noise component in the model the TE behaves in the opposite way, i.e., the smaller TE value corresponds to the higher coupling strength (compare k = 0.1 and k = 0.2 at σ = 0). Another interesting finding is the varying pattern of TE at low σ values: the TE first increases and then keeps decreasing. This unexpected result shows that we cannot easily predict the direction of coupling without a TE analysis, as the synaptic couplings are nonlinear in nature.
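Combining the sketches above, an experiment in the style of Figure 3 and Table 2 can be reproduced as follows; the σ grid below is a subset of the values in Table 2, and the script is illustrative only.

# Sweep the noise intensity for both coupling strengths and estimate
# TE(V1 -> V2), combining simulate_pair and transfer_entropy from above.
for k in (0.1, 0.2):
    for sigma in (0.0, 0.5, 1.0, 2.0, 5.0, 10.0):
        V1, V2 = simulate_pair(k=k, sigma=sigma)
        te = transfer_entropy(V1, V2)
        print(f"k = {k:.2f}  sigma = {sigma:4.1f}  TE = {te:.4f}")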

4. Discussion and Conclusions

We study the information flow in a coupled network of two neurons under two different coupling states and increasing noise levels, where the neuron models and the synaptic interactions are derived from the Hodgkin-Huxley model. We construct the model so that it can generate either 2-to-1 or 1-to-1 coupling between the neurons. In order to recover these relationships from data, we propose a TE-based approach and successfully analyze the effects of the couplings under various noise intensities. These results help us better understand the interactions between neurons in real biological systems, and this work is of particular importance for exploring larger networks with more complex noisy interactions.

Author Contributions

D.G. and S.Ş.A. designed the studies; D.G. performed transfer entropy simulations; S.Ş.A. performed mathematical modelling; D.G. and S.Ş.A. analyzed data; D.G. and S.Ş.A. wrote the paper. All authors have seen and approved the final version of the manuscript.

Funding

This research was funded by TUBITAK 1001 grant number 118E765.

Acknowledgments

We thank graduate student Lemayian Joelponcha for performing some of the simulations.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Hodgkin, A.L.; Huxley, A.F. A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 1952, 117, 500–544.
2. Pandit, S.V.; Clark, R.B.; Giles, W.R.; Demir, S.S. A Mathematical Model of Action Potential Heterogeneity in Adult Rat Left Ventricular Myocytes. Biophys. J. 2001, 81, 3029–3051.
3. Bertram, R.; Sherman, A. A calcium-based phantom bursting model for pancreatic islets. Bull. Math. Biol. 2004, 66, 1313–1344.
4. Duncan, P.J.; Şengül, S.; Tabak, J.; Ruth, P.; Bertram, R.; Shipston, M.J. Large conductance Ca2+-activated K+ (BK) channels promote secretagogue-induced transition from spiking to bursting in murine anterior pituitary corticotrophs. J. Physiol. 2015, 593, 1197–1211.
5. Moss, F.; Ward, L.M.; Sannita, W.G. Stochastic resonance and sensory information processing: a tutorial and review of application. Clin. Neurophysiol. 2004, 115, 267–281.
6. Li, Z.; Li, X. Estimating temporal causal interaction between spike trains with permutation and transfer entropy. PLoS ONE 2013, 8, e70894.
7. Goldwyn, J.H.; Shea-Brown, E. The what and where of adding channel noise to the Hodgkin-Huxley equations. PLoS Comput. Biol. 2011, 7, e1002247.
8. Gencaga, D. Transfer Entropy (Entropy Special Issue Reprint); MDPI: Basel, Switzerland, 2018.
9. Gencaga, D.; Knuth, K.H.; Rossow, W.B. A Recipe for the Estimation of Information Flow in a Dynamical System. Entropy 2015, 17, 438–470.
10. Knuth, K.H. Optimal data-based binning for histograms. arXiv 2006, arXiv:physics/0605197.
11. Scott, D.W. Multivariate Density Estimation: Theory, Practice, and Visualization, 2nd ed.; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2015.
12. Darbellay, G.A.; Vajda, I. Estimation of the information by an adaptive partitioning of the observation space. IEEE Trans. Inf. Theory 1999, 45, 1315–1321.
13. Timme, N.M.; Lapish, C.C. A tutorial for information theory in neuroscience. eNeuro 2018, 5.
14. Schreiber, T. Measuring information transfer. Phys. Rev. Lett. 2000, 85, 461–464.
15. XPPAUT Software. Available online: http://www.math.pitt.edu/~bard/xpp/xpp.html (accessed on 30 June 2019).
Figure 1. Sample spike patterns for two different network configurations: (a) 2-to-1 coupling and (b) 1-to-1 coupling.
Figure 2. Noise changes the synchronization of the network.
Figure 3. Transfer entropy results for the 2-to-1 and 1-to-1 coupled HH networks.
Table 1. Transition rates and parameter values for the HH Model.
Transition rates (ms⁻¹):
α_m = 0.1 (V_i + 40) / (1 − exp(−(V_i + 40)/10))
β_m = 4 exp(−(V_i + 65)/18)
α_h = 0.07 exp(−(V_i + 65)/20)
β_h = 1 / (1 + exp(−(V_i + 35)/10))
α_n = 0.01 (V_i + 55) / (1 − exp(−(V_i + 55)/10))
β_n = 0.125 exp(−(V_i + 65)/80)

Parameter values:
C_m = 1 μF/cm²
I_app = 8 μA/cm²
ḡ_Na = 120 mS/cm²
ḡ_K = 36 mS/cm²
ḡ_L = 0.3 mS/cm²
V_Na = 50 mV
V_K = −77 mV
V_L = −54.4 mV
Table 2. Transfer entropy values for different noise intensities σ.
σ        0       0.5     1       2       3       5       6       7       8       9        10
k = 0.1  0.0665  0.0170  0.0168  0.0169  0.0129  0.0059  0.0039  0.0014  0.0013  0.000806 0.000512
k = 0.2  0.0557  0.0264  0.0206  0.0261  0.0242  0.0165  0.0148  0.0095  0.0064  0.0054   0.0033