Article

A Mechanistic Model of Perceptual Binding Predicts That Binding Mechanism Is Robust against Noise

Pavel Kraikivski

Division of Systems Biology, Academy of Integrated Science, Virginia Polytechnic Institute and State University, Blacksburg, VA 24061, USA
Entropy 2024, 26(2), 133; https://doi.org/10.3390/e26020133
Submission received: 11 December 2023 / Revised: 28 January 2024 / Accepted: 30 January 2024 / Published: 31 January 2024
(This article belongs to the Special Issue Temporo-Spatial Theory of Consciousness (TTC))

Abstract

The concept of the brain’s own time and space is central to many models and theories that aim to explain how the brain generates consciousness. For example, the temporo-spatial theory of consciousness postulates that the brain implements its own inner time and space for conscious processing of the outside world. Furthermore, our perception and cognition of time and space can be different from actual time and space. This study presents a mechanistic model of mutually connected processes that encode phenomenal representations of space and time. The model is used to elaborate the binding mechanism between two sets of processes representing internal space and time, respectively. Further, a stochastic version of the model is developed to investigate the interplay between binding strength and noise. Spectral entropy is used to characterize noise effects on the systems of interacting processes when the binding strength between them is varied. The stochastic modeling results reveal that the spectral entropy values for strongly bound systems are similar to those for weakly bound or even decoupled systems. Thus, the analysis performed in this study allows us to conclude that the binding mechanism is noise-resilient.

1. Introduction

A mechanism that provides a unified conscious representation of a scene that is characterized by different perceptual features is known as perceptual binding [1,2,3]. Thus, the primary function of the binding mechanism is to unify the sensory information processed in different parts of the brain to give us a unitary conscious experience of an object or scene. Several mechanisms have been proposed to solve the binding problem. Temporal neuronal synchrony models propose that different perceptual features are bound together when the firing activities of neurons processing these features are synchronized [2,3,4,5]. Similarly, the temporo-spatial theory of consciousness (TTC) suggests that temporal alignment permits binding between a stimulus and ongoing spontaneous neural activity [6,7]. Operational Architectonics suggests that binding is achieved with operational synchrony among neuronal processes occurring in different brain regions [8,9]. Alternatives to temporal synchrony have also been proposed [10]. In this work, a stochastic mechanistic model of binding is developed and presented based on my previous works [11,12]. The model enables quantification of the interplay between noise and binding.
Noise in neurons may generate significant fluctuations in neuronal responses [13,14], yet sensory features represented by neuronal circuits remain stable [15]. For example, noise affects neuronal signals transmitted by the sensory-motor system [14], operation of voltage-gated channels [16,17], synaptic activity [18,19], potential differences across nerve cell membranes [20], propagation of action potentials [21], and spike train coding [22]. Furthermore, noise can change information processing in sub-threshold periodic signals by helping these signals cross the threshold. Such noise-induced transmission of information has been detected in sensory neurons [23] and mechanoreceptor cells [24,25]. The information capacity of neuronal networks also depends on noise [26]. Additive noise can increase the mutual information of threshold neurons [26,27,28]. Nevertheless, very little is known about how phenomenal states and perceptual binding remain robust against noise despite ubiquitous noise sources in neural circuits. In my previous work, I showed that entropy decreases with an increase in the size of a network that contains negative feedback loops interconnecting processes [12]. In this study, I investigate the interplay between noise and binding strength using the same framework, in which bound phenomenal states are encoded in relationships among processes. A phenomenal state (quale) is postulated to be a dynamic property of running processes, and is isomorphic to the executed relationships among the processes [29,30].
Other theories of consciousness have also postulated that the emergence of a conscious experience is associated with a specific action or execution performed by the brain. The theory of neuronal group selection (TNGS) postulates that qualia are high-dimensional discriminations of specific conscious scenes among a vast repertoire of different possible conscious scenes [31], and differences in qualia are determined by differences in neural structure and dynamics. Similarly, according to the integrated information theory (IIT), qualia arise from the reduction of uncertainty when a particular conscious state occurs out of a repertoire of alternative states [32]. Complex systems with larger numbers of possible states generate more information by reducing uncertainty and, thus, generate complex and vivid conscious experiences. For example, consider the conscious experience of a spatial position of a point-like object (i.e., without shape or any other features except the location) in empty space. According to IIT, the conscious experience of the point location occurs when the brain reduces uncertainty by ruling out all possible different positions of that point in space. However, within this framework, it is not clear how the reduction of uncertainty can recur continuously in time when the phenomenal state is retained in consciousness over time. By contrast, in the dynamical framework presented in my study, the continuous execution of processes is an inherent attribute of the framework. A phenomenal state arises from the execution of relationships among processes and is then isomorphic to the executed relationships. Therefore, the phenomenal state is a dynamic property that exists as long as the execution of this property by the system continues in time. In the above example with the spatial position, the phenomenal state isomorphic to a specific position in space would arise when the relationships between a process assigned to that position and the processes assigned to all other possible locations in space are executed. Furthermore, the phenomenal state, in this case, would represent not a single point by itself, but the point within the internal phenomenal space.
In this work, I present a mechanistic model that describes perceptual binding between a system’s encoded space and time, which are isomorphic to Euclidean space and time, respectively. The same framework can also be applied to other examples of perceptual binding [11]. The main goal of this work is to investigate how binding is affected by noise. The implications of noise for the system are quantified using spectral entropy, and the results indicate that the binding mechanism is robust against noise.

2. Materials and Methods

A system of oscillating processes, together with the relationships among those processes, is used to represent the physical carrier of phenomenal states. The system's internal representations of space and time are assumed to be encoded in relationships among the processes that are described by the following variables: $P(t) = (p_1(t), p_2(t), \ldots, p_n(t))$ and $Q(t) = (q_1(t), q_2(t), \ldots, q_m(t))$, where $t$ is regular external time. There are $n$ oscillating processes encoding space and $m$ processes encoding time. The internal space and time are encoded in the relationships among processes, which describe how the processes influence each other or interact. For the brain or neural networks, such relationships would be set by entrainment with external stimuli that are placed at different spatial positions and act with varying time intervals. I assume that the system of processes is already entrained and, thus, each process has specific relationships to all other processes, $P = AP$ and $Q = BQ$, where the structure of the $A$ and $B$ matrices represents memory, which should be isomorphic to real space and time. The elements of the $A$ and $B$ matrices are independent of time. However, the relationships among processes encoded in $A$ and $B$ are continuously executed as all processes continuously oscillate in time. This is an important concept of this framework, in which a phenomenal state (quale) is assumed to be a property of a dynamical system that emerges and exists when that property is realized or "happens". Therefore, the relationships among processes must be continuously executed, yet the specific relationships among processes must be maintained over time, as long as the experience of the corresponding phenomenal state is unchanged. Although, in general, the relationships among processes can be nonlinear (e.g., relationships between two processes that are represented by a limit cycle in the phase plane), here I assume that phenomenal space and time are linear and isomorphic to Euclidean space. Therefore, the hollow Euclidean-distance matrices below:
$$A = \begin{pmatrix} 0 & \varepsilon & \cdots & (n-1)^2\varepsilon \\ \varepsilon & 0 & \cdots & (n-2)^2\varepsilon \\ \vdots & \vdots & \ddots & \vdots \\ (n-1)^2\varepsilon & (n-2)^2\varepsilon & \cdots & 0 \end{pmatrix} \qquad (1)$$

and

$$B = \begin{pmatrix} 0 & \alpha & \cdots & (m-1)^2\alpha \\ \alpha & 0 & \cdots & (m-2)^2\alpha \\ \vdots & \vdots & \ddots & \vdots \\ (m-1)^2\alpha & (m-2)^2\alpha & \cdots & 0 \end{pmatrix} \qquad (2)$$
are used to represent the following relationships among processes: $P = AP$ and $Q = BQ$. Thus, for the $p_i$ and $q_i$ components of $P$ and $Q$, we can also write:
$$p_i = \sum_{j=1}^{n} (i-j)^2 \varepsilon \, p_j \qquad (3)$$

and

$$q_i = \sum_{j=1}^{m} (i-j)^2 \alpha \, q_j, \qquad (4)$$
where $\varepsilon$ and $\alpha$ are scaling parameters for the "distance and interval" measures between processes. If the sets of processes $P$ and $Q$ that describe the internal representations of space and time are not coupled, then their dynamics are described by the following systems of ordinary differential equations:
$$\frac{dP}{dt} = AP - (X + P) \qquad (5)$$

$$\frac{dX}{dt} = P \qquad (6)$$

$$\frac{dQ}{dt} = BQ - (Z + Q) \qquad (7)$$

$$\frac{dZ}{dt} = Q, \qquad (8)$$
The systems of Equations (5)–(8) have the following analytical solutions: $P = AP$ and $Q = BQ$ with oscillating $P = K\cos\lambda t + L\sin\lambda t$ and $Q = H\cos\eta t + R\sin\eta t$, where $K$, $L$, $H$, $R$ are sets of amplitude values and $\lambda$ and $\eta$ are frequencies. Additionally, $X(t) = (x_1(t), x_2(t), \ldots, x_n(t))$ and $Z(t) = (z_1(t), z_2(t), \ldots, z_m(t))$ are sets of auxiliary processes. If the processes representing internal space and time are bound, then Equations (5) and (7) must be coupled by including terms that describe an interaction between the $P$ and $Q$ processes. Different coupling schemes were investigated in my previous study [11]. Here, the number of modeled processes is reduced to simplify the stochastic modeling that is used to investigate noise effects on the coupled systems. A dynamical system that contains a set of two processes, $P(t) = (p_1(t), p_2(t))$, representing the internal space, bound to two processes, $Q(t) = (q_1(t), q_2(t))$, representing the internal time, can be described by the following system of coupled equations:
$$\frac{dp_1}{dt} = \varepsilon p_2 - p_1 - x_1 + \omega f_1(q_1, q_2) \qquad (9)$$

$$\frac{dp_2}{dt} = \varepsilon p_1 - p_2 - x_2 + \omega f_2(q_1, q_2) \qquad (10)$$

$$\frac{dx_1}{dt} = p_1 \qquad (11)$$

$$\frac{dx_2}{dt} = p_2 \qquad (12)$$

$$\frac{dq_1}{dt} = \alpha q_2 - q_1 - z_1 + \omega g_1(p_1, p_2) \qquad (13)$$

$$\frac{dq_2}{dt} = \alpha q_1 - q_2 - z_2 + \omega g_2(p_1, p_2) \qquad (14)$$

$$\frac{dz_1}{dt} = q_1 \qquad (15)$$

$$\frac{dz_2}{dt} = q_2 \qquad (16)$$
The binding interaction between the $(p_1, p_2, x_1, x_2)$ and $(q_1, q_2, z_1, z_2)$ sets of processes is described by the $f_1(q_1, q_2)$, $f_2(q_1, q_2)$ and $g_1(p_1, p_2)$, $g_2(p_1, p_2)$ functions. Here, the functions are set as $f_1(q_1, q_2) = q_1$, $f_2(q_1, q_2) = q_2$, $g_1(p_1, p_2) = -p_1$, and $g_2(p_1, p_2) = -p_2$. This interaction scheme is shown in Figure 1a. The binding strength between the $p_i$ and $q_i$ processes depends on the parameter $\omega$. The sign of parameter $\varepsilon$ determines whether the $p_1$ and $p_2$ processes are mutually activating ($\varepsilon > 0$) or inhibiting ($\varepsilon < 0$) each other. Similarly, the sign of parameter $\alpha$ determines whether the $q_1$ and $q_2$ processes are mutually activating ($\alpha > 0$) or inhibiting ($\alpha < 0$) each other. This interaction scheme with a fixed coupling constant $\omega = 1$ and an alternative wiring—$f_1(q_1, q_2) = q_1 - q_2$, $f_2(q_1, q_2) = q_2 - q_1$, $g_1(p_1, p_2) = p_2 - p_1$, and $g_2(p_1, p_2) = p_1 - p_2$—were investigated in my previous study [11]. In this work, the dynamic behavior of the system is analyzed as a function of the coupling strength parameter $\omega$. As an example, numerical solutions of Equations (9)–(16) for the $P(t) = (p_1(t), p_2(t))$ and $Q(t) = (q_1(t), q_2(t))$ processes, obtained for three different binding strength parameter values ($\omega$ = 0.1, 0.5, and 1), are shown in Figure 1b–d.
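To make the setup above concrete, the minimal Python sketch below builds the hollow Euclidean-distance matrix of Equations (1) and (2) for an arbitrary number of processes and integrates the coupled two-process system of Equations (9)–(16) for the binding strengths used in Figure 1. The use of NumPy/SciPy and the function names are illustrative choices of mine; the published results were generated with the XPP/XPPAUT code in Appendix A.

import numpy as np
from scipy.integrate import solve_ivp

def distance_matrix(size, scale):
    # Hollow matrix with entries (i - j)^2 * scale and zeros on the diagonal,
    # cf. Equations (1) and (2); for size = 2 it reduces to [[0, scale], [scale, 0]].
    idx = np.arange(size)
    return scale * (idx[:, None] - idx[None, :]) ** 2

def coupled_system(t, s, eps, alpha, w):
    # State s = (p1, p2, x1, x2, q1, q2, z1, z2); Equations (9)-(16) with
    # f1 = q1, f2 = q2, g1 = -p1, g2 = -p2.
    p1, p2, x1, x2, q1, q2, z1, z2 = s
    return [eps * p2 - p1 - x1 + w * q1,
            eps * p1 - p2 - x2 + w * q2,
            p1,
            p2,
            alpha * q2 - q1 - z1 - w * p1,
            alpha * q1 - q2 - z2 - w * p2,
            q1,
            q2]

A = distance_matrix(2, -1.0)       # example of Eq. (1) for n = 2, epsilon = -1
s0 = [1, 0, 0, 0, 1, 0, 0, 0]      # initial conditions used in Figure 1
for w in (0.1, 0.5, 1.0):          # binding strengths of Figure 1b-d
    sol = solve_ivp(coupled_system, (0, 100), s0, args=(-1.0, -1.0, w), max_step=0.05)
    # sol.t and sol.y hold trajectories analogous to those shown in Figure 1b-d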
Next, the system of Equations (9)–(16) is converted into a stochastic model using Gillespie's method. For the system $S = (p_1, p_2, x_1, x_2, q_1, q_2, z_1, z_2)$, the states are updated using the following general Gillespie's scheme [33] (an illustrative Python sketch of this update loop is given after the list):
  • Initialize the process state vector, $S$, and set the initial time at 0.
  • Calculate the propensities, $a_k(S)$.
  • Generate a uniform random number, $r_1$.
  • Compute the time for the next event, $\tau = -\frac{1}{\sum_k a_k(S)} \ln r_1$.
  • Generate a uniform random number, $r_2$.
  • Find which event is next, $I = i$, if $\frac{\sum_{k=1}^{i-1} a_k(S)}{\sum_k a_k(S)} \le r_2 < \frac{\sum_{k=1}^{i} a_k(S)}{\sum_k a_k(S)}$.
  • Update the state vector, $S \to S + y_i$.
  • Update time, $t \to t + \tau$.
  • Repeat steps (2)–(8).
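A minimal, generic Python sketch of this update loop is given below. The propensity function and the state-change vectors $y_i$ are placeholders that must be supplied for the particular system; they are not the specific propensities implemented in Code B of Appendix A.

import numpy as np

rng = np.random.default_rng()

def gillespie(S, propensities, state_changes, t_end):
    # S: state vector; propensities(S) returns the array a_k(S);
    # state_changes[i] is the state-change vector y_i for event i.
    t, history = 0.0, [(0.0, S.copy())]
    while t < t_end:
        a = propensities(S)
        a0 = a.sum()
        if a0 <= 0:
            break                                    # no further events possible
        r1, r2 = rng.random(2)
        tau = -np.log(r1) / a0                       # time to the next event
        i = np.searchsorted(np.cumsum(a) / a0, r2, side="right")  # which event occurs
        S = S + state_changes[i]                     # update the state vector
        t += tau                                     # update time
        history.append((t, S.copy()))
    return history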
The stochastic model is used to characterize the interplay between the binding strength, ω , and noise. All numerical solutions of Equations (9)–(16) are obtained using XPP/XPPAUT software (http://www.math.pitt.edu/~bard/xpp/xpp.html, accessed on 4 November 2023). The XPP/XPPAUT codes that are sufficient to reproduce all results presented in this work are provided in Appendix A. Code A is used to generate results for the deterministic model described by Equations (9)–(16), and Code B is used to perform stochastic simulations of the model and produce the corresponding model results.
Spectrum analysis and spectral entropy are used to quantify noise effects on the system. Spectral analysis is a common tool in signal processing and in neurophysiological studies [34,35,36,37]. Spectral entropy is based on Shannon’s entropy formalism, which is a foundational concept of information theory [38]. The entropy metric is an important component of the information integration theory of consciousness [39,40]. I have used spectral analysis tools to study noise effects on systems of different sizes, which are described by Equations (5) and (6) [12]. Here, I use the same method to compute spectral entropy for two systems of bound processes (Equations (9)–(16)), in order to characterize the interplay between binding strength and noise.
The spectral entropy value H is computed using the following equation:
$$H = -k \sum_{j=1}^{2048} \widehat{PSD}_j \log_2\left(\widehat{PSD}_j\right), \qquad (17)$$
where $k = \frac{1}{\log_2 2048} \approx 0.1$ and $\widehat{PSD}$ is the normalized power spectral density, computed by dividing the power spectral density by the total power [41]. The power spectral density is computed from the fast Fourier transform (FFT) obtained for each process trajectory $p_i(t)$ that is simulated using Code B in Appendix A. The Fourier Analysis function in Excel's Analysis ToolPak is used to obtain the corresponding signal $p_i(f)$ in the frequency domain. A total of 4096 points are used to compute $p_i(f)$, which corresponds to a total average simulation time of ~930 arb. u., where the period of oscillations ranges between ~4 and 7 arb. u. The sampling frequency, $f$, is obtained by dividing the number of points by the time interval, $t$. The frequency magnitudes are computed using Excel's IMABS function. The power spectral density is calculated using the following formula: $PSD_j = |p(f)_j|^2 / (2f)$. Then, 2048 data points are used to compute the spectral densities and the corresponding spectral entropy value from Equation (17). Finally, the coupling parameter $\omega$ is varied to characterize the effect of binding strength on the system's spectral entropy values. For each fixed value of the coupling parameter $\omega$, simulations are repeated ten times. Those ten spectral entropy values are then used to compute the average spectral entropy value and the corresponding standard deviation from the mean.
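For readers who prefer a scripted version of this procedure, the short Python sketch below computes a normalized power spectrum and the corresponding spectral entropy from a uniformly sampled trajectory. It approximates the Excel-based steps described above (single-sided FFT, normalization by total power, cf. Equation (17)) and is not the author's exact spreadsheet workflow.

import numpy as np

def spectral_entropy(signal):
    # Normalized power spectral density from the FFT of a uniformly sampled
    # trajectory, followed by Shannon entropy over the spectrum, cf. Eq. (17).
    psd = np.abs(np.fft.rfft(signal)) ** 2
    psd_hat = psd / psd.sum()                   # normalize by the total power
    k = 1.0 / np.log2(len(psd_hat))             # normalization constant
    return -k * np.sum(psd_hat * np.log2(psd_hat + 1e-12))  # small offset avoids log2(0)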

3. Results

In my previous work [11], the deterministic mathematical model described by the system of Equations (9)–(16) was successfully applied to study the perceptual binding between the location of a stimulus at two possible positions and the presence or absence of a light stimulus at these positions. However, the binding strength $\omega$ was assumed to be at its largest value, $\omega = 1$. It was shown that the system of Equations (9)–(16) exhibits different regimes of modulated oscillations depending on the $\varepsilon$ and $\alpha$ parameter values [11]. By contrast, in this study, $\varepsilon$ and $\alpha$ are fixed and the binding strength $\omega$ is varied. Furthermore, the model is used to describe the possible binding mechanism between encoded space and time. Thus, the model allows one to investigate how the encoding signals may change if the binding strength between the encoded space and time is varied. It is assumed that space and time could be perceived independently as well as together. This assumption follows from the premise that entrainment of the subsystem that encodes the internal representation of space can be performed either simultaneously with, or independently from, entrainment of the subsystem that encodes the internal representation of time. In addition, stochastic simulations of the model are performed to characterize the robustness of the binding mechanism against noise.
When two oscillatory systems are coupled, they modulate each other and, thus, the oscillatory dynamics of the coupled system are altered. The modulation depends on the parameter $\omega$, which describes the binding strength between the two oscillatory systems; the coupling scheme is shown in Figure 1a. Figure 1b–d demonstrate how the binding strength parameter influences the modulation of the two coupled oscillatory systems. To characterize the robustness of the coupled system against noise, stochastic simulations are performed. The results of the stochastic model simulations are shown in Figure 2. The stochastic model is simulated using different values of the binding strength parameter $\omega$. Figure 2a,d,g present the stochastic trajectories for the $P(t) = (p_1(t), p_2(t))$ and $Q(t) = (q_1(t), q_2(t))$ processes obtained for the binding strength parameter values $\omega$ = 0.1, 0.5, and 1. These results can be compared to the numerical results of the deterministic model shown in Figure 1b–d, which are obtained for the same $\omega$ parameter values; however, the initial conditions used for the simulation results in Figure 1 and Figure 2 are different. Figure 2b,e,h show distribution histograms for the process $p_1$, obtained from trajectories recorded over much larger time frames (>1000 arb. u.) than shown in Figure 2a,d,g. For the $p_2$ and $q_{1,2}$ processes, the corresponding histograms (not shown) appear similar to those shown in Figure 2b,e,h. Normalized power spectral densities are also computed from the trajectories, as described in the Methods section. The alteration of the normalized power spectral density for the process $p_1$ as the binding strength parameter is varied is demonstrated in Figure 2c,f,i. There is a shift of the normalized power spectrum peaks to higher frequencies as the binding strength $\omega$ increases.
Next, spectral entropy is calculated using the power spectra to quantify sensitivity of the coupled systems of processes to noise. For cases shown in Figure 2, spectral entropy values of ≈0.55, 0.57, and 0.55 are obtained using the power spectra shown in Figure 2c,f,i, with binding strength parameter values ω = 0.1, 0.5, and 1, respectively. Because the spectral entropy values do not change significantly with alteration of the binding strength parameters, it can be concluded that strongly and weakly bound oscillatory systems are equally robust against noise. To test this hypothesis in a more systematic way, fifty independent simulation experiments for the coupled systems of processes are performed, with ten simulation experiments for each of five different values of the binding strength parameter: ω = 0, 0.25, 0.5, 0.75, and 1. In each independent simulation, the spectral entropy value is computed. Then, the average over ten spectral entropy values is calculated for each specific ω parameter value. Figure 3 shows the average spectral entropy values plotted versus the binding strength parameter. The error bars represent standard deviation values. The results indicate that the robustness of the coupled systems against noise does not vary significantly when the binding strength changes. Therefore, the binding mechanism used to couple two oscillatory systems is resilient to noise.

4. Discussion

The brain’s ability to construct its own space and time provides an essential foundation for conscious processing of the outside world. All other phenomenal aspects are built upon this foundation. When we have perceptual experiences of different phenomenal aspects, such as those related to our perceptions of colors, odors, tactile features, etc., they are always inseparably unified with our internal representations of space and time. Thus, a solution to the perceptual binding problem must include phenomenal space and time as the common foundation that unifies other phenomenal aspects into a single experience.
In this work, I present a mechanistic stochastic model of perceptual binding between encoded space and time. Because variations in spatial patterns and temporal changes could, in principle, be perceived as separable events, I assume that phenomenal representations of space and time can be unbound, or weakly or strongly bound. Thus, the mechanistic model of binding is used to investigate how the oscillating processes that encode the internal (phenomenal) representations of space and time are modulated when the binding strength between them is varied. Furthermore, stochastic simulations of the model are used to analyze the interplay between the binding strength and noise. The model results suggest that the binding mechanism is robust against inherent noise. Therefore, the model provides some explanation as to why perceptual experiences and perceptual binding can be robustly retained and unchanged despite ubiquitous noise sources in neuronal circuits. In my previous work, it was shown that large systems involving more interconnected oscillating processes are less noise sensitive than small systems with fewer processes [12]. Noise is suppressed in large systems by negative feedback loops that are involved in a network of interconnected processes. The peaks of power spectral densities were shifted from low to high frequency values with an increase in the number of processes [12]. Figure 2c,f,i also show a shift in normalized power spectrum peaks to higher frequencies as the binding strength increases. This shift to higher frequencies occurs because the binding mechanism involves negative feedback loops, as seen in Figure 1a. Similar observations have been reported for gene regulatory networks with negative feedback loops [42,43]. Moreover, it has also been observed that noise-like signals associated with scale-free brain activity may play an important role in determining the state of consciousness [44]. Therefore, an important future direction would be to apply the mechanistic model to investigate the implications of scale-free dynamics on perceptual binding.
In this study, the interplay between binding strength and noise is characterized using power spectral density and spectral entropy values. Spectral analysis has often been used to analyze electroencephalograms to study the neurophysiology of sleep [34], predict changes in memory performance [41], and detect differences in brain activities of subjects under different conditions [35,36,37]. Because the same spectral analysis tools are used in this work, it should be relatively easy to compare results and validate conclusions derived from the model simulations with results obtained in neurophysiological experiments.
However, it should be noted that the oscillating processes described by the model cannot be explicitly related to membrane potentials and the spiky oscillations exhibited by individual neurons. The dynamics of a spiking neuron can be better represented by the Hodgkin–Huxley and FitzHugh–Nagumo models, which employ nonlinear differential equations [45,46,47,48]. However, the nonlinear neuronal impulses have complex relationships that are not isomorphic with Euclidean space and time, which are the subjects of this study. The processes described by my model can be attributed to the dynamics of neural populations [49]. For example, averaged evoked potentials (AEP) recorded from different parts of the brain using electroencephalography (EEG) are well fitted by sinusoidal functions [50,51]. Linear approximation has been successfully applied to describe cortical evoked potentials [50,51,52,53]. Thus, the model results can be compared with neural population responses recorded by EEG techniques. Some oscillating patterns of the EEG with varying amplitudes are similar to the oscillation patterns obtained in this work (compare Figure 2a in this work with Figure 3 on page 33 in Ref. [50]). Although many EEG signals appear complex and noisy, the principle of superposition can be applied to separate the complex composite signals into components [52]. Then, in line with the hypothesis employed in my model, the oscillating electric field components that form relationships isomorphic with a conscious percept can contribute to the conscious state. It is possible that no "meaningful" contribution can emerge from electric field components that do not retain the relationships isomorphic with the percept.
It should also be noted that the internal representations of space and time modeled in this work are different from the inner space and time postulated in the temporo-spatial theory of consciousness (TTC) [6,7]. The inner time in TTC is related to the temporal ranges of neural oscillations that arise in different forms of neural activity. The inner space is related to spatial ranges of neural activity across different regions in the brain. Therefore, inner space and time in TTC are constructed using characteristic spatial ranges and timescales in different forms of neural activity. The inner space and time in TTC are, thus, different from the internal phenomenal representations of space and time. Furthermore, TTC suggests that binding between different forms of neural activity is determined by “temporo-spatial alignment”. Temporo-spatial alignment dictates whether different forms of neural activity and their respective contents can be merged and associated with consciousness. Importantly, the temporal integration of different forms of neural activity is based on their temporal properties and is independent of their specific contents. Thus, TTC highlights the difference between binding based on the temporal alignment and a content-based integration. Similarly, the binding mechanism in my model concerns interactions among oscillating processes. The mechanism permits the mutual modulation of processes regardless of specific phenomenal content carried by the processes. However, content integration is governed by relationships among processes and their changes occurring due to interactions.
Overall, the mechanistic model allows us to better understand how binding can alter the dynamics of neural-like oscillatory systems. The model results can help to interpret some neural population activity patterns recorded by EEG techniques. Furthermore, the model can be applied to describe the binding mechanism between any two percepts that can be represented by two systems of oscillating processes, as has been demonstrated in my previous work [11]. The stochastic version of the model gives us a useful tool to study noise effects on systems that involve binding. It can also be used to investigate the mechanisms by which the brain suppresses or exploits inherent noise to make our perceptual binding and experiences robust.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

  • The XPP/XPPAUT Code A is used to simulate results in Figure 1b–d.
  # Code A
  init p1=1, p2=0, x1=0, x2=0, q1=1, q2=0, z1=0, z2=0
  par eps=-1.0, alpha=-1.0, w=1.0
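  # eps and alpha are negative, so p1,p2 and q1,q2 mutually inhibit each other;
  # w is the binding strength between the two subsystems, cf. Eqs. (9)-(16)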
		
  dp1/dt =eps*p2-p1-x1+w*q1
  dp2/dt =eps*p1-p2-x2+w*q2
  dx1/dt =p1
  dx2/dt =p2
  dq1/dt =alpha*q2-q1-z1-w*p1
  dq2/dt =alpha*q1-q2-z2-w*p2
  dz1/dt =q1
  dz2/dt =q2
		
  @ dt=.025, total=100, xplot=t, yplot=p1
  @ xmin=0, xmax=100, ymin=-1, ymax=1
  done
		
  • The XPP/XPPAUT Code B is used to generate trajectories and histograms in Figure 2.
  # Code B
		
  init p1=1000, p2=0, x1=0, x2=0, q1=1, q2=0, z1=0, z2=0
  par eps=-1.0, alpha=-1.0, w=1
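  # Each propensity below is the absolute magnitude of one term in Eqs. (9)-(16);
  # the selected event changes the affected variable(s) by +/-1, with the
  # direction set by the sign() factors in the state-update rules further down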
		
  # compute the cumulative event
  x11=abs(eps*p2)
  x12=x11+abs(eps*p1)
		
  x13=x12+abs(alpha*q2)
  x21=x13+abs(alpha*q1)
		
  x22=x21+abs(p1)
  x23=x22+abs(p2)
  x31=x23+abs(q1)
  x32=x31+abs(q2)
		
  x33=x32+abs(x1)
  x41=x33+abs(x2)
  x42=x41+abs(z1)
  x43=x42+abs(z2)
		
  #binding
		
  x44=x43+abs(w*p1)
  x51=x44+abs(w*p2)
  x52=x51+abs(w*q1)
  x53=x52+abs(w*q2)
		
  # choose a random event
  s2=ran(1)*x53
  y1=(s2<x11)
  y2=(s2<x12)&(s2>=x11)
  y3=(s2<x13)&(s2>=x12)
  y4=(s2<x21)&(s2>=x13)
  y5=(s2<x22)&(s2>=x21)
  y6=(s2<x23)&(s2>=x22)
  y7=(s2<x31)&(s2>=x23)
  y8=(s2<x32)&(s2>=x31)
  y9=(s2<x33)&(s2>=x32)
  y10=(s2<x41)&(s2>=x33)
  y11=(s2<x42)&(s2>=x41)
  y12=(s2<x43)&(s2>=x42)
  y13=(s2<x44)&(s2>=x43)
  y14=(s2<x51)&(s2>=x44)
  y15=(s2<x52)&(s2>=x51)
  y16=(s2>=x52)
		
  # time for the next event
  tr’=tr-log(ran(1))/x53
		
  p1’=p1+sign(eps)*sign(p2)*y1-sign(p1)*y5-sign(x1)*y9+sign(q1)*y15
  p2’=p2+sign(eps)*sign(p1)*y2-sign(p2)*y6-sign(x2)*y10+sign(q2)*y16
  q1’=q1+sign(alpha)*sign(q2)*y3-sign(q1)*y7-sign(z1)*y11-sign(p1)*y13
  q2’=q2+sign(alpha)*sign(q1)*y4-sign(q2)*y8-sign(z2)*y12-sign(p2)*y14
  x1’=x1+sign(p1)*y5
  x2’=x2+sign(p2)*y6
  z1’=z1+sign(q1)*y7
  z2’=z2+sign(q2)*y8
		
  @ bound=100000000, meth=discrete, total=10000000, njmp=1000
  @ xp=tr, yp=p1
  @ xlo=0, ylo=-1000, xhi=40, yhi=1000
		
  done
		

References

  1. Roskies, A.L. The binding problem. Neuron 1999, 24, 7–9. [Google Scholar] [CrossRef] [PubMed]
  2. von der Malsburg, C. The what and why of binding: The modeler’s perspective. Neuron 1999, 24, 95–104. [Google Scholar] [CrossRef]
  3. von der Malsburg, C. Binding in models of perception and brain function. Curr. Opin. Neurobiol. 1995, 5, 520–526. [Google Scholar] [CrossRef] [PubMed]
  4. Gray, C.M. The temporal correlation hypothesis of visual feature integration: Still alive and well. Neuron 1999, 24, 31–47. [Google Scholar] [CrossRef] [PubMed]
  5. Singer, W. Neuronal synchrony: A versatile code for the definition of relations? Neuron 1999, 24, 49–65. [Google Scholar] [CrossRef]
  6. Northoff, G. What the brain’s intrinsic activity can tell us about consciousness? A tri-dimensional view. Neurosci. Biobehav. Rev. 2013, 37, 726–738. [Google Scholar] [CrossRef]
  7. Northoff, G.; Huang, Z.R. How do the brain’s time and space mediate consciousness and its different dimensions? Temporo-spatial theory of consciousness (TTC). Neurosci. Biobehav. Rev. 2017, 80, 630–645. [Google Scholar] [CrossRef]
  8. Fingelkurts, A.A.; Fingelkurts, A.A. Mind as a nested operational architectonics of the brain. Phys. Life Rev. 2012, 9, 49–50. [Google Scholar] [CrossRef]
  9. Fingelkurts, A.A.; Fingelkurts, A.A.; Neves, C.F.H. Phenomenological architecture of a mind and Operational Architectonics of the brain: The unified metastable continuum. J. New Math. Nat. Comput. 2009, 5, 221–244. [Google Scholar] [CrossRef]
  10. O’Reilly, R.C.; Busby, R.; Soto, R. Three forms of binding and their neural substrates: Alternatives to temporal synchrony. In The Unity of Consciousness; Cleeremans, A., Ed.; Oxford University Press: Oxford, UK, 2003; pp. 168–192. [Google Scholar]
  11. Kraikivski, P. A Dynamic Mechanistic Model of Perceptual Binding. Mathematics 2022, 10, 1135. [Google Scholar] [CrossRef]
  12. Kraikivski, P. Implications of Noise on Neural Correlates of Consciousness: A Computational Analysis of Stochastic Systems of Mutually Connected Processes. Entropy 2021, 23, 583. [Google Scholar] [CrossRef]
  13. Azouz, R.; Gray, C.M. Cellular mechanisms contributing to response variability of cortical neurons in vivo. J. Neurosci. 1999, 19, 2209–2223. [Google Scholar] [CrossRef] [PubMed]
  14. Faisal, A.A.; Selen, L.P.; Wolpert, D.M. Noise in the nervous system. Nat. Rev. Neurosci. 2008, 9, 292–303. [Google Scholar] [CrossRef] [PubMed]
  15. Mainen, Z.F.; Sejnowski, T.J. Reliability of spike timing in neocortical neurons. Science 1995, 268, 1503–1506. [Google Scholar] [CrossRef] [PubMed]
  16. Steinmetz, P.N.; Manwani, A.; Koch, C.; London, M.; Segev, I. Subthreshold voltage noise due to channel fluctuations in active neuronal membranes. J. Comput. Neurosci. 2000, 9, 133–148. [Google Scholar] [CrossRef]
  17. White, J.A.; Rubinstein, J.T.; Kay, A.R. Channel noise in neurons. Trends Neurosci. 2000, 23, 131–137. [Google Scholar] [CrossRef]
  18. Calvin, W.H.; Stevens, C.F. Synaptic noise and other sources of randomness in motoneuron interspike intervals. J. Neurophysiol. 1968, 31, 574–587. [Google Scholar] [CrossRef]
  19. Fellous, J.M.; Rudolph, M.; Destexhe, A.; Sejnowski, T.J. Synaptic background noise controls the input/output characteristics of single cells in an in vitro model of in vivo activity. Neuroscience 2003, 122, 811–829. [Google Scholar] [CrossRef]
  20. Jacobson, G.A.; Diba, K.; Yaron-Jakoubovitch, A.; Oz, Y.; Koch, C.; Segev, I.; Yarom, Y. Subthreshold voltage noise of rat neocortical pyramidal neurones. J. Physiol. 2005, 564, 145–160. [Google Scholar] [CrossRef]
  21. Faisal, A.A.; Laughlin, S.B. Stochastic simulations on the reliability of action potential propagation in thin axons. PLoS Comput. Biol. 2007, 3, e79. [Google Scholar] [CrossRef]
  22. van Rossum, M.C.; O’Brien, B.J.; Smith, R.G. Effects of noise on the spike timing precision of retinal ganglion cells. J. Neurophysiol. 2003, 89, 2406–2419. [Google Scholar] [CrossRef]
  23. Longtin, A.; Bulsara, A.; Moss, F. Time-interval sequences in bistable systems and the noise-induced transmission of information by sensory neurons. Phys. Rev. Lett. 1991, 67, 656–659. [Google Scholar] [CrossRef] [PubMed]
  24. Collins, J.J.; Imhoff, T.T.; Grigg, P. Noise-enhanced information transmission in rat SA1 cutaneous mechanoreceptors via aperiodic stochastic resonance. J. Neurophysiol. 1996, 76, 642–645. [Google Scholar] [CrossRef] [PubMed]
  25. Douglass, J.K.; Wilkens, L.; Pantazelou, E.; Moss, F. Noise enhancement of information transfer in crayfish mechanoreceptors by stochastic resonance. Nature 1993, 365, 337–340. [Google Scholar] [CrossRef]
  26. Bulsara, A.R.; Zador, A. Threshold detection of wideband signals: A noise-induced maximum in the mutual information. Phys. Rev. E Stat. Phys. Plasmas Fluids Relat. Interdiscip. Top. 1996, 54, R2185–R2188. [Google Scholar] [CrossRef] [PubMed]
  27. Kosko, B.; Mitaim, S. Stochastic resonance in noisy threshold neurons. Neural Netw. 2003, 16, 755–761. [Google Scholar] [CrossRef]
  28. Mitaim, S.; Kosko, B. Adaptive stochastic resonance in noisy neurons based on mutual information. IEEE Trans. Neural Netw. 2004, 15, 1526–1540. [Google Scholar] [CrossRef]
  29. Kraikivski, P. Building Systems Capable of Consciousness. Mind Matter 2017, 15, 185–195. [Google Scholar]
  30. Kraikivski, P. Systems of Oscillators Designed for a Specific Conscious Percept. New Math. Nat. Comput. 2020, 16, 73–88. [Google Scholar] [CrossRef]
  31. Edelman, G.M. Naturalizing consciousness: A theoretical framework. Proc. Natl. Acad. Sci. USA 2003, 100, 5520–5524. [Google Scholar] [CrossRef]
  32. Balduzzi, D.; Tononi, G. Integrated information in discrete dynamical systems: Motivation and theoretical framework. PLoS Comput. Biol. 2008, 4, e1000091. [Google Scholar] [CrossRef]
  33. Gillespie, D.T. Exact stochastic simulation of coupled chemical reactions. J. Phys. Chem. 1977, 81, 2340–2361. [Google Scholar] [CrossRef]
  34. Prerau, M.J.; Brown, R.E.; Bianchi, M.T.; Ellenbogen, J.M.; Purdon, P.L. Sleep Neurophysiological Dynamics Through the Lens of Multitaper Spectral Analysis. Physiology 2017, 32, 60–92. [Google Scholar] [CrossRef] [PubMed]
  35. Tuominen, J.; Kallio, S.; Kaasinen, V.; Railo, H. Segregated brain state during hypnosis. Neurosci. Conscious. 2021, 2021, niab002. [Google Scholar] [CrossRef] [PubMed]
  36. Thilakavathi, B.; Shenbaga Devi, S.; Malaiappan, M.; Bhanu, K. EEG power spectrum analysis for schizophrenia during mental activity. Australas. Phys. Eng. Sci. Med. 2019, 42, 887–897. [Google Scholar] [CrossRef] [PubMed]
  37. Helakari, H.; Kananen, J.; Huotari, N.; Raitamaa, L.; Tuovinen, T.; Borchardt, V.; Rasila, A.; Raatikainen, V.; Starck, T.; Hautaniemi, T.; et al. Spectral entropy indicates electrophysiological and hemodynamic changes in drug-resistant epilepsy—A multimodal MREG study. Neuroimage Clin. 2019, 22, 101763. [Google Scholar] [CrossRef] [PubMed]
  38. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  39. Tononi, G. An information integration theory of consciousness. BMC Neurosci. 2004, 5, 42. [Google Scholar] [CrossRef] [PubMed]
  40. Seth, A.K.; Izhikevich, E.; Reeke, G.N.; Edelman, G.M. Theories and measures of consciousness: An extended framework. Proc. Natl. Acad. Sci. USA 2006, 103, 10799–10804. [Google Scholar] [CrossRef]
  41. Tian, Y.; Zhang, H.; Xu, W.; Zhang, H.; Yang, L.; Zheng, S.; Shi, Y. Spectral Entropy Can Predict Changes of Working Memory Performance Reduced by Short-Time Training in the Delayed-Match-to-Sample Task. Front. Hum. Neurosci. 2017, 11, 437. [Google Scholar] [CrossRef]
  42. Austin, D.W.; Allen, M.S.; McCollum, J.M.; Dar, R.D.; Wilgus, J.R.; Sayler, G.S.; Samatova, N.F.; Cox, C.D.; Simpson, M.L. Gene network shaping of inherent noise spectra. Nature 2006, 439, 608–611. [Google Scholar] [CrossRef]
  43. Simpson, M.L.; Cox, C.D.; Sayler, G.S. Frequency domain analysis of noise in autoregulated gene circuits. Proc. Natl. Acad. Sci. USA 2003, 100, 4551–4556. [Google Scholar] [CrossRef]
  44. He, B.J.; Zempel, J.M.; Snyder, A.Z.; Raichle, M.E. The temporal structures and functional significance of scale-free brain activity. Neuron 2010, 66, 353–369. [Google Scholar] [CrossRef]
  45. Hodgkin, A.L.; Huxley, A.F. A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 1952, 117, 500–544. [Google Scholar] [CrossRef] [PubMed]
  46. Fitzhugh, R. Impulses and Physiological States in Theoretical Models of Nerve Membrane. Biophys. J. 1961, 1, 445–466. [Google Scholar] [CrossRef] [PubMed]
  47. Nagumo, J.; Arimoto, S.; Yoshizawa, S. An Active Pulse Transmission Line Simulating Nerve Axon. Proc. IRE 1962, 50, 2061–2070. [Google Scholar] [CrossRef]
  48. Izhikevich, E.M. Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting; Computational Neuroscience; MIT Press: Cambridge, MA, USA, 2007; p. xvi. 441p. [Google Scholar]
  49. Freeman, W.J. Models of the dynamics of neural populations. Electroencephalogr. Clin. Neurophysiol. Suppl. 1978, 34, 9–18. [Google Scholar]
  50. Freeman, W.J. Neurodynamics: An Exploration in Mesoscopic Brain Dynamics; Perspectives in Neural Computing; Springer: London, UK, 2000. [Google Scholar]
  51. Freeman, W.J. Linear approximation of prepyriform evoked potential in cats. Exp. Neurol. 1962, 5, 477–499. [Google Scholar] [CrossRef]
  52. Biedenbach, M.A.; Freeman, W.J. Linear Domain of Potentials from the Prepyriform Cortex with Respect to Stimulus Parameters. Exp. Neurol. 1965, 11, 400–417. [Google Scholar] [CrossRef]
  53. Freeman, W.J. Linear analysis of the dynamics of neural masses. Annu. Rev. Biophys. Bioeng. 1972, 1, 225–256. [Google Scholar] [CrossRef]
Figure 1. The interaction diagram and dynamical relationships among processes. (a) A diagram that shows interactions among processes. (b–d) Numerical solutions for the time evolution of the $P(t) = (p_1(t), p_2(t))$ and $Q(t) = (q_1(t), q_2(t))$ processes computed using different values of the coupling constant: (b) $\omega = 0.1$, (c) $\omega = 0.5$, (d) $\omega = 1$. In all simulations, $\varepsilon = -1$ and $\alpha = -1$, thus representing mutual inhibition between the $p_1$ and $p_2$ processes and between the $q_1$ and $q_2$ processes. The following initial conditions are used: $P(0) = (1, 0)$, $X(0) = (0, 0)$, $Q(0) = (1, 0)$, and $Z(0) = (0, 0)$.
Figure 2. Numerical stochastic simulation results obtained using the following binding strength parameter values: (a–c) $\omega$ = 0.1, (d–f) $\omega$ = 0.5, and (g–i) $\omega$ = 1. (a,d,g) Stochastic trajectories for the $P(t) = (p_1(t), p_2(t))$ and $Q(t) = (q_1(t), q_2(t))$ processes. Time is shown in arbitrary units. The following initial conditions are used: $P(0) = (1000, 0)$, $X(0) = (0, 0)$, $Q(0) = (1, 0)$, $Z(0) = (0, 0)$ in (a,g) and $P(0) = (1000, 0)$, $X(0) = (0, 0)$, $Q(0) = (1000, 0)$, $Z(0) = (0, 0)$ in (d). (b,e,h) Distribution histograms for process $p_1$, computed using trajectories recorded over (b) 1580 arb. u., (e) 1120 arb. u., and (h) 1071 arb. u. time frames. (c,f,i) Normalized power spectral densities for process $p_1$.
Figure 3. Dependence of spectral entropy on the coupling strength between two bound oscillatory systems. Open circles represent the average spectral entropy values obtained using different values of the binding strength parameter ω . Error bars provide standard deviation values.