Proceeding Paper

Emergent Behavior and Computational Capabilities in Nonlinear Systems: Advancing Applications in Time Series Forecasting and Predictive Modeling †

by
Kárel García-Medina
1,2,*,‡,§,
Daniel Estevez-Moya
2,3,§,
Ernesto Estevez-Rams
2,§ and
Reinhard B. Neder
1,§
1
Lehrstuhl für Kristallographie und Strukturphysik, Friedrich-Alexander-Universität Erlangen-Nürnberg, Staudtstrasse 3, 91056 Erlangen, Germany
2
Facultad de Física-IMRE, Universidad de La Habana, San Lazaro y L, La Habana CP 10400, Cuba
3
Max-Planck-Institut für Physik Komplexer Systeme, Nöthnitzer Str. 38, 01187 Dresden, Germany
*
Author to whom correspondence should be addressed.
†
Presented at the 11th International Conference on Time Series and Forecasting, Canaria, Spain, 16–18 July 2025.
‡
Current address: Nanoparticles and Disordered Systems, Kristallographie und Strukturphysik, Friedrich-Alexander-Universität Erlangen-Nürnberg, Staudtstrasse 3, 91056 Erlangen, Bayern, Germany.
§
These authors contributed equally to this work.
Comput. Sci. Math. Forum 2025, 11(1), 17; https://doi.org/10.3390/cmsf2025011017
Published: 11 August 2025

Abstract

Natural dynamical systems can often display various long-term behaviours, ranging from entirely predictable decaying states to unpredictable, chaotic regimes or, more interestingly, highly correlated and intricate states featuring emergent phenomena. That, of course, imposes a level of generality on the models we use to study them. Among those models, coupled oscillators and cellular automata (CA) present a unique opportunity to advance the understanding of complex temporal behaviours because of their conceptual simplicity but very rich dynamics. In this contribution, we review the work completed by our research team over the last few years in the development and application of an alternative information-based characterization scheme to study the emergent behaviour and information handling of nonlinear systems, specifically Adler-type oscillators under different types of coupling: local phase-dependent (LAP) coupling and Kuramoto-like local (LAK) coupling. We thoroughly studied the long-term dynamics of these systems, identifying several distinct dynamic regimes, ranging from periodic to chaotic and complex. The systems were analysed qualitatively and quantitatively, drawing on entropic measures and information theory. Measures such as entropy density (Shannon entropy rate), effective complexity measure, and Lempel–Ziv complexity/information distance were employed. Our analysis revealed similar patterns and behaviours between these systems and CA, which are computationally capable systems, for some specific rules and regimes. These findings further reinforce the argument around computational capabilities in dynamical systems, as understood by information transmission, storage, and generation measures. Furthermore, the edge of chaos hypothesis (EOC) was verified in coupled oscillator systems for specific regions of parameter space, where a sudden increase in effective complexity measure was observed, indicating enhanced information processing capabilities.
Given the potential for exploiting this non-anthropocentric computational power, we propose this alternative information-based characterization scheme as a general framework to identify a dynamical system’s proximity to computationally enhanced states. Furthermore, this study advances the understanding of emergent behaviour in nonlinear systems. It explores the potential for leveraging the features of dynamical systems operating at the edge of chaos by coupling them with computationally capable settings within machine learning frameworks, specifically by using them as reservoirs in Echo State Networks (ESNs) for time series forecasting and predictive modeling. This approach aims to enhance the predictive capacity, particularly that of chaotic systems, by utilising EOC systems’ complex, sensitive dynamics as the ESN reservoir.

1. Introduction

Natural dynamical systems often display a wide range of long-term behaviors, varying from entirely predictable decaying states to unpredictable, chaotic regimes, or, more interestingly, highly correlated and intricate states featuring emergent phenomena [1,2,3,4,5,6]. This diversity of behaviors dictates that the models used to study them possess an adequate level of generality. Among these models, coupled oscillators and cellular automata (CA) stand out as offering a unique opportunity to advance the understanding of complex temporal behaviors, due to their conceptual simplicity combined with very rich dynamics [7,8,9,10,11,12,13,14,15,16,17,18].
The application of this general dynamical and computational characterization scheme has led us to the potential verification of the edge of chaos hypothesis (EOC) in other nonlinear systems like cellular automata [12]. This hypothesis proposes that nonlinear systems exhibit enhanced computational capabilities near chaotic regimes or the transition to them. This enhancement is often signalled by sudden increases in correlation distances and effective complexity measure, indicating an improved level of structuring and transmission of information across scales, alongside moderate levels of randomness allowing for sensitive manipulation of information. Cellular automata are especially attractive for exploring such a proposition, given their nature, but they are also complicated to treat, their rule space being a discrete space of local dynamical rules [10,19]. Following the ideas in [20], we implemented a continuous search in the space of generalized continuous elementary cellular automata (GECA), which extends the concept of discrete elementary cellular automata (ECA) by introducing continuous transition parameters [12]. This allowed us to find sudden peaks in effective complexity measures for generalized local rules near or at transitions toward ECA rules with a known chaotic nature [10,19,21].
Given the potential for exploiting dynamical systems’ natural intrinsic computational power, one could benefit from designing a general framework to identify the system’s proximity to these computationally enhanced states. This study advances the understanding of nonlinear systems’ emergent behaviour and computational capabilities. It explores the potential for leveraging the features of dynamical systems operating at the edge of chaos by coupling them with computationally capable settings within machine learning frameworks. Specifically, we focus on their use as reservoirs in Echo State Networks (ESNs) for time series forecasting and predictive modeling. This approach seeks to enhance predictive capacity, particularly for chaotic systems like the Mackey–Glass system, by utilizing EOC systems’ complex, sensitive dynamics as the ESN reservoir [22].
This paper summarizes our findings on the emergent behaviour and computational capabilities of coupled Adler-like oscillators, characterized through information theory. We argue for the EOC as a region of enhanced information processing and highlight the similarities between potentially computationally capable systems. Finally, we explore the practical integration of these systems with computational capabilities at the EOC as reservoirs in ESNs, indicating their potential for improving chaotic time series forecasting.
This paper is organized as follows: Section 2 details the studied models and the entropic measures used for their characterization. Section 3 presents the results of the entropic analysis and discusses the identification of computational capabilities. Section 4 compares the results of the nonlinear systems with those of cellular automata, highlighting the similarities in their dynamics and computational capabilities. Section 5 discusses the integration of EOC systems as ESN reservoirs and presents the results of predictive modeling using this approach. Finally, Section 6 provides conclusions and outlines future research directions.

2. Information-Based Metrics

For a system to perform significant computation, it must achieve a balance between information storage and production. Measures of computational capabilities aim to quantify this balance in such systems. Entropy-based metrics are particularly useful for this purpose, as they can quantify the amount of information generated by a system and its ability to process that information. Key metrics used in this approach are detailed below.

2.1. Entropy Density

Shannon entropy is a fundamental measure of uncertainty or randomness in a sequence or process. For a discrete random variable X, Shannon’s entropy $H(X)$ represents the irreducible level of non-redundant information in a set of observations. Consider a bi-infinite sequence $S = \ldots s_{-2} s_{-1} s_0 s_1 s_2 \ldots$ of discrete random variables taking values from a finite alphabet $\chi$, as well as a subsequence $S^L$ of length L. Shannon’s block entropy can then be defined as [12]
$$H_S(L) = - \sum_{S^L \in \chi^L} p(S^L) \log p(S^L) \quad (1)$$
where the sum is taken over all possible blocks of length L, and $p(S^L)$ is the probability of observing a particular subsequence $S^L$. The entropy density can be subsequently defined as [12]
$$h = \lim_{L \to \infty} \frac{H_S(L)}{L} \quad (2)$$
As $L \to \infty$, a message of L symbols asymptotically requires only $hL$ bits instead of $L \log |\chi|$ bits, consistent with the idea of h as a measure of irreducible randomness of a bi-infinite sequence once all sources of predictability have been considered [17]. Entropy density is a fundamental indicator for discerning the system’s state, revealing transitions from predictable to random behaviour.
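As a concrete illustration, the block entropy and its finite-L entropy-density estimate can be computed directly from an observed symbol sequence. The following is a minimal sketch (function names are ours, and this plug-in estimator is simpler than the finite-size procedure of [12]):

```python
from collections import Counter
import numpy as np

def block_entropy(seq, L):
    """Shannon block entropy H_S(L) in bits, from empirical block frequencies."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

def entropy_density(seq, L):
    """Finite-L estimate h_L = H_S(L) - H_S(L-1), converging to h for large L."""
    return block_entropy(seq, L) - block_entropy(seq, L - 1)
```

For a fair-coin sequence the estimate stays near 1 bit per symbol, while for the alternating sequence 0101… it drops to zero once blocks of length 2 have been seen, reflecting the pattern's full predictability.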

2.2. Effective Complexity Measure

A natural measure of structurality and correlations can be defined through the effective complexity measure (E). Introduced by Grassberger [23], it quantifies the intrinsic redundancy in a sequence of observations for a given information source, measuring the correlation across scales. Effective complexity can be defined as the cumulative convergence of the defect on entropy density [24]:
$$E = \sum_{L=1}^{\infty} \left[ h_L - h \right] \quad (3)$$
with $h_L = H_S(L) - H_S(L-1)$ being a finite-length estimate of the entropy density. The effective complexity measure is then a cumulative measure of how much the entropy density is overestimated at each L, given that only blocks of length up to L have been observed. In other words, it quantifies how much information still needs to be learned to determine the true entropy density of a source. An alternative interpretation is the mutual information between two infinite halves of a bi-infinite string, indicating how much information one half contains about the other.
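In practice the sum must be truncated at some finite block length, with the last finite-length estimate standing in for the true entropy density. A sketch under those assumptions (the cutoff L_max and the proxy for h are our simplifications, not the estimator of [12]):

```python
from collections import Counter
import numpy as np

def block_entropy(seq, L):
    """Shannon block entropy H_S(L) in bits, from empirical block frequencies."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

def effective_complexity(seq, L_max=6):
    """Truncated E = sum_{L=1}^{L_max} (h_L - h), approximating h by h_{L_max}."""
    H = [0.0] + [block_entropy(seq, L) for L in range(1, L_max + 1)]
    h_L = [H[L] - H[L - 1] for L in range(1, L_max + 1)]
    h = h_L[-1]                     # finite-size proxy for the true entropy density
    return sum(x - h for x in h_L)
```

Consistent with the mutual-information reading, a period-2 sequence yields E of about 1 bit (the log of its period): one half of the string carries exactly one bit about the other.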

2.3. Information Distance

A final measure of sensitivity was defined through information distance as in [25]. Information distance (d) can be derived from Kolmogorov randomness $K(S)$ [26] as follows:
$$d(S, S') = \frac{\max\left[ K(S|S'^*),\, K(S'|S^*) \right]}{\max\left[ K(S),\, K(S') \right]} \quad (4)$$
with $S^*$ being the shortest program that can reproduce sequence S when running on a Universal Turing Machine (UTM), so that $|S^*| = K(S)$. Accordingly, the Kolmogorov conditional randomness $K(S'|S^*)$ is the length of the shortest program that can reproduce sequence $S'$ given $S^*$, the shortest program that can reproduce sequence S. Unlike the Hamming distance, information distance measures how innovative one sequence is with respect to the other from an algorithmic point of view.
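Since K itself is uncomputable, practical estimates replace it with the length of a compressed representation, yielding the normalized compression distance. A sketch with zlib standing in for the Lempel–Ziv-based estimation of [25,26]:

```python
import zlib

def C(b: bytes) -> int:
    """Compressed length as a computable stand-in for Kolmogorov randomness K."""
    return len(zlib.compress(b, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance, a computable proxy for d(S, S')."""
    cx, cy = C(x), C(y)
    return (C(x + y) - min(cx, cy)) / max(cx, cy)
```

A sequence sits at distance near 0 from itself (its concatenation with itself compresses almost as well as one copy) and near 1 from an algorithmically unrelated sequence.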
Estimation of all previous quantities for finite-size systems was carried out as described in [12].

2.4. Entropic Measures of Computational Complexity

From a general operational point of view, a system with computational power should, in principle, be able to perform three basic tasks: storage, transmission, and sensible modification of incoming information. With that in mind, a computationally capable system would be incompatible with extreme values of entropy density, since both completely trivial states ($h \to 0$) and completely random states ($h \to 1$) destroy any meaningful information encoded in the system’s initial conditions. This is equally true for effective complexity, where fully uncorrelated states ($E \to 0$) and fully correlated states ($E \to \infty$) imply information loss. Computational power lies, thus, in a sweet spot between pattern emergence and randomness, allowing for innovation and adaptability [12]. This is the approach we follow for quantitatively characterizing general computational power in dynamical systems. Regimes with medium values of entropy density and reasonable values of effective complexity measure are taken to be computationally capable regimes. This interpretation has proven robust, efficiently describing the long-term dynamics of nonlinear systems in the sense of information processing [12,15,23,27].

3. Computational Capabilities in Nonlinear Systems

The methods described above have been applied to Adler-type oscillators under two different couplings, namely the local phase-dependent (LAP) [12] and Kuramoto-like local (LAK) [16,17] couplings. A nearest-neighbour periodic-boundary-condition topology was considered in both cases, meaning oscillator rings were studied. Furthermore, oscillators were coupled in an alternating manner as excitors (positively coupled) and inhibitors (negatively coupled), and an even number of units was considered, thus guaranteeing a global balance of forces in the ring [16]. Simulations were made for $N = 5 \times 10^2$ units, left to evolve from uniformly distributed random configurations for $T = 2 \times 10^3$ time-steps with a fixed sampling interval of $\Delta T = 1$. Numerical integration was carried out using the Runge–Kutta (4,5) method, implemented in the GNU Scientific Library (GSL) with an error tolerance of $10^{-6}$. Entropic measures were determined for binarized versions of the real-valued spacetime matrices as in [17], taking the mean activity at any given instant as the threshold value and setting to 1 (0) all units with activity values above (below) it. Key findings are summarized below.

3.1. Local Phase-Dependent (LAP) Coupling

Under Winfree’s weak-coupling approximation [28], the Adler-type locally phase-dependent coupled oscillators model takes the form
$$\dot{\theta}_i = \omega + \gamma \cos(\theta_i) + (-1)^i k \left[ \cos(\theta_{i-1}) + \cos(\theta_{i+1}) \right] \quad (5)$$
where $\theta_i$ is the phase of the i-th oscillator, $\omega$ is their natural frequency when in isolation, $\gamma$ is a positive feedback coefficient, and k is the coupling strength. Parameter space was explored for $(\omega, \gamma) \in [0, 2\pi]$ in units of k [17]. Figure 1a shows the entropic landscape of parameter space for Equation (5) as extracted from [13], where several distinctive long-term regimes were identified. Plotted values of entropy density (h) were determined for the final states of the system after $T = 2 \times 10^3$ time-steps across parameter space. An absorbing region [13,18] was found, characterized by the low values of h in Figure 1a, indicating an ordered, predictable long-term behaviour. When operating in this regime, the system quickly converges to an alternating stationary state, where all even and odd units evolve separately towards two different fixed points, depending on the parameter combination $(\omega, \gamma)$. In its binary representation, this trivial long-term behaviour manifests as a simple periodic pattern of alternating 0 and 1 values with minimum entropy density.
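The simulation pipeline just described can be sketched in a few lines. This is an illustrative miniature, not the production setup: SciPy's RK45 replaces the GSL integrator, N and T are far smaller than in the paper, the parameter values are arbitrary, and taking sin(θ) as the unit's activity for binarization is our assumption:

```python
import numpy as np
from scipy.integrate import solve_ivp

def lap_rhs(t, theta, omega, gamma, k):
    """Right-hand side of the LAP ring: alternating excitor/inhibitor coupling."""
    sign = (-1.0) ** np.arange(theta.size)
    neigh = np.cos(np.roll(theta, 1)) + np.cos(np.roll(theta, -1))
    return omega + gamma * np.cos(theta) + sign * k * neigh

def binary_spacetime(N=100, T=200, omega=1.0, gamma=1.2, k=1.0, seed=0):
    """Integrate from random phases, sample at dT = 1, binarize vs mean activity."""
    rng = np.random.default_rng(seed)
    theta0 = rng.uniform(0.0, 2.0 * np.pi, N)       # uniform random initial phases
    sol = solve_ivp(lap_rhs, (0.0, float(T)), theta0, t_eval=np.arange(T + 1),
                    args=(omega, gamma, k), rtol=1e-6, atol=1e-6)
    activity = np.sin(sol.y.T)                      # (T+1, N) real-valued matrix
    thresh = activity.mean(axis=1, keepdims=True)   # mean activity at each instant
    return (activity > thresh).astype(np.uint8)     # 1 above the mean, 0 below
```

The resulting binary spacetime matrix is what the entropic measures of Section 2 are applied to, row by row or column by column.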
A wedge region was also identified with intermediate values of h, potentially indicating the emergence of complex behavior characterized by the presence of non-trivial spatiotemporal patterns and large correlation distances [13,15]. Its complex nature has since been confirmed in [13,15,18] by both pattern inspection and quantitative analysis. Consequently, this region has been proposed as one of the most interesting for the study of computational capabilities in the system [13], along with a second needle region with intermediate values of h and embedded within a high-entropy region. The chaotic nature of this surrounding region was confirmed by means of information distance. Simulations were repeated, introducing a minimal single-bit perturbation to the initial condition, and sensitivity was evaluated over time. An argument can be made in favour of this approach as an alternative metric of chaos [16,17] in computationally capable systems.
As argued in [13], the use of the final state as a valid representation of the long-term dynamics of the system implicitly assumes the existence of a final settling dynamic for it, which might not necessarily be true for all parameter combinations. On the other hand, spatial complexity and temporal complexity levels might not be close at all, meaning spatial correlations across oscillators behave differently from time correlations for a specific one. With that in mind, time evolution was also considered in [13]. Transposed binary spacetime matrices were characterized, neglecting transient periods, which allowed for the identification of a secondary turbulent region within the chaotic one, characterized by slightly lower levels of spatial entropy and significantly lower levels of temporal entropy.
Finally, through the use of complementary figures of merit like symbol density ρ , measuring the density of 1’s at any given instant, a spoon region was found with distinctive dissipative behaviour [13].
Overall, entropic analysis allowed for the identification of several distinctive dynamic regimes, with two being the most promising in terms of computational complexity, as indicated by entropic markers, namely the wedge and needle regions.

3.2. Kuramoto-like Local (LAK) Coupling

A second coupling model was considered, where nearest neighbours interact proportionally to their phase difference [29]. The system is then described by
$$\dot{\theta}_i = \omega + \gamma \cos(\theta_i) + (-1)^i k \left[ \sin(\theta_{i-1} - \theta_i) + \sin(\theta_{i+1} - \theta_i) \right] \quad (6)$$
with Figure 2a, extracted from [17], showing the entropic landscape across the considered portion of parameter space. Similarly, an absorbing region was also identified, where, unlike previously, global phase synchronization was achieved. The definition of a local complex order parameter allowed for the detection of emergent local communities where coherence is first achieved and then transmitted across larger scales by competition mechanisms [16]. A highly sensitive chaotic region was also found.
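The local complex order parameter mentioned above can be sketched as a windowed version of the usual Kuramoto order parameter; the window half-width r and the normalization here are our assumptions for illustration, not necessarily the exact definition used in [16]:

```python
import numpy as np

def local_order(theta, r=2):
    """|R_i| over the 2r+1 phases centred on unit i (periodic ring).
    Values near 1 signal a locally phase-coherent community."""
    z = np.exp(1j * theta)                       # each phase as a unit phasor
    acc = sum(np.roll(z, s) for s in range(-r, r + 1))
    return np.abs(acc) / (2 * r + 1)
```

Tracking where this profile first approaches 1 reveals the emergent local communities in which coherence nucleates before spreading across the ring.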
More interestingly, two regions were identified with long-term long-range patterns in spacetime matrices, pointing to the complex nature of the system when operating in these regions. This was further confirmed quantitatively by the h and E plots in Figure 2. In both cases, the system was found to have intermediate levels of randomness and effective complexity, indicating the presence of non-trivial spatiotemporal patterns and large correlation distances, albeit one of them being more sensitive to initial conditions, as indicated by d values (see Figure 2a). Visual inspection of spacetime matrices in these regions confirms their complex nature [17].
Summing up, an absorbing region was found, characterized by globally synchronized final states with near-zero values of both h and E; a chaotic region with high values of h and d and two complex regions with intermediate values of h and E and different levels of sensitivity were also found. If present, computational power is thus expected to be found in the complex regions, where the system can produce non-trivial spatiotemporal patterns and larger correlation distances.

3.3. Edge of Chaos

Enhanced computational capabilities were identified at different instances in the parameter space of both models, as indicated by sudden jumps in the effective complexity levels at the verge of highly sensitive regimes. This is consistent with the edge of chaos hypothesis, which posits that nonlinear systems exhibit enhanced computational capabilities in the vicinity of chaotic regimes or the transition to them [12,24]. The presence of non-trivial spatiotemporal patterns and large correlation distances in the complex regions suggests that these systems can process information in a meaningful way, supporting the idea that they operate at the edge of chaos.
In the case of the LAP coupling, the $\gamma = 1.2057$ line of parameter space was thoroughly explored with the results shown in Figure 1b. The effective complexity measure was found to peak at roughly $\omega = 2.23$ (point b at the border between needle and chaotic regions), right at the edge of a transition to a highly sensitive chaotic regime (point c). Furthermore, the entropy plot includes an $h_\rho$ line, which corresponds to the entropy density of a fully random sequence with the same symbol density $\rho$ as the system. Any differences between h and $h_\rho$ would be due to non-trivial organization or pattern formation in the sequence. As the E plot indicates, enhanced computation happens right before these non-trivial patterns disappear and highly sensitive disorder takes over. Equally interesting is the fact that, right at the transition point, substantial fluctuations in entropy levels are observed, possibly related to a phase transition.
A similar line of parameter space was explored for the LAK coupling, this time along ω = 2 , providing similar results shown in Figure 2b. Right at the verge of a highly sensitive random regime, as indicated by d and h plots, the system displays a clear peak in effective complexity at γ = 2.832 . What is more surprising about this is the fact that visual inspection of the corresponding spacetime matrix fails to show any distinctive feature indicating enhanced information handling. Yet, our quantitative information-based framework seems to capture it at a deeper level [13,27].

4. Oscillators and Cellular Automata: A Comparison

Although visual inspection alone can fail to capture distinct information-handling features, as in the case shown in Figure 2, the binarization and subsequent plotting of spacetime matrices at specific points in parameter space remain valuable for analyzing long-term dynamics, as they are ideal instances for detecting spacetime correlations. In this study specifically, they helped reveal similarities with other well-studied dynamical systems like ECA, which are known to be computationally capable.

4.1. Visual Comparison via Spatiotemporal Diagrams

Figure 3 shows a schematic representation of the stability region for the globally synchronized solution for the LAK model (Equation (6)), as extracted from [15]. Spacetime matrices for distinctive points along the ω = 2 line in parameter space are shown in the top-right panel. The different long-term dynamics identified for the model with entropic markers become evident in the different matrices. In the case of γ = 2.885 , belonging to the absorbing region, a homogeneous final state is observed, compatible with global phase synchronization. The γ = 2.525 and γ = 1.8 matrices clearly show complex long-lived long-range patterns of different nature, being in the two different complex regions identified earlier with different levels of sensitivity. Finally, randomness across space and time is observed in the γ = 1.02 matrix, as expected from the entropic markers values in the chaotic region.
Furthermore, the patterns observed across spacetime matrices, especially for γ = 2.525 , resemble those observed in ECA of different classes [10]. The bottom-right panel in Figure 3 shows spacetime matrices for representative rules of different Wolfram complexity classes for cellular automata [10,19,21,30]. It is clear how almost every characteristic pattern in the oscillatory model has its equivalent in the ECA, except, perhaps, the wave-like patterns found for γ = 1.8 . This similarity in dynamical range might extend beyond visual features and indicate computational equivalence. One could argue that the fact that coupled oscillator systems can sustain similar structures to those found in computationally capable systems, namely ECA, could be an indication of similar computational complexity [15]. This should not be understated. Coupled oscillator systems present a continuous, low-dimensional parameter space, which means thorough systematic exploration can be made. The ECA rule space, on the other hand, is a high-dimensional, highly pathological space with no clear sense of order or transition. When thinking of functional computational systems, adaptable, customizable systems are desired, which is easier to achieve if the system has a continuous, well-behaved configuration space to operate in. In light of this idea, the analysis can be extended to a quantitative comparison of both systems in terms of information handling capabilities.
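Spacetime diagrams for the ECA side of this comparison, like those in the bottom-right panel of Figure 3, can be regenerated in a few lines; rule numbers follow the standard Wolfram encoding, and the lattice size and initial condition below are illustrative:

```python
import numpy as np

def eca_step(state, rule):
    """One synchronous update of an elementary CA on a periodic ring."""
    l, r = np.roll(state, 1), np.roll(state, -1)
    idx = 4 * l + 2 * state + r              # neighbourhood code 0..7
    table = (rule >> np.arange(8)) & 1       # rule's output bit for each code
    return table[idx]

def eca_spacetime(rule, N=64, T=64, seed=0):
    """Evolve a random initial row for T steps; rows are successive times."""
    rng = np.random.default_rng(seed)
    S = np.empty((T, N), dtype=np.int64)
    S[0] = rng.integers(0, 2, N)
    for t in range(1, T):
        S[t] = eca_step(S[t - 1], rule)
    return S
```

Plotting `eca_spacetime(110)` against a binarized oscillator matrix makes the visual kinship between the two systems' pattern families directly checkable.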

4.2. Quantitative Comparison via Entropic Measures

Beyond visual inspection, Figure 4, as extracted from [12], allows for quantitative comparisons of both models to relevant cellular automaton instances. The left column of Figure 4 shows entropy–complexity diagrams [13,31] for the GECA (top) and LAP (bottom) models. In the GECA case, the diagram was obtained by continuous deformation of the 6th bit in the ECA rule 78 as defined in [12,20]. Final entropy and complexity values are plotted. The LAP model was randomly explored in parameter space, with the final entropy and complexity values being plotted as well. The two curves turn out to be strikingly similar, with their main difference being a smoother curve for the LAP model, pointing to a continuous first derivative [12] in parameter space. Furthermore, both curves are very similar to those reported in [32] for the logistic map, characteristic of systems undergoing transitions towards chaos. These similarities further indicate that enhanced computational instances for these systems occur in the vicinity of chaos [12].
The right column of Figure 4, on the other hand, shows a comparison between the LAK model and specific ECA rules, in terms of their entropic markers and densities. Their values for the oscillators were analyzed along the ω = 2 line, and a quantitatively equivalent ECA rule is shown for each distinct dynamic. Points of analysis and selected rules match those in Figure 3. The closer the solid points in the plot to the different curves, the more similar the ECA and the system are for that parameter combination. As expected, the worst agreement is found for the wave-like patterns found at γ = 1.8 . The similarities drawn from Figure 3 are qualitative and quantitative.
As discussed in [12,33], entropy–complexity diagrams are a powerful tool for identifying computationally capable systems. The fact that the general entropic landscape for the coupled oscillator models is so similar to those of computationally capable systems points to the idea of these systems being ideal for the early integration of dynamical computation into general architectures of neural networks. In this contribution, we explored an initial step in this direction by functionally integrating GECA models into neural computing systems. An especially attractive candidate for this integration is the Echo State Network (ESN), a type of recurrent neural network (RNN) working within the framework of reservoir computing [22].

5. Dynamical Systems as Reservoirs in Echo State Networks

Echo State Networks (ESNs) are a type of recurrent neural network (RNN) that leverages the dynamics of complex systems for computational tasks, particularly time series prediction. Their architecture can be understood as a three-layered structure: an input, reservoir, and output layer. The reservoir is a fixed internal structure where complex dynamics emerge. Only output weights are modified during training, while the reservoir’s internal state acts as a dynamic memory, transforming the history of the input signal.
The general nature of the reservoir makes this an ideal candidate for testing the integration of natural computation into neural network frameworks. The Echo State Property (ESP) [22], however, imposes a constraint on the reservoir’s dynamics, ensuring that its state is uniquely determined by the input history. This property is crucial for the reservoir to effectively act as an "echo" of past inputs, independent of the initial conditions of its internal state. Herein lies a key challenge when integrating natural systems into the ESN framework: the formulation and evaluation of the ESP. The reservoir must maintain a memory of past inputs while also remaining sensitive to new information, which is a delicate balance to achieve.
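A minimal ESN with a conventional random reservoir makes these pieces concrete: a fixed recurrent matrix scaled to spectral radius below one (a common heuristic for encouraging the ESP), a driven tanh state update, and a linear readout trained by ridge regression, then run in closed loop. All hyperparameters below are illustrative choices, not those used in this work:

```python
import numpy as np

def esn_forecast(u, n_train, n_res=200, rho=0.9, ridge=1e-4, seed=0):
    """Train a one-step readout on u[0:n_train], then predict freely to len(u)-1."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, (n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 (ESP heuristic)
    W_in = rng.uniform(-1, 1, n_res)
    x, X = np.zeros(n_res), []
    for t in range(n_train):                          # drive the reservoir with the input
        x = np.tanh(W @ x + W_in * u[t])
        X.append(x.copy())
    X = np.asarray(X)
    y = u[1:n_train + 1]                              # one-step-ahead targets
    W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
    preds, last = [], u[n_train]
    for _ in range(len(u) - n_train - 1):             # feed predictions back in
        x = np.tanh(W @ x + W_in * last)
        last = float(W_out @ x)
        preds.append(last)
    return np.array(preds)
```

Swapping a natural dynamical system in for W amounts to replacing the state update line while keeping the trained readout, which is precisely where the ESP question resurfaces.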
An initial, and very promising, approach to this problem is shown in Figure 5, where the conventional reservoir (often a Watts–Strogatz or Erdős–Rényi graph) [22] has been replaced by a GECA rule operating at the edge of chaos. The idea is to harness the potentially enhanced intrinsic computational capacity of these dynamical systems to improve the prediction capabilities of the ESN. Figure 5a shows a schematic representation of the ESN architecture and the corresponding substitution idea. Although this substitution may seem trivial, its practical implementation is not. By definition, the ESN architecture has to deal with several problems, including the definition of the input mapping ( W i n ) and hyperparameter tuning. Both of them are still present with the GECA reservoir. Despite this, some promising results are shown in Figure 5b,c.
Just as a proof of concept, Figure 5 shows the predictions for a time series out of the chaotic Mackey–Glass system [34] for T = 500 (b) and T = 50 (c) prediction steps starting from different initial conditions, using a GECA reservoir with a rule resulting from the continuous deformation of the 4th bit in rule 46 ( ξ = 0.0174 ) [12]. In both cases, 2000 training steps and 1000 thermalization steps were used, while 100 cells were considered for the GECA. Despite the lack of hyperparameter tuning, the predictions look promising for up to roughly T = 400, with very good agreement for early times, where the GECA seems to keep better track of the system’s past dynamics. The Mackey–Glass system is a well-known chaotic system, with a natural time scale parameter τ related to the intrinsic memory of the system’s evolution [34]. One could argue that τ = 16.8 implies an intrinsic memory of the process that is too demanding for the GECA to handle past the first 50 prediction steps, where the best forecasting is found. This is further confirmed in Figure 5c, where a zoomed version of the first 50 prediction steps for a different initial condition is shown with a surprisingly accurate prediction.
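For reproducibility, the Mackey–Glass series itself can be generated with a simple fixed-step delay integration. The sketch below uses the conventional parameters β = 0.2, γ = 0.1, n = 10 with τ = 16.8 as above; the Euler scheme and step size are our simplifications, not necessarily the integration used for Figure 5:

```python
import numpy as np

def mackey_glass(n_samples=1000, tau=16.8, dt=0.1, beta=0.2, gamma=0.1, n=10,
                 x0=1.2, every=10):
    """Euler integration of dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t),
    from a constant history, subsampled to one point per time unit."""
    d = int(round(tau / dt))            # delay expressed in integration steps
    total = n_samples * every + d
    x = np.full(total, x0)              # constant history as initial condition
    for t in range(d, total - 1):
        xd = x[t - d]                   # delayed state x(t - tau)
        x[t + 1] = x[t] + dt * (beta * xd / (1.0 + xd ** n) - gamma * x[t])
    return x[d::every][:n_samples]
```

Splitting such a series into training, thermalization, and free-running prediction segments reproduces the experimental setup described above.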

6. Conclusions

In this contribution, we reviewed and extended the work carried out by our group over the last few years on dynamical system characterization and computational complexity. An information-based entropic framework was proposed for the characterization of general many-body nonlinear systems, with a focus on their computational capabilities across parameter space. The method was applied to two different models of coupled Adler-type oscillators, which allowed for the identification of several distinctive long-term regimes, including those enabling enhanced computation. Qualitative and quantitative similarities were found between the oscillators and other well-studied computationally capable systems, namely cellular automata. This further backs the argument around the computational equivalence of both systems and the potential for using the oscillators as reservoirs in functional computational architectures. This concept was effectively tested by coupling a GECA rule operating at the edge of chaos to an ESN architecture, with promising results for time series forecasting in the case of the Mackey–Glass system despite no thorough hyperparameter tuning. The best performance was, admittedly, limited to early times. Nonetheless, this opens the door for further testing and optimization for the integration of natural systems into neural network architectures.

Author Contributions

K.G.-M., D.E.-M., E.E.-R. and R.B.N. contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by CITMA under the project PN223LH010-053, Deutsche Forschungsgemeinschaft (DFG) under the project DFG 585-9, Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) and the Max-Planck-Institut für Physik komplexer Systeme (MPIKS).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Supporting data are available upon request.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Strogatz, S.H.; Stewart, I. Coupled Oscillators and Biological Synchronization. Sci. Am. 1993, 269, 102–109. [Google Scholar] [CrossRef]
  2. Strogatz, S. Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering, repr. ed.; Westview Press: Cambridge, MA, USA, 2007. [Google Scholar]
  3. Mora, T.; Bialek, W. Are Biological Systems Poised at Criticality? J. Stat. Phys. 2011, 144, 268–302. [Google Scholar] [CrossRef]
  4. Machta, J. Natural Complexity, Computational Complexity and Depth. Chaos Interdiscip. J. Nonlinear Sci. 2011, 21, 037111. [Google Scholar] [CrossRef]
  5. Ott, E.; Grebogi, C.; Yorke, J.A. Controlling Chaos. Phys. Rev. Lett. 1990, 64, 1196–1199. [Google Scholar] [CrossRef]
  6. Feali, M.S.; Hamidi, A. Dynamical Response of Autaptic Izhikevich Neuron Disturbed by Gaussian White Noise. J. Comput. Neurosci. 2023, 51, 59–69. [Google Scholar] [CrossRef]
  7. Acedo, L. A Cellular Automaton Model for Collective Neural Dynamics. Math. Comput. Model. 2009, 50, 717–725. [Google Scholar] [CrossRef]
  8. Bandini, S.; Mauri, G.; Serra, R. Cellular Automata: From a Theoretical Parallel Computational Model to Its Application to Complex Systems. Parallel Comput. 2001, 27, 539–553. [Google Scholar] [CrossRef]
  9. Dennunzio, A.; Formenti, E.; Kůrka, P. Cellular Automata Dynamical Systems. In Handbook of Natural Computing; Rozenberg, G., Bäck, T., Kok, J.N., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 25–75. [Google Scholar] [CrossRef]
  10. Wolfram, S. Cellular Automata and Complexity: Collected Papers; Addison-Wesley Pub. Co.: Reading, MA, USA, 1994. [Google Scholar]
  11. Tisseur, P. Cellular Automata and Lyapunov Exponents. Nonlinearity 2000, 13, 1547–1560. [Google Scholar] [CrossRef]
  12. Estevez-Rams, E.; Estevez-Moya, D.; Garcia-Medina, K.; Lora-Serrano, R. Computational Capabilities at the Edge of Chaos for One Dimensional Systems Undergoing Continuous Transitions. Chaos Interdiscip. J. Nonlinear Sci. 2019, 29, 043105. [Google Scholar] [CrossRef]
  13. Estevez-Rams, E.; Estevez-Moya, D.; Aragón-Fernández, B. Phenomenology of Coupled Nonlinear Oscillators. Chaos Interdiscip. J. Nonlinear Sci. 2018, 28, 023110. [Google Scholar] [CrossRef]
  14. Estevez-Rams, E.; Garcia-Medina, K.; Aragón-Fernández, B. Correlation and Collective Behaviour in Adler-Type Locally Coupled Oscillators at the Edge of Chaos. Commun. Nonlinear Sci. Numer. Simul. 2024, 133, 107989. [Google Scholar] [CrossRef]
  15. García Medina, K.; Beltrán, J.L.; Estevez-Rams, E.; Kunka, D. Computational Capabilities of Adler Oscillators Under Weak Local Kuramoto-like Coupling. In Proceedings of the Progress in Artificial Intelligence and Pattern Recognition; Hernández Heredia, Y., Milián Núñez, V., Ruiz Shulcloper, J., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2024; pp. 108–118. [Google Scholar] [CrossRef]
  16. Medina, K.G.; Estevez-Rams, E. Behavior of Circular Chains of Nonlinear Oscillators with Kuramoto-like Local Coupling. AIP Adv. 2023, 13, 035222. [Google Scholar] [CrossRef]
  17. García Medina, K.; Estevez-Rams, E.; Kunka, D. Non-Linear Oscillators with Kuramoto-like Local Coupling: Complexity Analysis and Spatiotemporal Pattern Generation. Chaos Solitons Fractals 2023, 175, 114056. [Google Scholar] [CrossRef]
  18. Alonso, L.M. Complex Behavior in Chains of Nonlinear Oscillators. Chaos Interdiscip. J. Nonlinear Sci. 2017, 27, 063104. [Google Scholar] [CrossRef]
  19. Schiff, J.L. Cellular Automata: A Discrete View of the World; John Wiley & Sons: Hoboken, NJ, USA, 2011. [Google Scholar]
  20. Pedersen, J. Continuous Transitions of Cellular Automata. Complex Syst. 1990, 4, 653–665. [Google Scholar]
  21. Wolfram, S. Tables of Cellular Automaton Properties. Available online: https://content.wolfram.com/sw-publications/2020/07/cellular-automaton-properties.pdf (accessed on 30 July 2025).
  22. Jaeger, H. The “Echo State” Approach to Analysing and Training Recurrent Neural Networks; GMD Forschungszentrum Informationstechnik: Sankt Augustin, Germany, 2001. [Google Scholar] [CrossRef]
  23. Grassberger, P. Information and Complexity Measures in Dynamical Systems. In Information Dynamics; Atmanspacher, H., Scheingraber, H., Eds.; Springer: Boston, MA, USA, 1991; Volume 256, pp. 15–33. [Google Scholar] [CrossRef]
  24. Crutchfield, J.P.; Feldman, D.P. Regularities Unseen, Randomness Observed: Levels of Entropy Convergence. Chaos Interdiscip. J. Nonlinear Sci. 2003, 13, 25–54. [Google Scholar] [CrossRef]
  25. Li, M.; Chen, X.; Li, X.; Ma, B.; Vitanyi, P. The Similarity Metric. IEEE Trans. Inf. Theory 2004, 50, 3250–3264. [Google Scholar] [CrossRef]
  26. Vitányi, P.M.B.; Balbach, F.J.; Cilibrasi, R.L.; Li, M. Normalized Information Distance. In Information Theory and Statistical Learning; Emmert-Streib, F., Dehmer, M., Eds.; Springer: Boston, MA, USA, 2009; pp. 45–82. [Google Scholar] [CrossRef]
  27. Estevez-Rams, E.; Lora-Serrano, R.; Nunes, C.A.J.; Aragón-Fernández, B. Lempel-Ziv Complexity Analysis of One Dimensional Cellular Automata. Chaos Interdiscip. J. Nonlinear Sci. 2015, 25, 123106. [Google Scholar] [CrossRef] [PubMed]
  28. Winfree, A.T. Biological Rhythms and the Behavior of Populations of Coupled Oscillators. J. Theor. Biol. 1967, 16, 15–42. [Google Scholar] [CrossRef] [PubMed]
  29. Acebrón, J.A.; Bonilla, L.L.; Pérez Vicente, C.J.; Ritort, F.; Spigler, R. The Kuramoto Model: A Simple Paradigm for Synchronization Phenomena. Rev. Mod. Phys. 2005, 77, 137–185. [Google Scholar] [CrossRef]
  30. Wolfram, S. Computation Theory of Cellular Automata. Commun. Math. Phys. 1984, 96, 15–57. [Google Scholar] [CrossRef]
  31. Crutchfield, J.P. What Lies Between Order and Chaos? In Art and Complexity; Elsevier: Amsterdam, The Netherlands, 2003; pp. 31–45. [Google Scholar] [CrossRef]
  32. Crutchfield, J.P.; Young, K. Inferring Statistical Complexity. Phys. Rev. Lett. 1989, 63, 105–108. [Google Scholar] [CrossRef] [PubMed]
  33. Feldman, D.P.; McTague, C.S.; Crutchfield, J.P. The Organization of Intrinsic Computation: Complexity-Entropy Diagrams and the Diversity of Natural Information Processing. Chaos Interdiscip. J. Nonlinear Sci. 2008, 18, 043106. [Google Scholar] [CrossRef] [PubMed]
  34. Berezansky, L.; Braverman, E.; Idels, L. The Mackey–Glass Model of Respiratory Dynamics: Review and New Results. Nonlinear Anal. Theory Methods Appl. 2012, 75, 6034–6052. [Google Scholar] [CrossRef]
Figure 1. (a) Entropic landscape across parameter space for the LAP model. Several distinctive dynamical regimes were identified, with the wedge and needle regions being the most promising regarding computational power, as indicated by their entropic marker values. (b) Enhanced computational power was verified at the border between the needle and chaotic regions, as indicated by a sudden jump in effective complexity on the verge of a transition towards chaos. The emergence of long-lived long-range patterns becomes evident in binary versions of spacetime matrices at representative points (a, b and c) in parameter space (taken and modified from [12]).
Figure 2. (a) Entropic landscape across parameter space for the LAK model. Several distinctive dynamical regimes were also identified in this case, with the two main candidates for natural computation being labelled as complex regions (II and III). This is further supported by their entropic measures and spacetime matrices [17]. (b) Enhanced computation was also detected for this model in the boundary between the absorbing and complex (III) regions, with the latter proven to be highly sensitive to initial conditions as indicated by d. Enhanced information handling is not necessarily evident in the corresponding spacetime matrix ( γ = 2.832 ), yet it is captured by the proposed information-based framework (taken and modified from [17]).
Figure 3. The stability region of the globally synchronized solution for the LAK model with N = 500 coupled units. Four representative points in parameter space are signalled, and their corresponding spacetime matrices are also shown (top row on the right). The different dynamics detected through entropic measures become evident upon visual inspection of the binary versions of the spacetime matrices. Furthermore, for almost every long-term dynamic identified in the LAK model, a visually equivalent instance can be found in different complexity classes [10,19,21,30] of ECA space, as seen in the bottom row (taken from [15]).
Figure 4. (a) Entropy–complexity diagrams for continuously deformed rule 78 (top) and the LAP model (bottom). The diagram corresponding to the GECA was built by continuous deformation of the 6th bit in the ECA rule, while the diagram corresponding to the LAP model was obtained by randomly sampling the parameter space. Final entropic values are plotted in both cases. Both curves are very similar in shape, characteristic of systems undergoing transitions towards chaotic regimes [12], with the bottom one being smoother, indicating the continuous nature of the first derivative in parameter space. (b) Behaviour of entropic measures (h and E) and symbol density ( ρ ) along the ω = 2 line in parameter space. Specific γ values are highlighted with dashed lines, while visually and dynamically equivalent ECA rules are marked with solid colour dots. The closer the dots lie to the curves, the more similar the systems in question are (taken and modified from [12,15]).
Figure 5. (a) Schematic representation of the general ESN architecture (top) and the substitution idea (bottom). The conventional reservoir, often a Watts–Strogatz or Erdős–Rényi graph, is substituted by an alternative computationally capable dynamical system. (b) Proof of concept of dynamical system integration in ESN architectures: prediction made by the network, for a Mackey–Glass time series with τ = 16.8 , using a GECA reservoir with a rule resulting from the continuous deformation of the 4th bit in rule 46 ( ξ = 0.0174 ). Predictions look promising up to T = 400 , with the best forecasting limited to the early times, as shown in (c) for a different initial condition.