# Precise Traits from Sloppy Components: Perception and the Origin of Phenotypic Response

## Abstract


## 1. Introduction

#### Background and Literature

## 2. Materials and Methods

#### 2.1. Perception and Response

#### 2.2. Chaotic Dynamics

#### 2.3. Random Reservoir

## 3. Results

#### 3.1. Predicting Future Inputs

#### 3.2. Critical Learning Period

#### 3.3. Other Ideas for Future Study

## 4. Conclusions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References


**Figure 1.** Estimate for the relative speed of chaotic divergence in the dynamics of the Lorenz–96 equations given in Equation (1), with $N=5$. Here, the Lyapunov exponent, $\lambda$, estimates the relative divergence rate. The analysis in this article focuses on the doubling time for divergence, $\mathit{dbl}=\log 2/\lambda$, in which a lower doubling time means that future values of the trajectory are harder to predict. For a few limited regions of smaller $F$ values, the estimated Lyapunov exponent drops below the trend. Those deviations may arise from numerical limitations or a complex pattern of nearly stable periodicity. Sufficiently complex periodicity poses a significant challenge for prediction. The analyses in this article avoid those erratic regions.
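The caption's doubling-time calculation can be sketched as follows. This is a minimal NumPy reconstruction, not the paper's code (which is Julia, using DynamicalSystems.jl): the Lorenz–96 right-hand side, a fourth-order Runge–Kutta integrator, and the two-trajectory (Benettin) estimator standing in for whatever Lyapunov estimator the original analysis used. All numerical parameters (`dt`, step counts, `d0`) are illustrative.

```python
import numpy as np

def lorenz96(x, F):
    """Lorenz-96 derivatives: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F (cyclic indices)."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, F, dt):
    """One fourth-order Runge-Kutta step of the Lorenz-96 system."""
    k1 = lorenz96(x, F)
    k2 = lorenz96(x + 0.5 * dt * k1, F)
    k3 = lorenz96(x + 0.5 * dt * k2, F)
    k4 = lorenz96(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def doubling_time(F, N=5, dt=0.01, steps=20000, d0=1e-8):
    """Estimate the largest Lyapunov exponent by the two-trajectory
    (Benettin) method, then return the doubling time log(2) / lambda."""
    rng = np.random.default_rng(1)
    x = F + 0.01 * rng.standard_normal(N)   # perturb the fixed point x_i = F
    for _ in range(2000):                   # transient: relax onto the attractor
        x = rk4_step(x, F, dt)
    delta = rng.standard_normal(N)
    y = x + d0 * delta / np.linalg.norm(delta)  # companion trajectory at distance d0
    log_growth = 0.0
    for _ in range(steps):
        x, y = rk4_step(x, F, dt), rk4_step(y, F, dt)
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (y - x) * d0 / d            # renormalize the separation back to d0
    lam = log_growth / (steps * dt)
    return np.log(2.0) / lam
```

For parameter regions where the dynamics are chaotic, the estimate should fall near the 0.5 to 1.4 range of doubling times used in the later figures; the specific $F$ values scanned in Figure 1 are not reproduced here.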

**Figure 2.** Temporal dynamics of environmental state (blue) and system prediction for the environmental state (gold). At each time point, the internal system uses the information in its reservoir to predict the environmental state shift time units into the future. The gold prediction curve is shifted to the right by shift time units, so that the closeness of the match between the two curves describes the quality of the predictions. Above each panel, the parameters N and F describe the environmental dynamics in Equation (1); dbl gives the doubling time for the deviation distance of a small perturbation to the dynamics; res is the reservoir size; and R2_tr and R2_ts are the R-squared values that describe the percentage of the variation in the blue dynamics curve captured by the gold prediction curve for the training and test periods, respectively, as described in the text. The panels (**a**–**c**) have corresponding labels on the curves in Figure 3a. Time units are nondimensional and can be chosen to match the scaling of the environmental process under study. Here, the plots show the 20 time units at the end of the test period of the machine learning procedure used to generate the curves. The abbreviations res, shift, size, dbl, R2_tr, and R2_ts denote variables. Execution times for the parameters in (**b**) with reservoir sizes (res) of 25, 50, and 100 are approximately 58 s, 118 s, and 253 s. Timing was carried out on an Apple Mac Studio M1 Ultra with Julia 1.9.1, source code git commit a7f74f1. The code was not optimized for execution speed.
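The R2_tr and R2_ts scores described in the caption of Figure 2 are ordinary coefficients of determination, computed separately over the training and test windows after aligning the prediction curve with the environmental curve by the shift offset. A minimal sketch of that scoring, with illustrative names (`shift_steps`, `n_train`) that are not from the paper's code:

```python
import numpy as np

def r_squared(actual, predicted):
    """Fraction of the variance in `actual` captured by `predicted`."""
    ss_res = np.sum((actual - predicted) ** 2)
    ss_tot = np.sum((actual - np.mean(actual)) ** 2)
    return 1.0 - ss_res / ss_tot

def shifted_scores(env, pred, shift_steps, n_train):
    """Score predictions made `shift_steps` into the future.

    pred[t] is the forecast of env[t + shift_steps], so the two series
    are aligned before scoring, then split into training and test windows.
    """
    actual = env[shift_steps:]
    forecast = pred[:len(env) - shift_steps]
    r2_tr = r_squared(actual[:n_train], forecast[:n_train])
    r2_ts = r_squared(actual[n_train:], forecast[n_train:])
    return r2_tr, r2_ts
```

A perfect forecaster scores 1.0 in both windows; a forecaster no better than the mean of the environmental curve scores near 0.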

**Figure 3.** Prediction of future environmental state based on the information in a random reservoir network. Figure 2 shows the environmental dynamics and the prediction challenge. In this figure, the y-axis measures the percentage of the total variance (R-squared) in the environmental state explained by the predictions generated from the internal reservoir, reflecting the potential for adaptive response. The x-axis shows the intrinsic predictability of the environment, measured by the time required to double a small initial perturbation to the dynamic trajectory. The different colored lines describe the time shift into the future at which predictions are compared to actual future dynamics. The res_size parameter in each panel gives the size of the random reservoir. The a, b, and c labels in panel (**a**) match the corresponding panels in Figure 2. Each line connects the outcomes at the following 11 approximate doubling times: 0.52, 0.54, 0.58, 0.64, 0.70, 0.77, 0.86, 0.90, 0.99, 1.15, and 1.43. Panels (**a**–**c**) show three different reservoir sizes denoted by the res_size parameter labels.
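The reservoir predictions behind Figures 2 and 3 come from an echo state network: a fixed random recurrent network driven by the environmental signal, with only a linear ridge-regression readout trained to output the input shift steps into the future. The paper's implementation uses ReservoirComputing.jl; the NumPy sketch below shows the same architecture on a simple stand-in signal (the paper drives the reservoir with Lorenz–96 output), with all parameter values illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, shift, washout = 100, 25, 100   # reservoir size, forecast horizon (steps), discarded start

# Stand-in environmental signal; the paper uses Lorenz-96 trajectories here.
t = np.arange(0.0, 60.0, 0.02)
u = np.sin(t) * np.cos(0.5 * t)

# Fixed random reservoir, rescaled to spectral radius 0.9 for the echo state property.
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-1.0, 1.0, size=n_res)

# Drive the reservoir with the signal and record its states.
x = np.zeros(n_res)
states = np.empty((len(u), n_res))
for i, u_t in enumerate(u):
    x = np.tanh(W @ x + w_in * u_t)
    states[i] = x

# Readout targets: the input `shift` steps into the future.
X = states[washout:-shift]
y = u[washout + shift:]
n_tr = int(0.7 * len(y))

# Ridge-regression readout trained on the first 70% of the record.
ridge_reg = 1e-6
A = X[:n_tr]
w_out = np.linalg.solve(A.T @ A + ridge_reg * np.eye(n_res), A.T @ y[:n_tr])

def r2(actual, predicted):
    return 1.0 - np.sum((actual - predicted) ** 2) / np.sum((actual - np.mean(actual)) ** 2)

r2_tr = r2(y[:n_tr], A @ w_out)        # training-window fit
r2_ts = r2(y[n_tr:], X[n_tr:] @ w_out) # test-window fit
```

Only `w_out` is trained; the reservoir itself stays random and fixed, which is the sense in which a precise response emerges from sloppy components. Increasing `n_res` improves the fit, matching the trend shown in Figure 4.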

**Figure 4.** Increasing reservoir size provides better predictions for future environmental state. The analysis follows the methods used in Figure 3. Here, dbl_lo denotes a doubling time of approximately 0.52, and dbl_hi denotes a doubling time of approximately 1.42. The value of shift_lo denotes a prediction into the future over 1.0 time units, and shift_hi denotes a prediction into the future over 2.0 time units. The six different reservoir sizes used in the computer runs are shown as labels for the tick marks along the x-axis.


© 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Frank, S.A.
Precise Traits from Sloppy Components: Perception and the Origin of Phenotypic Response. *Entropy* **2023**, *25*, 1162.
https://doi.org/10.3390/e25081162
