
Dynamical Systems and Brain Inspired Computing

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Signal and Data Analysis".

Deadline for manuscript submissions: closed (20 October 2021) | Viewed by 6038

Special Issue Editors


Guest Editor
Laboratoire d’Information Quantique CP224, Université libre de Bruxelles, 1050 Brussels, Belgium
Interests: quantum information theory; experimental quantum and nonlinear optics; machine learning

Guest Editor
Applied Physics Research Group, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium
Interests: modeling and nonlinear dynamics of semiconductor lasers; synchronization phenomena and bio-inspired information processing

Guest Editor
LMOPS EA 4423 Laboratory, CentraleSupélec and Université de Lorraine, F-57000 Metz, France
Interests: reservoir computing; photonic computing; FPGA; computer vision

Special Issue Information

The brain is one of the greatest enigmas of our time. It can deal with novelty and ambiguity and solve hard tasks such as visual scene analysis and locomotion control. It does so with very low energy consumption, using a huge number of neurons operating in parallel – the complete opposite of the von Neumann architecture used in digital computers. Recent spectacular advances in artificial intelligence show that brain-inspired algorithms can solve problems that appear unfeasible with traditional approaches. In parallel, Moore's law is reaching its limit, due to the physical barriers to further miniaturization of integrated electrical circuits. These facts encourage scientists to develop novel brain-inspired paradigms for machine computing. The challenge is to develop powerful new algorithms for solving hard tasks; to gain deeper insights into how dynamical systems, including the brain, process information; and to develop novel unconventional hardware for fast, efficient information processing.

Recent progress, both theoretical and experimental, has invigorated the field. Topics such as reservoir computing, Ising machines, and accelerators for artificial neural networks are addressing the above challenge. These approaches are often based on high-dimensional dynamical systems that exhibit a transition from stability to deterministic chaos – the kind of systems studied by physicists interested in complex systems, and also the kind of structures that may exist in the brain. These paradigms intimately connect the fields of physics, information theory, computer science, and engineering. They raise new questions, such as the following: How can the computational power of such unconventional computing systems be measured? How can they be trained to execute useful tasks? Which applications are best suited for these novel computing paradigms? What kinds of hardware are best suited for implementations?
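To make the reservoir computing paradigm mentioned above concrete, the following is a minimal, self-contained sketch of an echo state network: a fixed random recurrent "reservoir" is driven by the input, and only the linear readout weights are trained. All sizes, parameter values, and the toy prediction task are illustrative assumptions, not drawn from any of the papers in this issue.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Fixed random input and recurrent weights; the recurrent matrix is rescaled
# so its spectral radius is below 1, a common heuristic for stable dynamics.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]
X = run_reservoir(u[:-1])   # states, shape (1999, n_res)
y = u[1:, 0]                # targets: the next input value

# Training touches only the readout: ridge regression on the states.
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)
pred = X @ W_out
print("train NMSE:", np.mean((pred - y) ** 2) / np.var(y))
```

Because the reservoir itself is never modified, the same scheme carries over directly to physical substrates (optical, electronic) where only the readout is implemented in software.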

This Special Issue focuses on the above questions and the challenges in the development of novel computing architectures inspired by artificial intelligence algorithms and by how the brain processes information. The topics include, but are not limited to, the following:

  • Reservoir computing, Ising machines, and accelerators for artificial neural networks
  • Novel methods for using dynamical systems for information processing
  • Theoretical analysis of information processing capability of dynamical systems, including methods based on information theory and entropy
  • Experimental implementations of brain-inspired computing, including optical and (unconventional) electronic implementations
  • Practical applications of such systems, for instance to telecommunications
  • Connections to other areas such as neuroscience or soft robotics

Prof. Dr. Serge Massar
Prof. Dr. Guy Van der Sande
Dr. Piotr Antonik
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

13 pages, 2641 KiB  
Article
Reservoir Computing with Delayed Input for Fast and Easy Optimisation
by Lina Jaurigue, Elizabeth Robertson, Janik Wolters and Kathy Lüdge
Entropy 2021, 23(12), 1560; https://doi.org/10.3390/e23121560 - 23 Nov 2021
Cited by 24 | Viewed by 3614
Abstract
Reservoir computing is a machine learning method that solves tasks using the response of a dynamical system to a certain input. As the training scheme only involves optimising the weights of the responses of the dynamical system, this method is particularly suited for hardware implementation. Furthermore, the inherent memory of dynamical systems suitable for use as reservoirs means that this method has the potential to perform well on time series prediction tasks, as well as other tasks with time dependence. However, reservoir computing still requires extensive task-dependent parameter optimisation in order to achieve good performance. We demonstrate that, by including a time-delayed version of the input for various time series prediction tasks, good performance can be achieved with an unoptimised reservoir. Furthermore, we show that by including the appropriate time-delayed input, one unaltered reservoir can perform well on six different time series prediction tasks at very low computational expense. Our approach is of particular relevance to hardware-implemented reservoirs, as one does not necessarily have access to pertinent optimisation parameters in physical systems, but the inclusion of an additional input is generally possible.
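The delayed-input idea described in the abstract can be sketched as follows: alongside the current input u(t), the reservoir also receives u(t − d) through a second input channel, for some task-dependent delay d. The simple NumPy reservoir and all parameter values below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_res, delay = 100, 10

# Two input channels: the current input u(t) and the delayed input u(t - d).
W_in = rng.uniform(-0.5, 0.5, (n_res, 2))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def delayed_input_states(u, d):
    """Drive the reservoir with [u(t), u(t - d)] and return the state matrix."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        u_del = u[t - d] if t >= d else 0.0  # zero-pad before the delay kicks in
        x = np.tanh(W @ x + W_in @ np.array([u[t], u_del]))
        states.append(x.copy())
    return np.array(states)

u = np.sin(np.linspace(0, 20 * np.pi, 1500))
X = delayed_input_states(u, delay)
print(X.shape)
```

The readout is then trained on these states exactly as in standard reservoir computing; only the scalar delay d is tuned per task, instead of the reservoir's internal parameters.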
(This article belongs to the Special Issue Dynamical Systems and Brain Inspired Computing)

19 pages, 1091 KiB  
Article
Photonic Reservoir Computer with Output Expansion for Unsupervized Parameter Drift Compensation
by Jaël Pauwels, Guy Van der Sande, Guy Verschaffelt and Serge Massar
Entropy 2021, 23(8), 955; https://doi.org/10.3390/e23080955 - 26 Jul 2021
Cited by 4 | Viewed by 1797
Abstract
We present a method to improve the performance of a reservoir computer by keeping the reservoir fixed and increasing the number of output neurons. The additional neurons are nonlinear functions, typically chosen randomly, of the reservoir neurons. We demonstrate the usefulness of this expanded output layer on an experimental opto-electronic system subject to slow parameter drift, which results in a loss of performance. We can partially recover the lost performance by using the output layer expansion. The proposed scheme allows for a trade-off between performance gains and system complexity.
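The output-expansion idea can be sketched in a few lines: keep the recorded reservoir states fixed, append extra readout features that are random nonlinear functions of those states, and retrain only the linear readout on the expanded feature set. The random-feature choice (tanh of random projections), the stand-in data, and all sizes below are illustrative assumptions, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(2)
T, n_res, n_extra = 500, 50, 100

X = rng.normal(0, 1, (T, n_res))   # stand-in for recorded reservoir states
y = np.sin(X[:, 0] * X[:, 1])      # stand-in nonlinear target

# Extra output neurons: random projections of the states passed through tanh.
P = rng.normal(0, 1.0 / np.sqrt(n_res), (n_res, n_extra))
X_exp = np.hstack([X, np.tanh(X @ P)])

def ridge(X, y, reg=1e-6):
    """Linear readout weights via ridge regression."""
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ y)

err_plain = np.mean((X @ ridge(X, y) - y) ** 2)
err_exp = np.mean((X_exp @ ridge(X_exp, y) - y) ** 2)
print(err_plain, err_exp)  # the expanded readout achieves a lower training error
```

Since the expanded feature set contains the original states as a subset, the retrained readout can only improve the fit; the cost is a larger (but still purely linear) output layer, which is the complexity trade-off the abstract refers to.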
(This article belongs to the Special Issue Dynamical Systems and Brain Inspired Computing)
