The Resonant Brain: A Themed Issue Dedicated to Professor Stephen Grossberg
A special issue of Information (ISSN 2078-2489). This special issue belongs to the section "Artificial Intelligence".
Deadline for manuscript submissions: 30 September 2024
Special Issue Editors
Interests: cognitive neuroscience; brain; cognitive psychology; behavior; perceptual learning and memory; neural networks; consciousness; philosophy of artificial intelligence; principles of unsupervised learning; computing and philosophy
Special Issue Information
Dear Colleagues,
This Special Issue addresses how the human brain processes information in all its complexity to give rise to what is called “the mind”. Information is a concept that reaches well beyond the realm of data; in the broadest sense, it may be defined as everything that we are able to process in our brains to produce knowledge. Clearly, the concept of information would not exist without a human brain capable of perceiving and processing it, and there would be no definition of the concept without a conscious mind capable of providing one. Information produces what we humans call knowledge, and knowledge is formed inside, and transformed by, what we call the human mind. Yet how can a mind understand itself? How is it possible to understand the processes our brain uses to understand the world? Professor Stephen Grossberg’s book Conscious Mind, Resonant Brain: How Each Brain Makes a Mind, published in 2021, provides an introductory and self-contained account of some of the exciting answers to these questions that modern theories of mind and brain have recently proposed. Stephen Grossberg is widely acknowledged as a pioneer and research leader in the field of neural networks who has, for the past 50 years, modeled how brains give rise to minds, and notably how neural circuits in multiple brain regions interact to generate psychological functions. This research has led to a unified understanding of how, where, and why our brains can consciously see, hear, feel, and know about the world, and effectively plan and act within it. His lifelong work has sought to clarify how autonomous adaptive intelligence is achieved. It provides mechanistic explanations of adaptive behaviors, as well as solutions to large-scale problems in machine learning, technology, and Artificial Intelligence, offering a blueprint for autonomously intelligent algorithms and robots.
As brains embody a universal developmental code, unifying insights also emerge about shared laws that are found in all living cellular tissues, from the most primitive to the most advanced, notably how the laws governing networks of interacting cells support developmental and learning processes in all species. The fundamental brain design principles of complementarity, uncertainty, and resonance that Grossberg has discovered also reflect laws of the physical world with which our brains ceaselessly interact, and which enable our brains to incrementally learn to understand those laws, thereby enabling humans to understand the world scientifically.
This Special Issue is dedicated to Stephen Grossberg, Professor Emeritus in the Department of Biomedical Engineering at Boston University (BU), Wang Professor of Cognitive and Neural Systems, and former Director of the Center for Adaptive Systems at BU. Grossberg is an internationally acclaimed scientist and pioneer in the fundamental principles, mechanisms, and model architectures that form the foundation of contemporary neural network research. Grossberg and his colleagues have built models that have been used to analyze and predict interdisciplinary data about mind and brain and to suggest novel architectures for technological applications. In this Special Issue, we invite articles on interdisciplinary topics devoted to cooperative–competitive processes underlying brain integration for human information processing and the design of predictive Artificial Intelligence and/or conscious representation by the human mind.
Both research papers and review articles are welcome.
Prof. Dr. Birgitta Dresp-Langley
Prof. Dr. Luiz Pessoa
Guest Editors
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, authors can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Information is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
Benefits of Publishing in a Special Issue
- Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
- Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
- Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
- External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
- e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.
Further information on MDPI's Special Issue policies can be found here.
Planned Papers
The list below represents only planned manuscripts; some of these manuscripts have not yet been received by the Editorial Office. Papers submitted to MDPI journals are subject to peer review.
Title: How focusing attention aids auditory processing
Authors: Adam Reeves
Affiliation: Department of Psychology, Northeastern University, Boston, MA 02115, USA
Abstract: What is the effect of focusing auditory attention on an upcoming signal tone? Weak signal tones, 40 ms in duration, were presented in 50 dB continuous white noise and either uncued or cued 82 ms beforehand by a 12 dB SL cue tone of the same frequency and duration as the signal. Signal frequency was either constant for a block of trials or was randomly one of 11 frequencies from 632 to 3140 Hz. Slopes of psychometric functions for detection in single-interval (Yes/No) trials were obtained from listeners by varying signal level over a 1 - 9 dB range. Plots of log(d’) against signal dB were fit by linear functions. Results: (1) Slopes were the same whether signal frequency was constant or varied, as found by Green (1961). (2) Slopes for uncued tones increased by 14% to 20% more than predicted by signal energy. (3) Slopes for cued tones followed signal energy contaminated by weak internal noise. Conclusion: valid pre-cues help attention focus rapidly on signal frequency and permit listeners to act as near-ideal detectors of signal energy, supporting a key hypothesis of Grossberg’s ART model that attention guided by conscious awareness can optimize performance.
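As an illustrative aside (not the authors' actual analysis pipeline), the slope estimation described in this abstract, a linear fit of log(d′) against signal level in dB, can be sketched as follows; the signal levels and d′ values here are synthetic and purely hypothetical.

```python
import numpy as np

def fit_dprime_slope(signal_db, d_prime):
    """Fit log10(d') as a linear function of signal level (dB).

    Returns (slope, intercept) of the least-squares line; steeper
    slopes indicate faster growth of detectability with signal energy.
    """
    log_d = np.log10(d_prime)
    slope, intercept = np.polyfit(signal_db, log_d, 1)
    return slope, intercept

# Hypothetical illustration over the 1-9 dB range mentioned in the
# abstract: synthetic data generated to lie exactly on a line in
# log(d') vs. dB coordinates, so the fit recovers the true parameters.
levels = np.array([1.0, 3.0, 5.0, 7.0, 9.0])   # signal levels, dB
d_prime = 10 ** (0.1 * levels - 0.2)           # synthetic log-linear d'
slope, intercept = fit_dprime_slope(levels, d_prime)
```

Comparing such fitted slopes across cued and uncued conditions against the slope predicted for an ideal energy detector is the kind of analysis the abstract describes.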
Title: From Information to Knowledge: The Storage and Role of Knowledge in Neural Networks
Authors: Jagmeet S. Kanwal
Affiliation: Georgetown University School of Medicine
Abstract: The brain stores and retrieves information from memories via the sensory information received through the peripheral nervous system and from the activity of neurons within the central nervous system. Short-term memories are stored in the hippocampus, and long-term memories are likely distributed throughout the brain in the form of neural networks. Internally generated neural activity and information received from external stimuli together guide an organism's behavior and decision making. It is theorized that the brain functions like a phonebook that can dial a particular network to access a specific memory. However, memories are only useful when they are stored as knowledge, not as disparate bits of information. How information and memories are converted into knowledge, and the processes by which knowledge contributes to explicit and implicit decision making, remain largely unclear. Knowledge is stored in these networks and, once acquired, can be used to direct behavior. Additionally, various levels of attention play a role in how knowledge is acquired and used. Lastly, sleep helps protect developed knowledge and prevent neural networks from being modified. A deeper understanding of knowledge networks is not only important from an artificial intelligence perspective, but also has practical implications for how education must be designed to create knowledge rather than disconnected bits of information. This perspective article examines the difference between information, memory, and knowledge. It is proposed that knowledge networks are important for decision-making and the survival of a species. These decision networks depend upon re-afference of information from multiple brain regions, and this constellation of networks must work together to recall and expand knowledge within a system exhibiting intelligence. Knowledge must be stored in such a way that it can be accessed by multiple agents, tokens, or handles at any time.
The emergence of knowledge networks and their appropriate usage underlies the evolution of complex brains that can imagine, predict and expand knowledge.
Title: Trustworthy Self-Organizing AI for Rapid Detection of Critical Changes in Landscapes: Las Vegas Across the Years
Authors: John M. Wandeto and Birgitta Dresp-Langley
Affiliation: --
Abstract: Self-organization is the major principle of all biological learning and a core concept in Adaptive Resonance Theory, which has been widely used for modeling intelligent-system learning of visual data. Visual images may reveal important aspects of physical change across time. An AI system that detects such change automatically in environmental image data representing natural or urban landscapes may be of service to citizens, historians, or policymakers. Here, we exploit an Artificial Intelligence (AI) algorithm with unsupervised Self-Organizing Map (SOM) learning and, more specifically, its quantization error (QE). The QE is a SOM output metric that has proven consistently reliable and to-the-pixel precise in detecting the smallest relevant changes in large image data. We applied the algorithm to analyses of satellite images of Las Vegas generated across the years 1984-2008, a period of major restructuring of the urban landscape. Statistical trend analyses show how the QE from SOM learning of image data for specific geographic regions of interest can be exploited for detecting the magnitude and the direction of structural changes across the reference time period, significantly correlated with demographic data for the same period. The AI approach satisfies the seven criteria for "trustworthiness" stipulated in the ALTAI (Assessment List for Trustworthy AI) of the European Commission and can easily be implemented.
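As an illustrative aside, the quantization error described in this abstract, i.e. the mean distance from each input to its best-matching SOM unit, can be sketched with a minimal NumPy implementation. The grid size, learning schedule, and data below are hypothetical stand-ins, not the parameters or imagery used in the study.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=20, lr0=0.5, seed=0):
    """Train a small Self-Organizing Map; returns the unit weight vectors."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    w = rng.random((n_units, data.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0])
                       for j in range(grid[1])], dtype=float)
    sigma0 = max(grid) / 2.0
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)                 # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 1e-3    # shrinking neighborhood
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))  # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))    # neighborhood kernel
            w += lr * h[:, None] * (x - w)
            t += 1
    return w

def quantization_error(data, w):
    """Mean Euclidean distance from each input to its best-matching unit."""
    d = np.sqrt(((data[:, None, :] - w[None, :, :]) ** 2).sum(axis=2))
    return d.min(axis=1).mean()

# Hypothetical demo: the QE stays low for data resembling what the map
# was trained on and rises sharply for out-of-distribution data, which
# is the signal used to flag structural change between images.
rng = np.random.default_rng(1)
baseline = rng.random((200, 3))                  # e.g. RGB pixel vectors
w = train_som(baseline)
qe_baseline = quantization_error(baseline, w)
qe_changed = quantization_error(baseline + 2.0, w)
```

In a change-detection setting along these lines, a SOM trained on a reference image yields a baseline QE, and a later image of the same region is flagged as changed when its QE under the same trained map departs significantly from that baseline.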
Title: Resonance and Adaptation: Inspirations to Explore Auditory-Motor Coupling in Language Acquisition
Authors: Yang Zhang
Affiliation: Department of Speech-Language-Hearing Sciences & Masonic Institute for the Developing Brain, University of Minnesota, Minneapolis, MN 55455, USA
Abstract: Understanding the intricate interplay between auditory perception and motor control in speech production is a central goal in cognitive and neurolinguistic research. This article examines the coupling between these processes within the framework of Adaptive Resonance Theory (ART), shedding light on how the brain achieves stable and meaningful representations in the dynamic context of speech. Drawing inspiration from ART's core tenets, we delve into how auditory-motor coupling impacts speech perception accuracy and fluency. We explore evidence from behavioral experiments that reveal how predictions and sensory cues interact to shape speech production. Furthermore, we discuss experiential and developmental factors in shaping behavioral and neural measures that elucidate the brain structures and mechanisms implicated in auditory-motor integration. This synthesis underscores ART's relevance in explicating how the brain resonates with auditory signals and motor intentions, while adaptively adjusting to maintain coherence and stability. By embracing ART's principles, we bridge the gap between cognitive theories and empirical research, offering a comprehensive developmental framework to decipher the intricate dynamics of speech perception and language acquisition within the rich context of auditory-motor coupling for a better understanding of individual differences in relation to language delay and disorders.