Search Results (8)

Search Parameters:
Keywords = quantum spiking neural network

13 pages, 17818 KiB  
Article
Optical Mapping and On-Demand Selection of Local Hysteresis Properties in VO2
by Melissa Alzate Banguero, Sayan Basak, Nicolas Raymond, Forrest Simmons, Pavel Salev, Ivan K. Schuller, Lionel Aigouy, Erica W. Carlson and Alexandre Zimmers
Condens. Matter 2025, 10(1), 12; https://doi.org/10.3390/condmat10010012 - 13 Feb 2025
Viewed by 1275
Abstract
Quantum materials have tremendous potential for disruptive applications. However, scaling devices down has been challenging due to electronic inhomogeneities in many of these materials. Understanding and controlling these electronic patterns on a local scale has thus become crucial to enabling new applications. To address this issue, we have developed a new optical microscopy method that allows precise, quasi-continuous filming of the insulator-to-metal transition in VO2 with fine temperature steps. This enables us to track metal and insulator domains over thousands of images and quantify, for the first time, the local hysteresis properties of VO2 thin films. Analysis of the maps quantifies the cycle-to-cycle reproducibility of the local transitions and reveals a positive correlation between the local insulator–metal transition temperatures Tc and the local hysteresis widths ΔTc. These maps also enable the optical selection of regions of high or low transition temperature combined with large or nearly absent local hysteresis. They pave the way to understanding, and turning to advantage, the stochasticity in these materials by picking transition properties on demand, allowing the scaling down of devices such as optical switches, infrared microbolometers, and spiking neural networks.
(This article belongs to the Special Issue Superstripes Physics, 3rd Edition)
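As a rough illustration of the kind of on-demand selection such maps enable, the sketch below derives a per-pixel transition temperature Tc and hysteresis width ΔTc from heating- and cooling-branch transition temperatures and picks regions matching a target. All values, thresholds, and function names are illustrative, not data or code from the paper:

```python
# Hypothetical sketch: deriving local hysteresis properties from per-pixel
# transition temperatures. The arrays and thresholds below are toy values.

def hysteresis_maps(T_heat, T_cool):
    """Per-pixel transition temperature Tc (midpoint) and hysteresis width dTc."""
    Tc = [(h + c) / 2 for h, c in zip(T_heat, T_cool)]
    dTc = [h - c for h, c in zip(T_heat, T_cool)]
    return Tc, dTc

def select_pixels(Tc, dTc, tc_min, dtc_max):
    """Pick pixel indices with high Tc and nearly absent hysteresis."""
    return [i for i, (t, d) in enumerate(zip(Tc, dTc))
            if t >= tc_min and d <= dtc_max]

# Toy 1D "map" of four pixels (temperatures in Kelvin).
T_heat = [342.0, 345.0, 341.0, 348.0]
T_cool = [336.0, 344.0, 332.0, 347.0]
Tc, dTc = hysteresis_maps(T_heat, T_cool)
picked = select_pixels(Tc, dTc, tc_min=343.0, dtc_max=2.0)
print(picked)  # -> [1, 3]: pixels with Tc >= 343 K and width <= 2 K
```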

26 pages, 1456 KiB  
Article
Brain-Inspired Agents for Quantum Reinforcement Learning
by Eva Andrés, Manuel Pegalajar Cuéllar and Gabriel Navarro
Mathematics 2024, 12(8), 1230; https://doi.org/10.3390/math12081230 - 19 Apr 2024
Viewed by 2826
Abstract
In recent years, advancements in brain science and neuroscience have significantly influenced the field of computer science, particularly in the domain of reinforcement learning (RL). Drawing insights from neurobiology and neuropsychology, researchers have leveraged these findings to develop novel mechanisms for understanding intelligent decision-making processes in the brain. Concurrently, the emergence of quantum computing has opened new frontiers in artificial intelligence, leading to the development of quantum machine learning (QML). This study introduces a novel model that integrates quantum spiking neural network (QSNN) and quantum long short-term memory (QLSTM) architectures, inspired by the complex workings of the human brain. Specifically designed for reinforcement learning tasks in energy-efficient environments, our approach progresses through two distinct stages mirroring sensory and memory systems. In the initial stage, analogous to the brain's hypothalamus, low-level information is extracted to emulate sensory data processing patterns. Subsequently, resembling the hippocampus, this information is processed at a higher level, capturing and memorizing correlated patterns. We conducted a comparative analysis of our model against existing quantum models, including quantum neural networks (QNNs), QLSTM, and QSNN, as well as their classical counterparts, elucidating its unique contributions. Empirical results demonstrate the effectiveness of these brain-inspired quantum models, which outperform the classical approaches and other quantum models in an energy-optimization use case, specifically in terms of average, best, and worst total reward, test reward, robustness, and learning curve.
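For readers unfamiliar with spiking dynamics, a minimal classical leaky integrate-and-fire (LIF) neuron, the classical counterpart of the spiking units used in QSNNs, can be sketched as follows (all parameters are illustrative, not the paper's):

```python
# A minimal classical LIF neuron: the membrane potential leaks toward rest,
# integrates input current, and emits a spike (then resets) at threshold.

def lif_spikes(inputs, tau=10.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Integrate an input current sequence; return spike times (indices)."""
    v, spikes = v_reset, []
    for t, i_in in enumerate(inputs):
        v += dt * (-v / tau + i_in)   # leak plus input drive
        if v >= v_thresh:             # fire and reset
            spikes.append(t)
            v = v_reset
    return spikes

# A constant drive makes the neuron fire periodically.
print(lif_spikes([0.3] * 20))  # -> [3, 7, 11, 15, 19]
```

Information is then carried by spike timing rather than by continuous activations, which is the encoding the brain-inspired stages above operate on.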

32 pages, 2030 KiB  
Article
Generalized Neuromorphism and Artificial Intelligence: Dynamics in Memory Space
by Said Mikki
Symmetry 2024, 16(4), 492; https://doi.org/10.3390/sym16040492 - 18 Apr 2024
Cited by 1 | Viewed by 2016
Abstract
This paper introduces a multidisciplinary conceptual perspective encompassing artificial intelligence (AI), artificial general intelligence (AGI), and cybernetics, framed within what we call the formalism of generalized neuromorphism. Drawing from recent advancements in computing, such as neuromorphic computing and spiking neural networks, as well as principles from the theory of open dynamical systems and stochastic classical and quantum dynamics, this formalism is tailored to model generic networks comprising abstract processing events. A pivotal aspect of our approach is the incorporation of the memory space and the intrinsic non-Markovian nature of the abstract generalized neuromorphic system. We envision future computations taking place within an expanded space (memory space) and leveraging memory states. Positioned at a high level of abstraction, generalized neuromorphism facilitates multidisciplinary applications across various approaches within the AI community.
(This article belongs to the Section Mathematics)

19 pages, 1473 KiB  
Article
Magnetic Flux Sensor Based on Spiking Neurons with Josephson Junctions
by Timur Karimov, Valerii Ostrovskii, Vyacheslav Rybin, Olga Druzhina, Georgii Kolev and Denis Butusov
Sensors 2024, 24(7), 2367; https://doi.org/10.3390/s24072367 - 8 Apr 2024
Cited by 10 | Viewed by 2007
Abstract
Josephson junctions (JJs) are superconductor-based devices used to build highly sensitive magnetic flux sensors called superconducting quantum interference devices (SQUIDs). These sensors vary in design, including the radio frequency (RF) SQUID, the direct current (DC) SQUID, and hybrids such as the D-SQUID. In addition, JJs have recently found many applications in spiking neuron models exhibiting nearly biological behavior. In this study, we propose and investigate a new circuit model of a sensory neuron that incorporates a DC SQUID as part of the circuit. The dependence of the designed model's dynamics on the external magnetic flux is demonstrated. The design of the circuit and the derivation of the corresponding differential equations that describe the dynamics of the system are given. Numerical simulation is used for experimental evaluation. The experimental results confirm the applicability and good performance of the proposed magnetic-flux-sensitive neuron concept: the considered device can encode the magnetic flux in the form of neuronal dynamics with a linear section. Furthermore, some complex behavior was discovered in the model, namely intermittent chaotic spiking and plateau bursting. The proposed design can be efficiently applied to developing interfaces between circuitry and spiking neural networks. However, it should be noted that the proposed neuron design shares the main limitation of all superconductor-based technologies, i.e., the need for a cryogenic and shielding system.
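The paper's circuit equations are not reproduced here, but the standard normalized RCSJ (resistively and capacitively shunted junction) model, a common building block in JJ neuron circuits, can be sketched as a toy simulation; the bias current, damping parameter, and step size below are illustrative:

```python
import math

# Euler integration of the dimensionless RCSJ phase equation:
#   beta_c * phi'' + phi' + sin(phi) = i_bias
# Below the critical current (i_bias < 1) the phase locks (zero-voltage state);
# above it the phase winds continuously (voltage state).

def rcsj_phase(i_bias, beta_c=0.5, dt=1e-3, steps=20000):
    """Return the junction phase after integrating for steps*dt time units."""
    phi, dphi = 0.0, 0.0
    for _ in range(steps):
        ddphi = (i_bias - dphi - math.sin(phi)) / beta_c
        dphi += dt * ddphi
        phi += dt * dphi
    return phi

print(rcsj_phase(1.5) > 2 * math.pi)   # True: phase winds freely (voltage state)
print(rcsj_phase(0.5) < 2 * math.pi)   # True: phase stays locked near arcsin(0.5)
```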

37 pages, 548 KiB  
Review
Survey of Optimization Algorithms in Modern Neural Networks
by Ruslan Abdulkadirov, Pavel Lyakhov and Nikolay Nagornov
Mathematics 2023, 11(11), 2466; https://doi.org/10.3390/math11112466 - 26 May 2023
Cited by 64 | Viewed by 21366
Abstract
The main goal of machine learning is the creation of self-learning algorithms in many areas of human activity, allowing artificial intelligence to replace humans in tasks where production must be expanded. The theory of artificial neural networks, which have already replaced humans in many problems, remains the most widely used branch of machine learning. Thus, one must select appropriate neural network architectures, data processing methods, and advanced applied mathematics tools. A common challenge for these networks is achieving the highest accuracy in a short time. This problem is usually addressed by modifying networks and improving data pre-processing, in which case accuracy increases along with training time. By using optimization methods, one can improve the accuracy without increasing the time. In this review, we consider the optimization algorithms used in neural networks. We present modifications of optimization algorithms of first, second, and information-geometric order, the latter related to information geometry for Fisher–Rao and Bregman metrics. These optimizers have significantly influenced the development of neural networks through geometric and probabilistic tools. We present applications of all the given optimization algorithms, considering the types of neural networks. We then show ways to develop optimization algorithms in further research using modern neural networks. Fractional-order, bilevel, and gradient-free optimizers can replace classical gradient-based optimizers. Such approaches are used in graph, spiking, complex-valued, quantum, and wavelet neural networks. Besides pattern recognition, time series prediction, and object detection, there are many other applications in machine learning: quantum computations, partial differential and integrodifferential equations, and stochastic processes.
(This article belongs to the Special Issue Mathematical Foundations of Deep Neural Networks)
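To make concrete the survey's point that a better optimizer can buy accuracy without extra iterations, here is a minimal sketch (illustrative function and hyperparameters, not from the review) contrasting plain gradient descent with a crudely preconditioned variant, a simple stand-in for the second-order and information-geometric methods it discusses:

```python
# Minimize f(x, y) = x^2 + 100*y^2, an ill-conditioned quadratic.

def grad(p):
    x, y = p
    return (2 * x, 200 * y)

def gd(p, lr=0.009, steps=100):
    """Plain gradient descent: one learning rate for both directions."""
    x, y = p
    for _ in range(steps):
        gx, gy = grad((x, y))
        x, y = x - lr * gx, y - lr * gy
    return x, y

def preconditioned_gd(p, steps=100):
    """Scale each direction by its inverse curvature (1/2 and 1/200).
    For a quadratic this is Newton's method: it converges in one step."""
    x, y = p
    for _ in range(steps):
        gx, gy = grad((x, y))
        x, y = x - gx / 2, y - gy / 200
    return x, y

x_gd, y_gd = gd((1.0, 1.0))
x_pre, y_pre = preconditioned_gd((1.0, 1.0))
print(abs(x_pre) <= abs(x_gd))  # True: same step budget, far higher accuracy
```

Plain gradient descent is throttled by the stiff y-direction (the stable learning rate must stay below 1/100), so the shallow x-direction converges slowly; curvature-aware scaling removes that bottleneck.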

13 pages, 2372 KiB  
Article
Accelerating Extreme Search of Multidimensional Functions Based on Natural Gradient Descent with Dirichlet Distributions
by Ruslan Abdulkadirov, Pavel Lyakhov and Nikolay Nagornov
Mathematics 2022, 10(19), 3556; https://doi.org/10.3390/math10193556 - 29 Sep 2022
Cited by 6 | Viewed by 2379
Abstract
Attaining high accuracy with less complex neural network architectures remains one of the most important problems in machine learning. In many studies, improved recognition and prediction quality is obtained by extending neural networks with ordinary or special neurons, which significantly increases training time. However, employing an optimization algorithm that brings the loss function into the neighborhood of the global minimum can reduce the number of layers and epochs required. In this work, we explore the extreme search of multidimensional functions by a proposed natural gradient descent based on the Dirichlet and generalized Dirichlet distributions. The natural gradient describes a multidimensional surface with probability distributions, which allows us to reduce the variation in the gradient and the step size. The proposed algorithm is equipped with step-size adaptation, which allows it to obtain higher accuracy in a small number of iterations, compared with ordinary gradient descent and adaptive moment estimation. We provide experiments on test functions in three- and four-dimensional spaces, where the natural gradient descent proves its ability to converge in the neighborhood of the global minimum. Such an approach can find application in minimizing the loss function in various types of neural networks, such as convolutional, recurrent, spiking, and quantum networks.
(This article belongs to the Special Issue Mathematical Modeling, Optimization and Machine Learning)
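The paper's Dirichlet-based natural gradient is not reproduced here, but the general idea of per-iteration step-size adaptation can be sketched with a simple backtracking rule on a 2D test function; the function, constants, and names below are illustrative:

```python
# Gradient descent with backtracking step-size adaptation on
# f(x, y) = (x - 1)^2 + 4*(y + 2)^2, whose minimum is at (1, -2).

def f(x, y):
    return (x - 1) ** 2 + 4 * (y + 2) ** 2

def grad(x, y):
    return 2 * (x - 1), 8 * (y + 2)

def adaptive_gd(x, y, steps=60, lr0=1.0, shrink=0.5):
    for _ in range(steps):
        gx, gy = grad(x, y)
        lr = lr0
        # shrink the trial step until it actually decreases f
        while f(x - lr * gx, y - lr * gy) >= f(x, y) and lr > 1e-12:
            lr *= shrink
        x, y = x - lr * gx, y - lr * gy
    return x, y

xa, ya = adaptive_gd(5.0, 5.0)
print(round(xa, 3), round(ya, 3))  # near the minimum at (1, -2)
```

With a fixed step, a rate safe for the stiff y-direction crawls along x; re-choosing the step each iteration sidesteps that trade-off, which is the role step-size adaptation plays in the proposed method.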

14 pages, 32281 KiB  
Article
Cytoskeletal Filaments Deep Inside a Neuron Are not Silent: They Regulate the Precise Timing of Nerve Spikes Using a Pair of Vortices
by Pushpendra Singh, Pathik Sahoo, Komal Saxena, Jhimli Sarkar Manna, Kanad Ray, Subrata Ghosh and Anirban Bandyopadhyay
Symmetry 2021, 13(5), 821; https://doi.org/10.3390/sym13050821 - 7 May 2021
Cited by 23 | Viewed by 8915
Abstract
Hodgkin and Huxley showed that even if the filaments are dissolved, a neuron’s membrane alone can generate and transmit the nerve spike. Regulating the time gap between spikes is the brain’s cognitive key. However, the time modulation mechanism is still a mystery. By inserting a coaxial probe deep inside a neuron, we have repeatedly shown that the filaments transmit electromagnetic signals ~200 μs before an ionic nerve spike sets in. To understand its origin, here, we mapped the electromagnetic vortex produced by a filamentary bundle deep inside a neuron, regulating the nerve spike’s electrical-ionic vortex. We used monochromatic polarized light to measure the transmitted signals beating from the internal components of a cultured neuron. A nerve spike is a 3D ring of the electric field encompassing the perimeter of a neural branch. Several such vortices flow sequentially to keep precise timing for the brain’s cognition. The filaments hold millisecond order time gaps between membrane spikes with microsecond order signaling of electromagnetic vortices. Dielectric resonance images revealed that ordered filaments inside neural branches instruct the ordered grid-like network of actin–beta-spectrin just below the membrane. That layer builds a pair of electric field vortices, which coherently activates all ion-channels in a circular area of the membrane lipid bilayer when a nerve spike propagates. When biomaterials vibrate resonantly with microwave and radio-wave, simultaneous quantum optics capture ultra-fast events in a non-demolition mode, revealing multiple correlated time-domain operations beyond the Hodgkin–Huxley paradigm. Neuron holograms pave the way to understanding the filamentary circuits of a neural network in addition to membrane circuits.
(This article belongs to the Special Issue Quantum Information Applied in Neuroscience)

18 pages, 3216 KiB  
Article
Time-Multiplexed Spiking Convolutional Neural Network Based on VCSELs for Unsupervised Image Classification
by Menelaos Skontranis, George Sarantoglou, Stavros Deligiannidis, Adonis Bogris and Charis Mesaritakis
Appl. Sci. 2021, 11(4), 1383; https://doi.org/10.3390/app11041383 - 3 Feb 2021
Cited by 8 | Viewed by 3054
Abstract
In this work, we present numerical results concerning a multilayer “deep” photonic spiking convolutional neural network, arranged so as to tackle a 2D image classification task. The spiking neurons used are typical two-section quantum-well vertical-cavity surface-emitting lasers that exhibit behavior isomorphic to biological neurons, such as integrate-and-fire excitability and timing encoding. The isomorphism of the proposed scheme to biological networks is extended by replicating the retina ganglion cell for contrast detection in the photonic domain and by utilizing unsupervised spike-dependent plasticity as the main training technique. Finally, we also investigate the possibility of exploiting the fast carrier dynamics of lasers so as to time-multiplex spatial information and reduce the number of physical neurons used in the convolutional layers by orders of magnitude. This last feature unlocks new possibilities, where neuron count and processing speed can be traded off so as to meet the constraints of different applications.
(This article belongs to the Special Issue Photonics for Optical Computing)
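The time-multiplexing idea above can be sketched in plain software: a single "neuron" visits image patches one after another in time, instead of dedicating one physical neuron per output pixel. The kernel and image below are illustrative toys, not the paper's VCSEL model:

```python
# Valid 2D sliding-window filtering (cross-correlation convention, no kernel
# flip) computed by one "neuron" that processes patches serially in time.

def conv2d_time_multiplexed(image, kernel):
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            # the single neuron's response to the patch presented at this slot
            acc = sum(image[r + i][c + j] * kernel[i][j]
                      for i in range(kh) for j in range(kw))
            row.append(acc)
        out.append(row)
    return out

edge_kernel = [[1, -1]]          # crude contrast detector, retina-cell style
img = [[0, 0, 1, 1],
       [0, 0, 1, 1]]
print(conv2d_time_multiplexed(img, edge_kernel))  # -> [[0, -1, 0], [0, -1, 0]]
```

The trade-off is explicit: one processing unit, but one time slot per output pixel, which is why fast laser carrier dynamics make the scheme attractive.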
