Review

Memristor-Based Spiking Neuromorphic Systems Toward Brain-Inspired Perception and Computing

1 School of Physics and Electronic Engineering, Shanxi Key Laboratory of Wireless Communication and Detection, Shanxi University, Taiyuan 030006, China
2 Yongjiang Laboratory, Ningbo 315201, China
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Nanomaterials 2025, 15(14), 1130; https://doi.org/10.3390/nano15141130
Submission received: 29 May 2025 / Revised: 28 June 2025 / Accepted: 18 July 2025 / Published: 21 July 2025
(This article belongs to the Special Issue Neuromorphic Devices: Materials, Structures and Bionic Applications)

Abstract

Threshold-switching memristors (TSMs) are emerging as key enablers for hardware spiking neural networks, offering intrinsic spiking dynamics, sub-pJ energy consumption, and nanoscale footprints ideal for brain-inspired computing at the edge. This review provides a comprehensive examination of how TSMs emulate diverse spiking behaviors—including oscillatory, leaky integrate-and-fire (LIF), Hodgkin–Huxley (H-H), and stochastic dynamics—and how these features enable compact, energy-efficient neuromorphic systems. We analyze the physical switching mechanisms of redox and Mott-type TSMs, discuss their voltage-dependent dynamics, and assess their suitability for spike generation. We review memristor-based neuron circuits regarding architectures, materials, and key performance metrics. At the system level, we summarize bio-inspired neuromorphic platforms integrating TSM neurons with visual, tactile, thermal, and olfactory sensors, achieving real-time edge computation with high accuracy and low power. Finally, we critically examine key challenges—such as stochastic switching origins, device variability, and endurance limits—and propose future directions toward reconfigurable, robust, and scalable memristive neuromorphic architectures.

1. Introduction

Edge-AI workloads, such as 4K/60 fps video (≈12 Gb/s) or sub-10 ms-latency drone navigation [1,2,3,4], expose fundamental von Neumann limits: memory-compute separation, data-locality loss, and steep energy scaling. Repeated crossings of the “memory wall” raise both latency and power to unsustainable levels [5,6,7,8,9]. With transistor scaling slowing, alternative paradigms are required. The human brain—operating complex cognition below 20 W—achieves low-energy throughput via massive parallelism, event-driven spikes, and in situ memory–compute fusion [10,11,12,13]. Neuromorphic engineering carries these biological insights to electronics. Yet the latest mixed-signal CMOS platforms—Intel Loihi 2, IBM TrueNorth—still dissipate 10–100 pJ spike⁻¹ and require millimeter-scale silicon per 10³ neurons, three orders of magnitude less dense than cortex [14,15,16]. Threshold-switching memristors (TSMs) overcome this gap by offering intrinsic sub-pJ spiking, <30 ns latency, and nanoscale footprints compatible with ≥10¹⁰ neurons cm⁻² [17,18,19,20,21].
Their two-terminal structure further facilitates seamless integration into neuromorphic systems. Volatile TSMs underpin both (i) the device-level emulation of diverse neuronal firing behaviors—regular, bursting, adaptive, and stochastic—and (ii) system-level sensory front-ends that transduce light, pressure, or gas concentration directly into spike trains [22,23,24]. In spiking neural network (SNN) architectures in particular, memristors can physically emulate key neuronal behaviors such as signal integration, threshold-triggered firing, and reset dynamics [25,26,27]. Their utility extends across both single-neuron behavior modeling and system-level sensory computing. Memristor-based neuron circuits can replicate diverse temporal firing behaviors observed in biological neurons—including regular spiking, bursting, and frequency adaptation—laying a foundation for energy-efficient, event-driven, and highly parallel neuromorphic systems. Compared to conventional artificial neural networks (ANNs), which rely on continuous signal transmission and extensive matrix operations, SNNs communicate through discrete spikes, enabling sparse, asynchronous processing that enhances robustness and adaptability in real-world sensory tasks while reducing energy consumption [28,29,30]. Moreover, the introduction of memristive elements into neuromorphic sensory frameworks has driven substantial progress in domains such as sensory processing, adaptive learning, and cognition. For instance, coupling memristors with multimodal sensors enables the emulation of biological sensory modalities such as vision, touch, and olfaction, expanding their role in neuromorphic perceptual systems. Such hybrid architectures can transduce external stimuli into temporal spike streams and execute near-sensor preprocessing, mimicking the behavior of sensory neurons such as retinal or somatosensory cells. The capacity of memristors to perform direct spike encoding from sensory inputs without extensive digital preprocessing renders them ideal for edge-AI deployments such as robotics, wearables, and bio-implantable interfaces.
Despite notable progress in various aspects, memristor-based neuromorphic systems still face several challenges for large-scale deployment, such as significant device variability, limited endurance, and stochastic switching behaviors, all of which may compromise the computational accuracy and scalability. Furthermore, efficient learning algorithms that account for the non-ideal characteristics of memristors are still under active development. Ongoing efforts in approximate computing, CMOS–memristor hybrid design, and the co-optimization of algorithms and hardware aim to overcome these obstacles and accelerate the transition toward practical applications.
In this review, we present a comprehensive assessment of memristors in enabling spiking neuromorphic systems targeting brain-like perception and computation. The discussion begins with the biological basis of neural spiking and common computational models, followed by the underlying switching physics of memristors and their utility in mimicking neuronal dynamics. We then summarize key developments in spiking neuron circuit designs based on memristors, highlight applications in vision, touch, and multimodal bioinspired sensing, and conclude by identifying major challenges and future perspectives.
While several prior reviews have examined memristors for neuromorphic computing, most focus predominantly on synaptic functionalities or general device mechanisms without emphasizing spiking neuron implementations. Furthermore, relatively few works systematically analyze how TSMs uniquely enable diverse spiking neuron behaviors—such as leaky integration, oscillations, stochastic firing, and action potential-like dynamics—at the device and circuit levels. Moreover, the interplay between memristive neuron models and multi-modal sensory systems (e.g., vision, touch, and olfaction) remains underexplored.
This review seeks to bridge these gaps by providing a focused, structured, and critical evaluation of memristor-based spiking neuromorphic systems, encompassing device mechanisms, circuit implementations, and bio-inspired perception applications. In doing so, it offers a unified perspective on how TSM-based neurons can be harnessed for energy-efficient, near-sensor computing, thereby advancing the state-of-the-art in neuromorphic engineering.

2. Biological Basis and Computational Models of Spiking Neurons

Biological neurons encode and transmit information through discrete electrical spikes, known as action potentials, rather than the clock-synchronized binary levels of digital logic [31]. This spike-based signaling is inherently sparse, asynchronous, and event-driven, underpinning higher-order brain functions such as perception, memory, and motor control. Anatomically, a neuron typically possesses thousands of dendritic spines, a soma with a membrane capacitance of approximately 200 pF, and a myelinated axon capable of propagating action potentials at velocities of 1–120 m/s [32,33]. As illustrated in Figure 1a, dendrites receive inputs from presynaptic neurons and can exhibit nonlinear phenomena—such as NMDA-mediated local spikes and calcium-induced calcium release [34,35]. The soma integrates these inputs and determines whether the membrane potential exceeds the firing threshold (Figure 1b). When this threshold is crossed, an action potential is generated at the axon hillock and propagates along the axon to the synaptic terminals, triggering neurotransmitter release and downstream signaling. Voltage-gated ion channels (Na+ and K+ types) coordinate depolarization and repolarization, shaping the spike waveform and timing [36,37,38,39].
Biological neurons display diverse spiking patterns to fulfill various functional roles in the brain. This functional diversity underpins the brain’s capacity for temporal encoding, information compression, and stimulus discrimination. Replicating these firing patterns in hardware implementations is essential for achieving biologically faithful neuromorphic architectures. Thus, a thorough understanding of the electrophysiological mechanisms underlying neuronal excitability and information coding is crucial for advancing neuroscience and designing future spike-based hardware systems.
The Hodgkin–Huxley (H-H) model provides a quantitative framework for describing the biophysical mechanisms underlying action potential generation in neurons [40]. It models the neuron as an equivalent electrical circuit composed of capacitive, resistive, and electromotive components, corresponding, respectively, to the lipid bilayer membrane, ion channels, and ionic reversal potentials. The membrane potential dynamics are governed by a set of nonlinear differential equations that capture the voltage- and time-dependent behavior of sodium (Na+) and potassium (K+) channels, along with a passive leak conductance. The primary equation is given by:
$$C_m \frac{dV_m}{dt} = -\left[ g_{\mathrm{Na}} m^3 h \left(V_m - E_{\mathrm{Na}}\right) + g_{\mathrm{K}} n^4 \left(V_m - E_{\mathrm{K}}\right) + g_{\mathrm{L}} \left(V_m - E_{\mathrm{L}}\right) \right] + I_{\mathrm{ext}}$$
where Cm is the membrane capacitance, Vm is the membrane potential, and gNa, gK, and gL denote the maximal conductances of Na+, K+, and leak channels, respectively. The variables m, h, and n represent gating variables that determine the probability of channel opening. This formalism enables the model to reproduce key neuronal phenomena including threshold firing, spike initiation, refractory periods, and subthreshold oscillations.
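For readers who want to connect the equations to working numbers, the following is a minimal numerical sketch of the H-H model using forward-Euler integration and the standard squid-axon parameter set; the step size, duration, and constant input current are illustrative choices, not values drawn from the works cited here.

```python
import numpy as np

# Standard squid-axon parameters; units: uF/cm^2, mS/cm^2, mV, uA/cm^2, ms
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

# Voltage-dependent opening/closing rates of the gating variables m, h, n
def alpha_m(V): return 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
def beta_m(V):  return 4.0 * np.exp(-(V + 65.0) / 18.0)
def alpha_h(V): return 0.07 * np.exp(-(V + 65.0) / 20.0)
def beta_h(V):  return 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
def alpha_n(V): return 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
def beta_n(V):  return 0.125 * np.exp(-(V + 65.0) / 80.0)

def simulate_hh(I_ext=10.0, T=50.0, dt=0.01):
    """Forward-Euler integration of the H-H membrane equation under constant input current."""
    steps = int(T / dt)
    V, m, h, n = -65.0, 0.05, 0.6, 0.32          # approximate resting state
    trace = np.empty(steps)
    for i in range(steps):
        # Ionic currents driven by the difference to each reversal potential
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K  = g_K * n**4 * (V - E_K)
        I_L  = g_L * (V - E_L)
        # Membrane equation: C_m dV/dt = -(I_Na + I_K + I_L) + I_ext
        dV = (-(I_Na + I_K + I_L) + I_ext) / C_m
        # Gating kinetics: dx/dt = alpha_x(V)(1 - x) - beta_x(V) x
        dm = alpha_m(V) * (1 - m) - beta_m(V) * m
        dh = alpha_h(V) * (1 - h) - beta_h(V) * h
        dn = alpha_n(V) * (1 - n) - beta_n(V) * n
        V, m, h, n = V + dt * dV, m + dt * dm, h + dt * dh, n + dt * dn
        trace[i] = V
    return trace

membrane_trace = simulate_hh()   # repetitive spiking for sufficiently large I_ext
```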
While the H-H model provides a highly detailed and biologically realistic simulation of neuronal behavior, its computational complexity is significant. Each time-step simulation typically requires around 1200 floating-point operations, making it computationally expensive and limiting its scalability for large networks or hardware implementations. This high computational cost presents challenges for real-time applications, particularly in neuromorphic systems where low power and efficiency are crucial. To reduce the computational cost while retaining biologically relevant timing, simplified neuron models have been proposed. Notably, the LIF model has become the predominant choice in theoretical neuroscience and neuromorphic implementations [41,42,43]. In the LIF model, the membrane potential integrates the input current over time and triggers a spike once a threshold is crossed, followed by a reset mechanism. While not as biologically detailed as the H-H model, the LIF model strikes a pragmatic balance between biological relevance and computational efficiency, making it particularly advantageous for designing large-scale SNNs [44]. Typical LIF implementations run in digital logic at <10 FLOPs/spike or in analog CMOS at <100 pJ/spike [45,46].
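By contrast, the LIF abstraction reduces the neuron to a single state variable. The sketch below uses illustrative parameters (membrane time constant, resistance, threshold, and reset values are assumptions for demonstration) and shows the integrate, leak, fire, and reset steps that the TSM circuits discussed in Section 4 reproduce physically.

```python
import numpy as np

def simulate_lif(I_ext, dt=0.1, tau_m=10.0, R_m=10.0,
                 V_rest=-65.0, V_th=-50.0, V_reset=-65.0):
    """Leaky integrate-and-fire: tau_m * dV/dt = -(V - V_rest) + R_m * I(t).
    Units are illustrative (mV, ms, MOhm, nA)."""
    V, trace, spike_times = V_rest, [], []
    for t, I in enumerate(I_ext):
        V += dt / tau_m * (-(V - V_rest) + R_m * I)   # leaky integration of the input
        if V >= V_th:                                 # threshold crossing -> fire
            spike_times.append(t * dt)
            V = V_reset                               # reset after the spike
        trace.append(V)
    return np.array(trace), spike_times

# Constant 2 nA drive for 200 ms: the firing rate grows with input strength
# and falls as the leak (membrane time constant) increases.
trace, spikes = simulate_lif(np.full(2000, 2.0))
```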
Beyond the LIF model, several other neuron models—such as the Izhikevich, FitzHugh–Nagumo (FHN), and Hindmarsh–Rose models—have been introduced to capture more complex neuronal dynamics. The two-variable Izhikevich model reproduces about twenty neuronal firing patterns at roughly 50× lower computational cost than the H-H model, while the FHN model captures class-I/II excitability bifurcations with analytical tractability [47]. For example, the Izhikevich model employs a two-dimensional nonlinear differential equation framework, enabling it to reproduce diverse neuronal firing behaviors—tonic firing, phasic bursting, and rebound spiking—with minimal parameter tuning. The FHN and Hindmarsh–Rose models abstract neuronal excitability into low-dimensional dynamical systems, facilitating the exploration of dynamic phenomena such as bifurcation and chaos [47,48,49].
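As a brief illustration of this efficiency gap, the two-variable Izhikevich model can be integrated in a few lines; the sketch below uses the published regular-spiking parameter set (a, b, c, d), with the step size and input current chosen for demonstration only.

```python
import numpy as np

def simulate_izhikevich(I_ext, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Two-variable Izhikevich model with the regular-spiking parameter set (a, b, c, d);
    other published parameter sets yield bursting, chattering, fast spiking, etc."""
    v, u = -65.0, b * -65.0
    trace = np.empty(len(I_ext))
    for i, I in enumerate(I_ext):
        v += dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)   # membrane potential
        u += dt * a * (b * v - u)                           # recovery variable
        if v >= 30.0:                                       # spike cutoff and after-spike reset
            v, u = c, u + d
        trace[i] = v
    return trace

trace = simulate_izhikevich(np.full(2000, 10.0))  # tonic spiking under constant drive
```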
In conclusion, spiking neuron modeling serves as a critical bridge between biological neuroscience and AI system design. Ranging from the intricate ion–channel kinetics of the H-H model to the computationally efficient abstractions of the LIF and Izhikevich models, the field is evolving toward neuron models that balance biological fidelity with hardware implementability [50]. This foundation underpins both the understanding of cerebral function and the ongoing innovation in neuromorphic computing technologies.
Selecting appropriate spiking neuron models hinges on balancing biological realism and computational efficiency. Therefore, in neuromorphic engineering—where energy efficiency and scalability are paramount—identifying suitable hardware-level implementations of neuron models can significantly simplify computation. Memristors, with their inherent hardware compatibility and intrinsic nonlinearities that emulate ion–channel dynamics, emerge as promising candidates for implementing diverse spiking behaviors in low-power, compact hardware systems. When implemented with memristors, these simplified models benefit from the inherent characteristics of memristive devices, such as threshold-switching and nonlinear behavior, to replicate the action potential dynamics seen in biological neurons. The trade-offs between these models in terms of biological fidelity versus hardware implementation are critical for choosing the right model for a given application. For example, while the H-H model offers high fidelity, it is challenging to implement efficiently in low-power neuromorphic systems due to its high computational demand.

3. Memristor Fundamentals for Neuromorphic Applications

In 1962, negative differential resistance (NDR) was first observed in metal–oxide sandwich structures based on Al2O3, ZrO2, TaOX, and TiO2, laying the conceptual foundation for subsequent memristor research [51]. The memristor concept itself was formally introduced by Chua in 1971, who defined it as the fourth fundamental passive circuit element, complementing the resistor, capacitor, and inductor [52]. Owing to their intrinsic nonlinear switching dynamics, memristors have since drawn extensive interest within the neuromorphic engineering community as promising building blocks for bio-inspired neural systems. A typical memristor comprises a metal–insulator–metal (MIM) sandwich structure in which the resistance of the dielectric layer is modulated by the applied voltage or current. A variety of materials are used for the dielectric layer, including transition metal oxides (e.g., TiO2, HfO2, and TaOX), chalcogenides, and perovskite compounds. Depending on whether the conductance state is retained, memristors are categorized as nonvolatile memristors or volatile TSMs [53,54,55,56,57,58,59]. Nonvolatile memristors act as programmable synaptic weights, retaining conductance states without power [42,60,61,62,63], while volatile TSMs are essential components in neuromorphic systems, especially for emulating the spiking behavior of biological neurons.
TSMs used in neuron circuit design are mainly classified into two types based on their switching mechanisms: electrochemical redox and Mott memristors [60,61,64]. As shown in Figure 2a, redox memristors typically incorporate active electrodes (e.g., Ag or Cu) and oxide-based dielectrics (e.g., HfO2 or TaOX) and operate via electrochemical reactions and metal ion migration. When an external electric field is applied, metal atoms at the active electrode are oxidized and migrate across the dielectric, forming a metallic filament that connects both electrodes—a SET operation that switches the device from the high-resistance state (HRS) to the low-resistance state (LRS) [65], as depicted in Figure 2b. Once the bias is removed or sufficiently lowered, the conductive filament ruptures spontaneously to minimize the interfacial energy, thereby restoring the device to HRS and completing a reversible switching cycle. Experimental studies show that these devices typically exhibit switching times in the order of tens of nanoseconds; the HRS current can also drop as low as 1 pA [65,66,67,68]. A compliance current (CC) is typically applied during the testing of redox-type memristors to prevent permanent thermal damage to the conductive filament, which would otherwise hinder spontaneous rupture. These devices benefit from structural simplicity, mature process compatibility, low standby power, and scalability, but their switching behavior can be highly variable due to stochastic filament growth, leading to cycle-to-cycle resistance scatter, which presents challenges for reliable neuromorphic hardware implementations [69,70,71,72,73,74].
Mott memristors, depicted in Figure 2c, are based on correlated-electron materials such as VO2 and NbOX and exhibit reversible insulator-to-metal transitions (IMTs) driven by temperature or electric field changes [65,75]. As shown in Figure 2d, the I–V curve of a Mott-type TSM demonstrates the characteristic transition from the insulating (HRS) to the metallic (LRS) state when the applied voltage exceeds a threshold [63]. Mott-based TSMs exhibit fast, stable switching with a characteristic switching time of approximately a few nanoseconds and are capable of replicating the rapid spiking behaviors observed in biological neurons. However, the high instantaneous current required for switching, typically in the mA range, leads to higher power consumption. This makes Mott TSMs suitable for high-speed neural circuits but presents challenges for low-power neuromorphic systems where energy efficiency is critical [76].
Researchers have implemented a variety of TSM-driven neuron circuits to emulate diverse spiking patterns, including oscillatory [72], LIF, stochastic [42,62,66,77,78], and frequency-adaptive firing behaviors [63]. The most common is the LIF type, where a TSM is connected in parallel with a capacitor to accumulate charge and trigger spiking upon reaching a threshold, followed by a rapid reset. For instance, LIF neurons with tunable firing frequencies and refractory periods have been realized using materials such as SiO2, TaOX, and NbOX. Oscillatory neurons utilize the NDR region of TSMs coupled with passive RC networks to generate periodic spiking under constant bias, suitable for simulating rhythmic neuronal activities. Moreover, researchers have exploited intrinsic thermal and conductance fluctuations in devices to build stochastic neurons that emulate probabilistic firing for applications such as Boltzmann machines. The Hodgkin–Huxley (H-H) neuron model, representing detailed ion channel dynamics, can also be implemented using arrays of TSMs to mimic Na+/K+ transport phenomena. Pickett et al. utilized two NbOX TSMs to replicate the dynamic currents of Na+ and K+, producing multi-phase spiking and membrane potential oscillations. Subsequent studies introduced programmable memristors like ECRAM combined with TSMs to build adaptive neuron circuits with tunable spiking dynamics, extending the functional emulation of biological neurons. Overall, neuron circuits built upon TSMs enable biologically faithful, energy-efficient, and miniaturized hardware architectures, showing strong potential as core elements in future neuromorphic processors and brain-like perception platforms.

4. Memristor-Based Spike Neuron Implementation Scheme

A key goal of neuromorphic engineering is to design compact, energy-efficient electronic elements that accurately emulate the spiking dynamics of biological neurons. Among emerging devices, TSMs provide a distinctive approach to emulate spiking neuron functionalities. Leveraging their intrinsic nonlinear dynamics, spontaneous threshold switching, and time-dependent behavior, researchers have constructed numerous artificial spiking neurons. This section presents a taxonomy of key neuron circuit architectures, implementation strategies, and performance metrics.

4.1. Oscillatory Neurons

Oscillatory neurons emulate the rhythmic firing behavior observed in biological neurons by generating periodic spikes when driven by either constant voltage or current inputs. This neuron type is particularly suitable for simulating synchronized and rhythmic neuronal activity, commonly found in neuronal oscillations.
In a current-driven oscillatory neuron circuit, as illustrated in Figure 3a, a TSM is connected in parallel with a capacitor. When the memristor remains in an HRS, the injected current progressively charges the capacitor, gradually elevating the output voltage (VOUT). Upon reaching the threshold voltage (VTH), the memristor abruptly transitions to an LRS, causing the rapid discharge of the capacitor and a sharp voltage drop. As the voltage drops below the hold voltage (VHOLD), the memristor returns to HRS, initiating another charging–discharging cycle and thus producing repetitive spiking. The spike frequency linearly increases with the input current intensity, providing tunable oscillation dynamics analogous to biological neurons. In contrast, the voltage-driven oscillatory neuron circuit (Figure 3b) comprises a memristor, resistor, and capacitor. With a constant voltage applied, the capacitor gradually accumulates charge through the resistor. Once the voltage across the memristor surpasses its threshold, the memristor transitions from HRS to LRS, rapidly discharging the capacitor and causing a sharp voltage drop. After discharging, the voltage falls below VHOLD, resetting the memristor to HRS, and the charging cycle repeats. This mechanism also generates rhythmic spiking, with a frequency proportional to the applied voltage, enabling precise control of the spiking patterns.
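The charge–fire–reset cycle described above can be reproduced with a simple behavioral model that treats the TSM as a two-state resistor with threshold and hold voltages. The sketch below implements the current-driven variant; all component values (input current, capacitance, on/off resistances, V_TH, and V_HOLD) are illustrative assumptions, not parameters of any specific device in the cited works.

```python
import numpy as np

def tsm_relaxation_oscillator(I_in=20e-6, C=1e-10, R_off=1e8, R_on=1e3,
                              V_th=1.0, V_hold=0.3, dt=1e-8, T=1e-4):
    """Current-driven TSM oscillator: a capacitor in parallel with a volatile threshold
    switch. In the HRS the capacitor charges; above V_th the device snaps to its LRS and
    discharges the capacitor; below V_hold it relaxes back to the HRS and the cycle repeats."""
    steps = int(T / dt)
    V, state = 0.0, "HRS"
    trace = np.empty(steps)
    for i in range(steps):
        R = R_off if state == "HRS" else R_on
        # KCL at the output node: C dV/dt = I_in - V / R(state)
        V += dt * (I_in - V / R) / C
        if state == "HRS" and V >= V_th:
            state = "LRS"        # threshold switching opens a low-resistance discharge path
        elif state == "LRS" and V <= V_hold:
            state = "HRS"        # volatile device turns off again below the hold voltage
        trace[i] = V
    return trace

v_out = tsm_relaxation_oscillator()   # sawtooth-like spiking; the rate rises with I_in
```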
These memristor-based oscillatory neurons have been extensively applied in neuromorphic computing tasks. Gao et al. demonstrated a self-oscillating neuron based on Pt/NbOX/Pt devices with a tunable frequency [80], while Shi et al. further simplified the circuit by utilizing intrinsic parasitic capacitances, significantly reducing the complexity and footprint of oscillatory neuron circuits [67]. Oscillatory neuron models based on TSMs offer intrinsic spiking behavior driven by volatile resistance changes, eliminating the need for external timing circuits. This makes them highly attractive for compact, energy-efficient neuromorphic systems. However, several limitations remain. First, the high-temperature operation required for Mott-transition devices such as VO2 raises concerns for system-level thermal management and CMOS compatibility. Second, oscillatory neurons often exhibit poor controllability of the spiking phase, making them less suitable for tasks requiring precise spike timing, such as temporal coding or coincidence detection. Additionally, frequency encoding is inherently analog and can suffer from noise sensitivity, limiting robustness under fluctuating inputs. Some designs also rely on external passive components (e.g., capacitors or resistors) to tune the oscillation behavior, which increases the area and integration complexity.

4.2. Leaky Integrate-and-Fire (LIF) Neurons

Beyond periodic spiking, the neural firing mechanism follows a principle of temporal integration: if multiple synaptic inputs collectively exceed the threshold within a certain time window, a spike is fired; otherwise, the integrated input gradually attenuates. This behavior is typically modeled by the LIF neuron, which captures both integration and leakage processes. The “leaky” membrane potential of neurons is analogous to the conductivity decay in volatile memristors, which reflects a key dynamic aspect of memory decay. The volatility of memristors allows artificial neurons to autonomously return to the resting potential after firing, or revert to a quiescent state if the input is insufficient to trigger firing. As shown in Figure 3c, the TSM-based LIF neuron circuit consists of a TSM in series with a resistor R0 and in parallel with a capacitor C. Input pulses are applied through resistor RS, and the voltage across R0 is monitored as the spike output. When a series of voltage pulses is applied, the capacitor charges, causing a gradual increase in its voltage—this process is termed “integration”. When the TSM voltage exceeds VTH, it transitions from a high-resistance to a low-resistance state. Since the off-state resistance of the TSM is much larger than R0, the voltage drop across R0 remains near zero during charging. Upon threshold activation, the TSM rapidly drops to low resistance, causing a voltage spike across R0 until the device reverts to the off-state as the voltage falls. Since the discharge loop has a lower resistance, a sharp voltage spike is observed across R0, denoting the “firing” event (Figure 3d). The spiking rate grows with stronger inputs and diminishes with increasing RC values in the LIF circuit.
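A behavioral sketch of this LIF circuit is given below, again modeling the TSM as a two-state resistor; the pulse amplitude, timing, and all element values are illustrative assumptions chosen so that several input pulses must be integrated before a spike is emitted.

```python
import numpy as np

def tsm_lif_neuron(V_in, dt=1e-8, C=1e-9, R_s=1e4, R0=1e2,
                   R_off=1e8, R_on=1e3, V_th=0.8, V_hold=0.2):
    """Behavioral model of the TSM-based LIF circuit: input pulses charge C through R_s
    ("integration"); once the TSM voltage exceeds V_th the device switches to its LRS and
    C discharges through the TSM + R0 branch, producing a spike across R0 ("firing")."""
    V_c, state = 0.0, "HRS"
    v_out = np.empty(len(V_in))
    for i, v in enumerate(V_in):
        R_tsm = R_off if state == "HRS" else R_on
        I_branch = V_c / (R_tsm + R0)              # current through the TSM + R0 branch
        # KCL at the capacitor node: charging through R_s minus the branch current
        V_c += dt * ((v - V_c) / R_s - I_branch) / C
        V_tsm = V_c * R_tsm / (R_tsm + R0)         # voltage divider across the TSM
        if state == "HRS" and V_tsm >= V_th:
            state = "LRS"                          # threshold switching
        elif state == "LRS" and V_tsm <= V_hold:
            state = "HRS"                          # volatile relaxation back to HRS
        v_out[i] = I_branch * R0                   # spike output read across R0
    return v_out

# Train of 2 V pulses (1 us on, 1 us off): spikes across R0 appear only after several
# pulses have been integrated; weaker or sparser inputs leak away without firing.
pulse_train = np.tile(np.r_[np.full(100, 2.0), np.zeros(100)], 50)
output = tsm_lif_neuron(pulse_train)
```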
Zhang et al. developed a LIF neuron based on Ag/SiO2/Au TSMs, demonstrating adjustable firing rates and refractory periods [66]. Building upon these foundational neuron circuits, subsequent research expanded their application into more comprehensive neuromorphic architectures. For instance, Wang et al. further proposed a fully memristive artificial neural network integrating threshold-switching neurons with non-volatile memristor synapses, successfully demonstrating unsupervised learning and pattern recognition capabilities [42]. Such neuron–synapse integrated architectures underscore the feasibility of memristor-based neurons in practical neuromorphic computing tasks, highlighting their potential for scalable brain-inspired computational systems. Yuan et al. employed VO2-based memristors to construct asynchronous spike encoders that transform multichannel biosignals, such as EEG and ECG, into sparse spiking sequences, thereby reducing the data size and power demand [81]. Furthermore, memristor-based LIF and adaptive LIF (ALIF) neurons implemented using VO2 devices were incorporated into a long short-term memory spiking neural network (LSNN) to enable efficient spike-coded signal processing. The system achieved remarkable classification accuracy—95.83% for arrhythmia and 99.79% for epilepsy detection—validating the capability of VO2 memristors in enabling scalable and efficient neuromorphic platforms for biomedical signal analysis. Compared to CMOS LIF neurons, TSM-based implementations offer significantly reduced energy per spike, as well as area advantages due to simpler circuitry. However, current designs often suffer from limited linearity and poor control of leakage currents, which can reduce spike-timing precision. Further studies should focus on stabilizing integration behavior through selector-assisted topologies or hybrid analog-digital feedback loops.

4.3. Hodgkin–Huxley (H-H) Neurons

H-H neurons represent one of the most biologically realistic models in computational neuroscience, capable of reproducing a wide range of electrophysiological behaviors such as action potentials, thresholding, and refractory dynamics. Implementing H-H-type neurons with memristive elements requires fine-grained control over ionic conductance, which can be achieved through TSMs that emulate the gating dynamics of voltage-dependent ion channels. A pioneering example was demonstrated by Pickett et al., who utilized Mott-type memristors to mimic the dynamic behaviors of Na+ and K+ channels [64]. Their circuit incorporated two TSMs with opposing polarities, regulated by separate DC voltage sources, to replicate the bidirectional flow of ionic currents—analogous to the depolarization and repolarization processes in biological membranes (Figure 3e). The resulting waveform reproduced key features of biological action potentials, including depolarization, hyperpolarization, and refractory periods (Figure 3f). This work marked a significant step toward the physically accurate emulation of spiking dynamics in hardware.
However, while the implementation achieves high fidelity in replicating the H-H model, it also exhibits substantial complexity and overhead. The requirement for multiple discrete components—including separate voltage sources, resistors, and capacitors—limits its scalability and integrability in dense neuromorphic circuits. Additionally, the relatively high power consumption and limited dynamic range of Mott memristors present challenges for practical edge-AI applications that demand energy efficiency and robustness under variability. From a computational perspective, the detailed biophysics modeled by H-H neurons are often overkill for many real-world tasks, where simpler models such as LIF or adaptive spiking suffice with significantly less hardware complexity. Nevertheless, H-H-type implementations are valuable for benchmarks, biological validation, and exploratory modeling, especially when studying complex ion–channel interactions, pharmacological responses, or neuron-level pathologies.
Looking forward, future H-H-inspired implementations may benefit from compact, hybrid architectures that integrate memristive dynamics with programmable analog/digital modules. This approach could preserve key nonlinear behaviors while reducing the circuit size and improving the power efficiency. Additionally, material engineering in Mott systems—for example, by controlling phase transition thresholds or improving the thermal stability—could further enhance the reproducibility and endurance of these neurons in large-scale neuromorphic platforms.

4.4. Stochastic Neurons and Origins of Randomness in Memristors

The stochastic behavior of memristors is one of the intriguing characteristics that can play a significant role in neuromorphic computing systems, particularly in randomness-based applications such as true random number generation (TRNG). Stochastic switching in memristors primarily originates from atomic migration, thermal noise, and electron correlation effects, varying by the material type. In redox-based memristors, the movement of metal ions under an applied electric field is a key factor in generating stochasticity. This migration leads to the formation of conductive filaments in the dielectric material, and the random nature of this filament growth contributes to variability in the device’s resistance. As the SET and RESET operations occur, the resistance states can fluctuate due to the inherent randomness in atomic migration. In Mott memristors, stochastic switching originates from critical electron–electron interactions and phase transition dynamics, which are highly sensitive to local temperature and electric field gradients. These phenomena result in observable cycle-to-cycle and device-to-device variations in the switching delay, threshold voltage, and retention characteristics.
The inherent stochasticity in memristive devices, particularly in TSMs, offers a valuable mechanism to emulate the probabilistic behavior of biological neurons. Unlike deterministic neurons, which fire in response to precise input conditions, biological neurons often exhibit variable firing thresholds and spike timing due to thermal fluctuations and ion channel noise. Emulating such stochasticity is critical for realizing probabilistic computation, Bayesian inference, and uncertainty quantification in neuromorphic systems.
This stochasticity, while traditionally viewed as a source of unreliability, can be purposefully exploited. For instance, Miao et al. developed an electronic stochastic neuron using a CuS/GeSe-based TSM, capable of mimicking the random firing behavior observed in biological neurons and enabling Bayesian inference within a spiking neural network (SNN) [82]. The stochastic neurons markedly decreased the fatal misjudgment rate in tumor classification scenarios, a typical shortcoming in traditional ANN models. Compared with SNNs built from deterministic neurons, this stochastic neuron approach improved the uncertainty estimation accuracy, boosting the confidence evaluation by 81.2%. This study highlights the potential of integrating stochastic memristors into neuromorphic systems, paving the way for efficient hardware-based probabilistic computing. Mao et al. reported a stacked IGZO-based TSM neuron with a sigmoid firing probability, mimicking the probabilistic response characteristics of natural neurons [83]. This stacked TSM architecture exhibited reduced relative switching variability (6.8%) compared to single-device implementations (59%), which is critical for deep Boltzmann machines, where weight noise severely degrades the log-likelihood. Moreover, the IGZO stochastic neuron was applied in probabilistic unsupervised learning for handwritten digit reconstruction using a restricted Boltzmann machine, achieving 91.2% recognition accuracy. This IGZO-based stochastic neuron, with its repeatable probabilistic firing, emulates brain-like probabilistic computation, offering significant implications for hardware SNNs in sensory processing, motor control, and reasoning.
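A generic behavioral sketch of such a probabilistic neuron is shown below (it is not the circuit of the cited works): the deterministic threshold crossing of a LIF integrator is replaced by a Bernoulli draw whose probability follows a sigmoid of the membrane potential, with the sigmoid width standing in for device-level threshold dispersion. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def firing_probability(V, V_th=-50.0, sigma=2.0):
    """Sigmoid firing probability; sigma stands in for device threshold dispersion."""
    return 1.0 / (1.0 + np.exp(-(V - V_th) / sigma))

def stochastic_lif(I_ext, dt=0.1, tau_m=10.0, R_m=10.0, V_rest=-65.0, V_reset=-65.0):
    """LIF integration with a Bernoulli firing decision at every time step."""
    V, spikes = V_rest, []
    for t, I in enumerate(I_ext):
        V += dt / tau_m * (-(V - V_rest) + R_m * I)   # leaky integration
        if rng.random() < firing_probability(V):      # probabilistic threshold crossing
            spikes.append(t * dt)
            V = V_reset
    return spikes

# Identical constant inputs yield trial-to-trial variable spike trains; the mean rate
# rises as the membrane potential pushes deeper into the sigmoid.
trials = [stochastic_lif(np.full(2000, 1.8)) for _ in range(5)]
```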
Beyond probabilistic neural modeling, the stochastic behavior of TSMs is also suitable for hardware-based TRNGs. Phase-change neurons, as demonstrated by Tuma et al., leverage intrinsic thermal noise and threshold dispersion for robust random spike generation, offering potential for secure neuromorphic encryption and stochastic optimization [78]. These developments highlight the duality of memristor stochasticity—as both a physical constraint and a computational asset.
These studies demonstrate the promise of stochastic memristive neurons in probabilistic inference and unsupervised learning tasks. Nonetheless, these models face notable limitations. The reliance on physical noise sources introduces significant variability not only across devices, but also across time in the same device, making consistent spike-rate modulation and task-specific calibration challenging. Moreover, stochasticity in spike generation, while useful for probabilistic inference, can undermine the performance in applications requiring deterministic or time-locked responses, such as sequence learning or temporal pattern recognition. Additionally, the mapping from input stimuli to desired firing probabilities often lacks linearity or interpretability, complicating training and co-optimization with learning algorithms.
Compared to LIF and oscillatory neurons, stochastic memristive neurons provide a valuable mechanism for modeling neural noise and probabilistic computation, but their application scope is more suited to ensemble or redundancy-based networks rather than precision-centric tasks. Future work may benefit from integrating tunable stochasticity (e.g., a bias-controlled noise injection) or leveraging noise-aware learning frameworks such as contrastive divergence or spike-based variational inference to enhance the performance and system stability.
Looking forward, engineering stochasticity at the device level through the material design, geometry optimization, and temperature control could enable tunable randomness, paving the way for neuromorphic hardware platforms with adaptive uncertainty handling. Moreover, hybrid architectures combining deterministic and stochastic neurons may allow for biologically inspired trade-offs between precision and flexibility in spiking neural networks. In summary, the stochastic properties of memristors—once considered detrimental—are increasingly being harnessed to emulate the noisy dynamics of biological neurons. This opens new avenues for energy-efficient probabilistic computing and robust neuromorphic perception systems, especially in real-world environments where uncertainty and variability are inherent.

4.5. Complex Spiking and Reconfigurable Neurons

Beyond the fundamental neuron types, complex firing patterns including burst firing, frequency adaptation, and neuronal inhibition are also critical for enabling energy-efficient neural computation. Yi et al. demonstrated over twenty spiking behaviors using VO2 memristors, such as bursting, frequency adaptation, and subthreshold oscillations, and characterized capacitor-dependent mechanisms and stochastic phase-locked firing, highlighting the dynamic richness and computational complexity of biological neurons [63]. Kim et al. proposed an artificial neuron architecture based on NbOX memristors, utilizing metal–insulator phase transitions and thermodynamic effects in flexible organic substrates to replicate 18 biological firing patterns, including frequency adaptation and subthreshold oscillations [84]. By leveraging thermal coupling, the system achieves neuron-to-neuron communication through heat flow alone, bypassing traditional interconnections and greatly reducing the energy consumption. The experimental results showed that this architecture achieved over 10⁶-fold energy savings compared to conventional digital processors in graph optimization tasks, demonstrating the great potential of thermodynamics in neuromorphic computing. Yu et al. proposed a memristor-based reconfigurable spiking neuron architecture capable of emulating multiple firing patterns at the hardware level, including fast spiking, adaptive spiking, phase spiking, and bursting [85]. The neuron design integrates NbOX-based spike units with electrochemical memristors (ECRAMs), enabling the programmable switching of firing behaviors by adjusting the ECRAM resistance, thus achieving function-level reconfigurability without altering the circuit structure. Such a resistance-coded polymorphism paves the way for compiler-directed neuron “microcodes”, enabling run-time reconfiguration analogous to micro-op fusion in CPUs.
While a range of memristor-based neuron implementations—from redox and Mott devices to TaOX, NbOX, and VO2 technologies—have been demonstrated, their power consumption, spiking frequency, endurance, and switching-energy characteristics vary widely. Table 1 summarizes these key experimental metrics—the power consumption, switching energy, operating frequency range, spiking rate, and endurance—which are critical for assessing the feasibility of large-scale, energy-efficient neuromorphic systems.
In summary, TSMs provide a solid hardware foundation for building compact, low-power, and biologically plausible spiking neurons due to their pronounced nonlinear switching properties. By integrating TSMs with passive elements such as capacitors and resistors, various spiking behaviors—including oscillatory, LIF, stochastic, and H-H types—have been successfully replicated, with support for reconfigurable neuron implementations. These efforts not only greatly reduce the circuit complexity, but also demonstrate excellent energy efficiency and integration potential, advancing neuromorphic computing from basic units to multifunctional and scalable systems. Memristors offer unparalleled advantages in hardware-level spike encoding, probabilistic reasoning, and multimodal neural dynamics, laying the groundwork for next-generation intelligent and adaptive brain-like systems.

5. Memristor-Based Spiking Neuromorphic Perception

Humans rely on photoreceptors, thermoreceptors, mechanoreceptors, and olfactory receptors to receive physical signals from the external environment, which are encoded into spikes and transmitted to neural networks for perception and learning [88,89]. Such systems exhibit exceptional capabilities in extracting and encoding environmental cues, allowing energy-efficient, adaptive, and resilient cognitive functions [90,91]. Inspired by this, memristor-based neuromorphic perception systems integrate spiking neurons with sensory components to achieve direct hardware-level encoding and preprocessing, building compact, low-power platforms with real-time and edge computing capabilities [92,93]. In this section, we focus on recent developments and bio-inspired applications of spiking perception systems based on memristive neurons.

5.1. Spiking Visual Perception Systems

With the rapid development of intelligent robotics, autonomous driving, and wearable electronics, the demand for artificial vision systems with high efficiency, low power consumption, and real-time responsiveness is growing rapidly [94,95,96,97]. Biological visual systems exhibit exceptional efficiency and adaptability, largely attributable to their massively parallel architectures, event-driven signaling mechanisms, and sparse neural encoding strategies. In particular, retinal networks perform robust front-end preprocessing by filtering out irrelevant visual noise and extracting salient features in real time. This capability substantially reduces the computational burden on downstream neural circuits, enhancing the overall energy efficiency and response speed, thus providing a compelling model for bio-inspired visual perception systems [86,98,99,100]. Consequently, the development of biologically analogous artificial vision systems, especially those that enable direct optical-to-spike conversion and local signal preprocessing in hardware, is of great significance for building cognitive brain-inspired intelligent systems and constitutes a critical direction for achieving next-generation high-efficiency intelligent perception.
In the biological visual system, retinal photoreceptors detect and integrate optical signals and transmit them to the brain’s visual cortex. Figure 4a illustrates an artificial visual sensory neuron developed by Wu et al., consisting of an InGaZnO4 (IGZO4) ultraviolet (UV) photoreceptor connected in series with a NbOX-based TSM, designed to emulate functionalities of the human visual system [86]. When UV light is vertically incident on the photoreceptor, stimuli of different wavelengths effectively modulate the resistance state of the IGZO4 photodetector. Consequently, the voltage across the series-connected NbOX spiking neuron also varies, leading to the generation of spike signals at different frequencies, as shown in Figure 4b. This artificial visual neuron thus exhibits light-modulated neuronal behavior, with its spiking rate dependent on the wavelength of the incident UV light. In the biological eye, the contraction and relaxation of ciliary muscles enable focusing on objects at varying distances. A TaOX memristor-based photoelectric spiking neuron system has recently been developed to replicate the depth perception function of biological vision [79]. By integrating a photoresistor with a spiking neuron circuit, the system encodes optical stimuli from different distances into distinct spike frequency sequences, thereby realizing depth sensing and encoding. A binocular vision system was further constructed using this neuron, where spike frequency differences from left and right inputs were processed by an artificial neural network to achieve accurate spatial localization with a recognition accuracy of 90%. This study demonstrates the hardware-level emulation of biological depth perception and provides a practical approach to implementing stereo vision in neuromorphic systems. Wang et al. vertically integrated TaOX memristors with IGZO photodetector layers to develop a spiking cone photoreceptor array (VISCP) (Figure 4c) featuring ultra-low power consumption (≤400 pW), a compact structure, and color-selective artificial visual sensing [73]. VISCP utilizes wavelength-dependent conductance modulation to encode various colors within the visible spectrum into spike frequencies, with spike frequencies spanning more than 1.5 orders of magnitude (Figure 4d). When integrated into a convolutional spiking neural network (SNN), the system demonstrated a color classification accuracy of 83.2% under variable lighting conditions. These results highlight the VISCP’s potential for both biomimetic vision and intelligent sensing tasks.
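The stimulus-to-rate encoding principle shared by these visual neurons can be illustrated with a behavioral model in which a stimulus-dependent series resistance (standing in for the photoreceptor or photoresistor) sets the charging rate of a TSM relaxation oscillator; all values below are illustrative assumptions rather than parameters of the reported devices.

```python
import numpy as np

def sensor_driven_spike_rate(R_sensor, V_dd=3.0, C=1e-10, R_on=1e3,
                             R_off=1e8, V_th=1.0, V_hold=0.3,
                             dt=1e-8, T=2e-4):
    """Voltage-driven TSM oscillator with a stimulus-dependent series resistor
    (e.g., a photoresistor): stronger illumination -> lower R_sensor -> faster
    charging -> higher spike frequency. Returns spikes per second."""
    V, state, n_spikes = 0.0, "HRS", 0
    for _ in range(int(T / dt)):
        R_tsm = R_off if state == "HRS" else R_on
        # Node equation: C dV/dt = (V_dd - V)/R_sensor - V/R_tsm
        V += dt * ((V_dd - V) / R_sensor - V / R_tsm) / C
        if state == "HRS" and V >= V_th:
            state, n_spikes = "LRS", n_spikes + 1   # fire once per threshold crossing
        elif state == "LRS" and V <= V_hold:
            state = "HRS"
    return n_spikes / T

# Sweeping the sensor resistance (a proxy for stimulus intensity) shifts the firing rate.
rates = {r: sensor_driven_spike_rate(r) for r in (2e4, 5e4, 1e5, 2e5)}
```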
Furthermore, Li et al. integrated synaptic phototransistors (BPR PT) with NbOX memristors to construct a spiking neuron array (Figure 4e) with dual-opponent receptive fields and directional selectivity, enabling the first device-level encoding of combined spatial-color information [101]. Figure 4f illustrates how shifting the load line by tuning VGS relative to the two switching thresholds of the TSM enables excitatory or inhibitory spike outputs under NIR or UV illumination. Emulating V1 cortical functionality, the system performs spike responses to color edges using color opponency and directional filters, enhancing the recognition of directional and chromatic variations, and improving SNN reliability in complex scenes under low illumination. Compared to conventional color-sensing approaches, this system significantly boosts front-end feature extraction and information compression, highlighting the promise of neuromorphic vision in edge-aware perception and scene understanding. The research marks a step forward for neuromorphic vision, transitioning from basic color recognition to a deeper comprehension of image structures.

5.2. Spiking Tactile Perception Systems

In biological systems, tactile neurons like cutaneous mechanoreceptors transduce external pressure into spike signals, which are relayed to the somatosensory cortex for further processing to discern object properties and avoid potentially harmful stimuli [102,103]. Zhang et al. developed an artificial spiking afferent nerve (ASAN) architecture using NbOX-based Mott memristors to mimic the sensory function of transforming analog stimuli into spike trains [104]. The system employs the negative differential resistance characteristic of NbOX memristors to construct a compact oscillator circuit capable of generating spike signals with stimulus-dependent frequencies. Under low-to-moderate stimulation, the spike frequency exhibits a near-linear dependence on the stimulus intensity; however, at high intensities, the system displays protective suppression with reduced spiking, mimicking neuronal self-protection. Additionally, the ASAN system was integrated with passive piezoelectric sensors to form a self-powered spiking mechanoreceptor system (Figure 5a) capable of converting mechanical pressure into spikes (Figure 5b,c), highlighting its promise for neurorobotics and brain-inspired sensing. Li et al. constructed an artificial mechanoreceptor system with skin-like characteristics, enabling sustained pressure sensing akin to slow-adapting biological receptors and the enhanced fusion of dual tactile inputs through memristive neuron integration [105]. Figure 5d illustrates the biological mechanism of tactile integration, alongside its artificial counterpart implemented through memristor-based mechanoreceptors. As shown in Figure 5e, when two pressure stimuli are applied simultaneously, the spike frequency increases significantly. The spike frequency after the spatial integration of the parallel sensors is higher than that induced by unilateral stimulation, indicating a notably shortened latency and faster tactile perception. By applying pulse-coupled neural networks (PCNN), this architecture effectively encodes spike frequencies, improves the recognition accuracy, and shows potential in the spatial integration of tactile inputs for intelligent neural interfaces using memristors.
Yang et al. introduced a neuromorphic tactile system based on coupled VO2-based memristive oscillators [106]. The schematic illustration of two capacitively coupled oscillatory neurons is shown in Figure 5f. The spiking outputs exhibit clear synchronization patterns, as illustrated in Figure 5g, highlighting their ability to mimic complex tactile processing. The relationship between the phase difference and resistance under various coupling capacitances, as demonstrated in Figure 5h, provides insights into achieving the precise control of neuron synchronization states. Furthermore, this neuromorphic tactile system was successfully implemented in gesture recognition applications, as shown in Figure 5i, achieving high accuracy and low power consumption, which exemplifies its practical relevance for advanced sensory-processing applications in robotics and wearable systems. In addition, a continuous-time dynamical system was implemented for sensory preprocessing, revealing superior metrics in the energy efficiency, area, and latency, which pushes the boundaries of compact nonlinear neuromorphic circuits.

5.3. Spiking Thermosensory and Olfactory Perception Systems

In the somatosensory system, thermoreceptors play a critical role in regulating metabolic processes and preventing damage from harmful external stimuli [107,108]. Thermoreceptors encode thermal stimuli into spike trains that are subsequently interpreted by the nervous system as heat sensation. The realization of artificial spiking thermoreceptors (ASTs) may enable new directions in low-power biomimetic thermal sensing. Shi et al. developed an AST based on an Ag/TaOX/AlOX/ITO memristor, leveraging the temperature-dependent diffusion kinetics of silver ions in the oxide to emulate biological thermal sensing and encoding [67]. Without requiring extra ADC hardware, the AST converts thermal inputs into spike trains with specific frequencies, consuming less than 240 nW. An artificial thermal perception system was constructed using the AST and a pulse-coupled neural network (PCNN), demonstrating effective thermal image edge detection. An ultra-stable NDR memristor using AlAs/In0.8Ga0.2As/AlAs quantum wells was introduced by Pei et al., offering improved reliability and compatibility for neuromorphic integration [109]. It maintains stable operation at elevated temperatures (up to 400 °C), supporting long-term reliability in extreme environments. Using this device, the researchers implemented an ultra-compact FitzHugh–Nagumo (FHN) neuron circuit without external capacitors, successfully replicating nine typical firing behaviors including phase spiking, spike-frequency adaptation, and subthreshold oscillations, significantly simplifying the circuit design and reducing hardware costs. Additionally, the memristor-based SNN showed a 91.74% classification accuracy in multimodal voltage–temperature perception, validating its potential for robust computation in harsh environments. This work establishes a reliable device platform and feasible approach toward highly energy-efficient and scalable neuromorphic chips. Wang and co-workers developed a neuromorphic platform that mimics real-time biological olfaction by integrating gas sensing, memory, and processing functions [110]. A volatile Pt/Ag/TaOX/Pt memristor serves as a LIF neuron that transforms gas-sensor inputs into spiking signals. The output spikes are further transmitted through non-volatile Pt/Ta/TaOX/Pt-based memristive synapses to downstream relay neurons for signal processing and pattern recognition. By leveraging frequency-sensitive synaptic plasticity, the system enables accurate gas recognition and categorization.

5.4. Spiking Multimodal Perception Systems

Building upon the previously discussed unimodal perception systems—covering vision, touch, thermal, and olfactory domains—recent research has emphasized the integration of multiple sensory modalities. In biological somatosensory systems, multimodal integration allows the comprehensive perception of object attributes, enabling precise decision making [88,111,112,113]. Inspired by this, Yuan et al. developed a highly efficient spiking neuromorphic hardware system capable of sensing and encoding various physical stimuli [114]. By integrating VO2 TSMs with multiple sensor modalities, the system effectively converts physical signals—such as illumination, temperature, pressure, and curvature—into spike trains. When these spiking outputs were fed into a three-layer spiking neural network, an accuracy of 90.33% was achieved in MNIST-based pressure image classification tasks. Additionally, the neuromorphic sensing modules demonstrated a capability in monitoring the finger curvature for gesture classification, underscoring their significant potential in advanced multimodal neuro-robotic applications. Zhu et al. proposed a heterogeneously integrated multimodal fusion spiking neuron (MFSN) array (Figure 6b), aimed at human-like multisensory perception and object classification [115]. The system converts various sensory inputs like pressure and temperature into spike frequency encoding and performs fusion via memristor-based neuron arrays, effectively mimicking multisensory integration in biological systems. The simulation results indicated that integrating multiple sensory modalities led to a higher recognition rate (93%) in distinguishing cup features compared to using individual modes, with the pressure-only mode achieving 67% and temperature-only mode achieving 72.5%, as shown in Figure 6c. These results demonstrate the feasibility of the MFSN system for advanced robotic intelligence, offering both a high recognition accuracy and energy efficiency.
Li et al. introduced a flexible biomimetic cross-modal spiking neuron (CSSN) based on VO2 memristors, capable of the real-time sensing, encoding, and processing of multimodal signals such as pressure and temperature at the hardware level (Figure 6d) [87]. As shown in Figure 6e, the CSSN system is fabricated on a flexible substrate with integrated components including VO2-based memristors, a control module, and a Wi-Fi unit, enabling autonomous sensing and wireless feedback. The VO2 memristors employed exhibit an outstanding performance, including >10¹² endurance cycles, 0.72% cycle-to-cycle variation, 3.73% device uniformity, a <30 ns response time, and bendability to a 1 mm radius. Figure 6f illustrates the multi-sensory response behavior of the CSSN. When subjected to combined pressure (7–18 kPa) and temperature stimuli (24–42 °C), the neuron circuit generates spike trains with distinct firing frequencies, demonstrating effective multimodal sensory encoding. A CSSN-based flexible processing system enables dynamic object recognition with 98.1% accuracy and real-time feedback, highlighting its great potential in wearable human–machine interfaces.
For motion sensing and control, Yang et al. proposed a spike-coding neural circuit driven by emission characteristics, integrating a LiDAR sensor with three H-H neurons based on NbOX memristors for real-time robotic obstacle avoidance [116]. The architecture mimics biologically inspired frequency-modulated distance coding, translating proximity cues into frequency-dependent spike sequences for feedforward decision making without central supervision. Architecturally, the use of memristors enabled compact neuron construction, and the frequency modulation strategy enhanced both reactivity and resilience to environmental changes. By coupling spike frequency patterns with environmental signal representation, this study pioneers a unified neuromorphic strategy for integrated perception and decision making in mobile robots, addressing challenges in energy-efficient autonomous navigation.
These systems embody the paradigm of “perception-as-computation”: encoding and processing occur directly at the signal source, reducing redundancy and latency, suitable for resource-constrained edge AI applications. Nevertheless, challenges remain in device uniformity, heterogeneous integration, and stability; future progress will rely on incorporating plasticity (e.g., STDP), vertical stacking, and 3D crossbar architectures for scalable integration and autonomous perception.

6. Conclusions and Outlook

This review has provided a comprehensive analysis of memristor-based spiking neuromorphic systems, highlighting their unique advantages in enabling brain-inspired perception and computation. By accurately replicating essential neuronal behaviors—such as temporal integration, threshold-triggered firing, spike-frequency adaptation, and stochasticity—memristors offer a compact, scalable, and energy-efficient alternative to conventional CMOS-based approaches. Their intrinsic nonlinear switching dynamics, analog conductance modulation, and compatibility with crossbar arrays further allow memristors to serve as both neurons and synapses within integrated SNN platforms.
We systematically summarized the underlying physical mechanisms of TSMs, critically examined representative spiking neuron implementations—including LIF, oscillatory, stochastic, and H-H models—and reviewed their applications across multimodal sensory systems such as vision, tactile sensing, thermosensation, and olfaction. These studies underscore the growing potential of memristor-based hardware in enabling near-sensor, bio-inspired computing at the edge.
Despite these advances, significant challenges persist, notably device-to-device variability, limited switching endurance, and timing dispersion in large-scale arrays. These non-idealities manifest as inconsistent thresholds, retention issues, and timing errors, which can compromise temporal precision in spiking networks. In crossbar architectures, sneak-path currents and read/write disturbances further degrade long-term stability and energy efficiency. Addressing these limitations demands a multi-level co-optimization strategy, including material innovation (e.g., doped oxides and 2D heterostructures), selector integration, circuit-level compensation, and training algorithms resilient to hardware imperfections. Standardized benchmarking protocols and statistical modeling of switching kinetics will also be critical for robust large-scale deployment. To move toward practical implementation, future work should explore novel material stacks and advanced fabrication methods that mitigate device variability and improve cycle stability for large-scale integration. Co-design approaches that align device physics, circuit design, and network algorithms will be essential for achieving real-time, adaptive performance in neuromorphic platforms.
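To illustrate why statistical modeling of switching kinetics matters, the Monte Carlo sketch below propagates an assumed 5% cycle-to-cycle threshold-voltage variation through an idealized RC-charging model and reports the resulting spike-timing jitter. The RC model, the Gaussian variation, and all numerical values are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)
tau, v_in = 10e-6, 2.0            # RC time constant and drive voltage (assumed)
v_th_mean, cv = 1.0, 0.05         # mean threshold and 5% cycle-to-cycle variation

# For RC charging, v(t) = v_in * (1 - exp(-t / tau)), so the firing delay is
# t_fire = -tau * ln(1 - v_th / v_in). Sample it under threshold variation.
v_th = rng.normal(v_th_mean, cv * v_th_mean, size=10_000)
v_th = np.clip(v_th, 0.01, 0.99 * v_in)   # keep the logarithm argument valid
t_fire = -tau * np.log(1.0 - v_th / v_in)

print(f"mean firing delay: {t_fire.mean() * 1e6:.2f} us")
print(f"relative spike-timing jitter: {t_fire.std() / t_fire.mean():.1%}")
```

Under these assumptions, a few percent of threshold variation already translates into several percent of timing jitter, which compounds across layers in temporally coded networks.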
As neuromorphic computing intersects with emerging domains such as intelligent robotics, brain–machine interfaces, and personalized healthcare, the demands for low latency, environmental robustness, and on-device learning will intensify. Memristor-based neuromorphic systems, with their compact form factor and biologically plausible dynamics, could serve as foundational hardware for next-generation cognitive electronics. While broad deployment remains an ongoing challenge, the progress summarized in this review provides both a roadmap and a technical foundation for future innovations in brain-inspired, energy-efficient computation.

Funding

This work was supported by the National Natural Science Foundation of China (Grant 62404127).

Conflicts of Interest

The authors declare that they have no competing interests.

References

  1. Le, T.D.; Gia Nguyen, T.; Tran, T. The 1-Millisecond Challenge—Tactile Internet: From Concept to Standardization. J. Telecommun. Digit. Econ. 2020, 8, 56–93. [Google Scholar] [CrossRef]
  2. Falanga, D.; Kim, S.; Scaramuzza, D. How Fast Is Too Fast? The Role of Perception Latency in High-Speed Sense and Avoid. IEEE Rob. Autom. 2019, 4, 1884–1891. [Google Scholar] [CrossRef]
  3. Gallego, G.; Delbrück, T.; Orchard, G.; Bartolozzi, C.; Taba, B.; Censi, A.; Leutenegger, S.; Davison, A.J.; Conradt, J.; Daniilidis, K.; et al. Event-Based Vision: A Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 154–180. [Google Scholar] [CrossRef]
  4. Vitale, A.; Renner, A.; Nauer, C.; Scaramuzza, D.; Sandamirskaya, Y. Event-driven Vision and Control for UAVs on a Neuromorphic Chip. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 103–109. [Google Scholar]
  5. Huang, X.; Liu, C.; Jiang, Y.G.; Zhou, P. In-memory computing to break the memory wall. Chin. Phys. B 2020, 29, 78504. [Google Scholar] [CrossRef]
  6. Lu, M.; Christensen, C.N.; Weber, J.M.; Konno, T.; Läubli, N.F.; Scherer, K.M.; Avezov, E.; Lio, P.; Lapkin, A.A.; Kaminski Schierle, G.S.; et al. ERnet: A tool for the semantic segmentation and quantitative analysis of endoplasmic reticulum topology. Nat. Methods 2023, 20, 569–579. [Google Scholar] [CrossRef]
  7. Feng, L. Application Analysis of Artificial Intelligence Algorithms in Image Processing. Math. Probl. 2022, 2022, 7382938. [Google Scholar] [CrossRef]
  8. Bubeck, S.; Chandrasekaran, V.; Eldan, R.; Gehrke, J.; Horvitz, E.; Kamar, E.; Lee, P.; Lee, Y.T.; Li, Y.; Lundberg, S.; et al. Sparks of Artificial General Intelligence: Early experiments with GPT-4. arXiv 2023, arXiv:2303.12712. [Google Scholar]
  9. Zhou, F.; Chai, Y. Near-sensor and in-sensor computing. Nat. Electron. 2020, 3, 664–671. [Google Scholar] [CrossRef]
  10. Bullmore, E.; Sporns, O. The economy of brain network organization. Nat. Rev. Neurosci. 2012, 13, 336–349. [Google Scholar] [CrossRef]
  11. Li, Z.; Wei, T.; Beining, Z.; Rui, Y.; Miao, X. Emerging memristive neurons for neuromorphic computing and sensing. Sci. Technol. Adv. Mater. 2023, 24, 2188878. [Google Scholar] [CrossRef]
  12. Boahen, E.K.; Kweon, H.; Oh, H.; Kim, J.H.; Lim, H.; Kim, D.H. Bio-Inspired Neuromorphic Sensory Systems from Intelligent Perception to Nervetronics. Adv. Sci. 2025, 12, 2409568. [Google Scholar] [CrossRef]
  13. Wang, W.S.; Chen, X.L.; Huang, Y.J.; Huang, X.; Zhu, L.Q. Bionic Visual-Auditory Perceptual System Based on Ionotronic Neuromorphic Transistor for Information Encryption and Decryption with Sound Recognition Functions. Adv. Electron. Mater. 2025, 11, 2400642. [Google Scholar] [CrossRef]
  14. Davies, M.; Srinivasa, N.; Lin, T.H.; Chinya, G.; Cao, Y.; Choday, S.H.; Dimou, G.; Joshi, P.; Imam, N.; Jain, S.; et al. Loihi: A Neuromorphic Manycore Processor with On-Chip Learning. IEEE Micro 2018, 38, 82–99. [Google Scholar] [CrossRef]
  15. Chicca, E.; Badoni, D.; Dante, V.; D’Andreagiovanni, M.; Salina, G.; Carota, L.; Fusi, S.; Del Giudice, P. A VLSI recurrent network of integrate-and-fire neurons connected by plastic synapses with long-term memory. IEEE Trans. Neural Netw. 2003, 14, 1297–1307. [Google Scholar] [CrossRef] [PubMed]
  16. Jung, R.; Brauer, E.J.; Abbas, J.J. Real-time interaction between a neuromorphic electronic circuit and the spinal cord. IEEE Trans. Neural Syst. Rehabil. Eng. 2001, 9, 319–326. [Google Scholar] [CrossRef] [PubMed]
  17. Zhang, Y.; Wang, X.; Friedman, E.G. Memristor-Based Circuit Design for Multilayer Neural Networks. IEEE Trans. Circuits Syst. I: Regul. Pap. 2017, 65, 677–686. [Google Scholar] [CrossRef]
  18. Peng, H.; Gan, L.; Guo, X. Memristor-based spiking neural networks: Cooperative development of neural network architecture/algorithms and memristors. Chip 2024, 3, 100093. [Google Scholar] [CrossRef]
  19. Kumar, S.; Wang, X.; Strachan, J.P.; Yang, Y.; Lu, W.D. Dynamical memristors for higher-complexity neuromorphic computing. Nat. Rev. Mater. 2022, 7, 575–591. [Google Scholar] [CrossRef]
  20. Lanza, M.; Pazos, S.; Aguirre, F.; Sebastian, A.; Le Gallo, M.; Alam, S.M.; Ikegawa, S.; Yang, J.J.; Vianello, E.; Chang, M.F.; et al. The growing memristor industry. Nature 2025, 640, 613–622. [Google Scholar] [CrossRef]
  21. Pazos, S.; Xu, X.; Guo, T.; Zhu, K.; Alshareef, H.N.; Lanza, M. Solution-processed memristors: Performance and reliability. Nat. Rev. Mater. 2024, 9, 358–373. [Google Scholar] [CrossRef]
  22. Sung, S.H.; Kim, T.J.; Shin, H.; Im, T.H.; Lee, K.J. Simultaneous emulation of synaptic and intrinsic plasticity using a memristive synapse. Nat. Commun. 2022, 13, 2811. [Google Scholar] [CrossRef] [PubMed]
  23. Wang, S.; Gao, S.; Tang, C.; Occhipinti, E.; Li, C.; Wang, S.; Wang, J.; Zhao, H.; Hu, G.; Nathan, A.; et al. Memristor-based adaptive neuromorphic perception in unstructured environments. Nat. Commun. 2024, 15, 4671. [Google Scholar] [CrossRef] [PubMed]
  24. Xia, Z.; Sun, X.; Wang, Z.; Meng, J.; Jin, B.; Wang, T. Low-Power Memristor for Neuromorphic Computing: From Materials to Applications. Nano-Micro Lett. 2025, 17, 217. [Google Scholar] [CrossRef] [PubMed]
  25. Zhang, Y.; Wang, Z.; Zhu, J.; Yang, Y.; Rao, M.; Song, W.; Zhuo, Y.; Zhang, X.; Cui, M.; Shen, L.; et al. Brain-inspired computing with memristors: Challenges in devices, circuits, and systems. Appl. Phys. Rev. 2020, 7, 011308. [Google Scholar] [CrossRef]
  26. Kim, Y.; Baek, J.H.; Im, I.H.; Lee, D.H.; Park, M.H.; Jang, H.W. Two-Terminal Neuromorphic Devices for Spiking Neural Networks: Neurons, Synapses, and Array Integration. ACS Nano 2024, 18, 34531–34571. [Google Scholar] [CrossRef]
  27. Yang, J.-Q.; Wang, R.; Ren, Y.; Mao, J.-Y.; Wang, Z.-P.; Zhou, Y.; Han, S.-T. Neuromorphic Engineering: From Biological to Spike-Based Hardware Nervous Systems. Adv. Mater. 2020, 32, 2003610. [Google Scholar] [CrossRef]
  28. Pfeiffer, M.; Pfeil, T. Deep Learning With Spiking Neurons: Opportunities and Challenges. Front. Neurosci. 2018, 12, 774. [Google Scholar] [CrossRef]
  29. Bouvier, M.; Valentian, A.; Mesquida, T.; Rummens, F.; Reyboz, M.; Vianello, E.; Beigne, E. Spiking Neural Networks Hardware Implementations and Challenges: A Survey. Emerg. Technol. Comput. Syst. 2019, 15, 22. [Google Scholar] [CrossRef]
  30. Goodall, E.F.; Heath, P.R.; Bandmann, O.; Kirby, J.; Shaw, P.J. Neuronal dark matter: The emerging role of microRNAs in neurodegeneration. Front. Cell. Neurosci. 2013, 7, 178. [Google Scholar] [CrossRef]
  31. Tang, J.; Yuan, F.; Shen, X.; Wang, Z.; Rao, M.; He, Y.; Sun, Y.; Li, X.; Zhang, W.; Li, Y.; et al. Bridging Biological and Artificial Neural Networks with Emerging Neuromorphic Devices: Fundamentals, Progress, and Challenges. Adv. Mater. 2019, 31, 1902761. [Google Scholar] [CrossRef]
  32. Gentet, L.J.; Stuart, G.J.; Clements, J.D. Direct measurement of specific membrane capacitance in neurons. Biophys. J. 2000, 79, 314–320. [Google Scholar] [CrossRef]
  33. Eyal, G.; Verhoog, M.B.; Testa-Silva, G.; Deitcher, Y.; Benavides-Piccione, R.; DeFelipe, J.; de Kock, C.P.J.; Mansvelder, H.D.; Segev, I. Human Cortical Pyramidal Neurons: From Spines to Spikes via Models. Front. Cell. Neurosci. 2018, 12, 181. [Google Scholar] [CrossRef]
  34. Schiller, J.; Major, G.; Koester, H.J.; Schiller, Y. NMDA spikes in basal dendrites of cortical pyramidal neurons. Nature 2000, 404, 285–289. [Google Scholar] [CrossRef]
  35. Major, G.; Polsky, A.; Denk, W.; Schiller, J.; Tank, D.W. Spatiotemporally graded NMDA spike/plateau potentials in basal dendrites of neocortical pyramidal neurons. J. Neurophysiol. 2008, 99, 2584–2601. [Google Scholar] [CrossRef]
  36. Rasband, M.N.; Shrager, P. Ion channel sequestration in central nervous system axons. J. Physiol. 2000, 525 Pt 1, 63–73. [Google Scholar] [CrossRef]
  37. Garrett, B. PDA Connections: Mobile Technology for Health Care; Lippincott Williams and Wilkins: Philadelphia, PA, USA, 2007; ISBN 0-7817-5999-4. [Google Scholar]
  38. Bear, M.; Connors, B.; Paradiso, M. Neuroscience: Exploring the Brain, 3rd ed.; Jones & Bartlett Learning: Burlington, MA, USA, 2007. [Google Scholar]
  39. Bean, B.P. The action potential in mammalian central neurons. Nat. Rev. Neurosci. 2007, 8, 451–465. [Google Scholar] [CrossRef]
  40. Häusser, M. The Hodgkin-Huxley theory of the action potential. Nat. Neurosci. 2000, 3, 1165. [Google Scholar] [CrossRef] [PubMed]
  41. Lim, H.; Kornijcuk, V.; Seok, J.Y.; Kim, S.K.; Kim, I.; Hwang, C.S.; Jeong, D.S. Reliability of neuronal information conveyed by unreliable neuristor-based leaky integrate-and-fire neurons: A model study. Sci. Rep. 2015, 5, 9776. [Google Scholar] [CrossRef] [PubMed]
  42. Wang, Z.; Joshi, S.; Savel’ev, S.; Song, W.; Midya, R.; Li, Y.; Rao, M.; Yan, P.; Asapu, S.; Zhuo, Y.; et al. Fully memristive neural networks for pattern classification with unsupervised learning. Nat. Electron. 2018, 1, 137–145. [Google Scholar] [CrossRef]
  43. Stein, R.B. A Theoretical Analysis of Neuronal Variability. Biophys. J. 1965, 5, 173–194. [Google Scholar] [CrossRef]
  44. Dayan, P.; Abbott, L.F. Theoretical neuroscience: Computational and mathematical modeling of neural systems. J. Cogn. Neurosci. 2001, 15, 154–155. [Google Scholar]
  45. Dutta, S.; Kumar, V.; Shukla, A.; Mohapatra, N.R.; Ganguly, U. Leaky Integrate and Fire Neuron by Charge-Discharge Dynamics in Floating-Body MOSFET. Sci. Rep. 2017, 7, 8257. [Google Scholar] [CrossRef] [PubMed]
  46. Joubert, A.; Belhadj, B.; Temam, O.; Héliot, R. Hardware spiking neurons design: Analog or digital? In Proceedings of the 2012 International Joint Conference on Neural Networks (IJCNN), Brisbane, Australia, 10–15 June 2012; pp. 1–5. [Google Scholar]
  47. Fitzhugh, R. Impulses and Physiological States in Theoretical Models of Nerve Membrane. Biophys. J. 1961, 1, 445–466. [Google Scholar] [CrossRef] [PubMed]
  48. Bednar, J. Encyclopedia of Computational Neuroscience; Springer: Berlin/Heidelberg, Germany, 2014. [Google Scholar]
  49. Rose, R.M.; Hindmarsh, J.L. The assembly of ionic currents in a thalamic neuron. I. The three-dimensional model. Proc. R. Soc. London. Ser. B Biol. Sci. 1989, 237, 267–288. [Google Scholar]
  50. Izhikevich, E.M. Simple model of spiking neurons. IEEE Trans. Neural Netw. 2003, 14, 1569–1572. [Google Scholar] [CrossRef]
  51. Hickmott, T.W. Low-Frequency Negative Resistance in Thin Anodic Oxide Films. J. Appl. Phys. 1962, 33, 2669–2682. [Google Scholar] [CrossRef]
  52. Strukov, D.B.; Snider, G.S.; Stewart, D.R.; Williams, R.S. The missing memristor found. Nature 2008, 453, 80–83. [Google Scholar] [CrossRef]
  53. Jo, S.H.; Chang, T.; Ebong, I.; Bhadviya, B.B.; Mazumder, P.; Lu, W. Nanoscale Memristor Device as Synapse in Neuromorphic Systems. Nano Lett. 2010, 10, 1297–1301. [Google Scholar] [CrossRef]
  54. Zhu, J.; Zhang, T.; Yang, Y.; Huang, R. A comprehensive review on emerging artificial neuromorphic devices. Appl. Phys. Rev. 2020, 7, 011312. [Google Scholar] [CrossRef]
  55. Prezioso, M.; Merrikh Bayat, F.; Hoskins, B.; Likharev, K.; Strukov, D. Self-Adaptive Spike-Time-Dependent Plasticity of Metal-Oxide Memristors. Sci. Rep. 2016, 6, 21331. [Google Scholar] [CrossRef]
  56. Li, Y.; Zhong, Y.; Zhang, J.; Xu, L.; Wang, Q.; Sun, H.; Tong, H.; Cheng, X.; Miao, X. Activity-Dependent Synaptic Plasticity of a Chalcogenide Electronic Synapse for Neuromorphic Systems. Sci. Rep. 2014, 4, 4906. [Google Scholar] [CrossRef]
  57. Ohno, T.; Hasegawa, T.; Tsuruoka, T.; Terabe, K.; Gimzewski, J.K.; Aono, M. Short-term plasticity and long-term potentiation mimicked in single inorganic synapses. Nat. Mater. 2011, 10, 591–595. [Google Scholar] [CrossRef]
  58. Choi, S.; Tan, S.H.; Li, Z.; Kim, Y.; Choi, C.; Chen, P.-Y.; Yeon, H.; Yu, S.; Kim, J. SiGe epitaxial memory for neuromorphic computing with reproducible high performance based on engineered dislocations. Nat. Mater. 2018, 17, 335–340. [Google Scholar] [CrossRef]
  59. Cai, F.; Correll, J.M.; Lee, S.H.; Lim, Y.; Bothra, V.; Zhang, Z.; Flynn, M.P.; Lu, W.D. A fully integrated reprogrammable memristor–CMOS system for efficient multiply–accumulate operations. Nat. Electron. 2019, 2, 290–299. [Google Scholar] [CrossRef]
  60. Janod, E.; Tranchant, J.; Corraze, B.; Querré, M.; Stoliar, P.; Rozenberg, M.; Cren, T.; Roditchev, D.; Phuoc, V.T.; Besland, M.-P.; et al. Resistive Switching in Mott Insulators and Correlated Systems. Adv. Funct. Mater. 2015, 25, 6287–6305. [Google Scholar] [CrossRef]
  61. Chudnovskii, F.A.; Odynets, L.L.; Pergament, A.L.; Stefanovich, G.B. Electroforming and Switching in Oxides of Transition Metals: The Role of Metal–Insulator Transition in the Switching Mechanism. J. Solid. State Chem. 1996, 122, 95–99. [Google Scholar] [CrossRef]
  62. Wang, Z.; Joshi, S.; Savel’ev, S.E.; Jiang, H.; Midya, R.; Lin, P.; Hu, M.; Ge, N.; Strachan, J.P.; Li, Z.; et al. Memristors with diffusive dynamics as synaptic emulators for neuromorphic computing. Nat. Mater. 2017, 16, 101–108. [Google Scholar] [CrossRef]
  63. Yi, W.; Tsang, K.K.; Lam, S.K.; Bai, X.; Crowell, J.A.; Flores, E.A. Biological plausibility and stochasticity in scalable VO(2) active memristor neurons. Nat. Commun. 2018, 9, 4661. [Google Scholar] [CrossRef]
  64. Pickett, M.D.; Medeiros-Ribeiro, G.; Williams, R.S. A scalable neuristor built with Mott memristors. Nat. Mater. 2013, 12, 114–117. [Google Scholar] [CrossRef]
  65. Sun, Y.; Song, C.; Yin, S.; Qiao, L.; Wan, Q.; Wang, R.; Zeng, F.; Pan, F. Design of a Controllable Redox-Diffusive Threshold Switching Memristor. Adv. Electron. Mater. 2020, 6, 2000695. [Google Scholar] [CrossRef]
  66. Zhang, X.; Wang, W.; Liu, Q.; Zhao, X.; Wei, J.; Cao, R.; Yao, Z.; Zhu, X.; Zhang, F.; Lv, H.; et al. An Artificial Neuron Based on a Threshold Switching Memristor. IEEE Electron. Device Lett. 2018, 39, 308–311. [Google Scholar] [CrossRef]
  67. Shi, K.; Heng, S.; Wang, X.; Liu, S.; Cui, H.; Chen, C.; Zhu, Y.; Xu, W.; Wan, C.; Wan, Q. An Oxide Based Spiking Thermoreceptor for Low-Power Thermography Edge Detection. IEEE Electron. Device Lett. 2022, 43, 2196–2199. [Google Scholar] [CrossRef]
  68. Lu, Y.F.; Li, Y.; Li, H.; Wan, T.Q.; Huang, X.; He, Y.H.; Miao, X. Low-Power Artificial Neurons Based on Ag/TiN/HfAlOx/Pt Threshold Switching Memristor for Neuromorphic Computing. IEEE Electron. Device Lett. 2020, 41, 1245–1248. [Google Scholar] [CrossRef]
  69. Waser, R.; Dittmann, R.; Staikov, G.; Szot, K. Redox-Based Resistive Switching Memories—Nanoionic Mechanisms, Prospects, and Challenges. Adv. Mater. 2009, 21, 2632–2663. [Google Scholar] [CrossRef] [PubMed]
  70. Yang, Y.; Gao, P.; Gaba, S.; Chang, T.; Pan, X.; Lu, W. Observation of conducting filament growth in nanoscale resistive memories. Nat. Commun. 2012, 3, 732. [Google Scholar] [CrossRef]
  71. Wang, Z.; Rao, M.; Midya, R.; Joshi, S.; Jiang, H.; Lin, P.; Song, W.; Asapu, S.; Zhuo, Y.; Li, C.; et al. Threshold Switching of Ag or Cu in Dielectrics: Materials, Mechanism, and Applications. Adv. Funct. Mater. 2018, 28, 1704862. [Google Scholar] [CrossRef]
  72. Hua, Q.; Wu, H.; Gao, B.; Zhang, Q.; Wu, W.; Li, Y.; Wang, X.; Hu, W.; Qian, H. Low-Voltage Oscillatory Neurons for Memristor-Based Neuromorphic Systems. Glob. Chall. 2019, 3, 1900015. [Google Scholar] [CrossRef]
  73. Wang, X.; Chen, C.; Zhu, L.; Shi, K.; Peng, B.; Zhu, Y.; Mao, H.; Long, H.; Ke, S.; Fu, C.; et al. Vertically integrated spiking cone photoreceptor arrays for color perception. Nat. Commun. 2023, 14, 3444. [Google Scholar] [CrossRef]
  74. Midya, R.; Wang, Z.; Asapu, S.; Joshi, S.; Li, Y.; Zhuo, Y.; Song, W.; Jiang, H.; Upadhay, N.; Rao, M.; et al. Artificial Neural Network (ANN) to Spiking Neural Network (SNN) Converters Based on Diffusive Memristors. Adv. Electron. Mater. 2019, 5, 1900060. [Google Scholar] [CrossRef]
  75. Madan, H.; Jerry, M.; Pogrebnyakov, A.; Mayer, T.; Datta, S. Quantitative mapping of phase coexistence in Mott-Peierls insulator during electronic and thermally driven phase transition. ACS Nano 2015, 9, 2009–2017. [Google Scholar] [CrossRef]
  76. Kumar, S.; Pickett, M.D.; Strachan, J.P.; Gibson, G.; Nishi, Y.; Williams, R.S. Local temperature redistribution and structural transition during joule-heating-driven conductance switching in VO2. Adv. Mater. 2013, 25, 6128–6132. [Google Scholar] [CrossRef]
  77. Stoliar, P.; Tranchant, J.; Corraze, B.; Janod, E.; Besland, M.-P.; Tesler, F.; Rozenberg, M.; Cario, L. A Leaky-Integrate-and-Fire Neuron Analog Realized with a Mott Insulator. Adv. Funct. Mater. 2017, 27, 1604740. [Google Scholar] [CrossRef]
  78. Tuma, T.; Pantazi, A.; Le Gallo, M.; Sebastian, A.; Eleftheriou, E. Stochastic phase-change neurons. Nat. Nanotechnol. 2016, 11, 693–699. [Google Scholar] [CrossRef] [PubMed]
  79. Chen, C.; He, Y.; Mao, H.; Zhu, L.; Wang, X.; Zhu, Y.; Zhu, Y.; Shi, Y.; Wan, C.; Wan, Q. A Photoelectric Spiking Neuron for Visual Depth Perception. Adv. Mater. 2022, 34, 2201895. [Google Scholar] [CrossRef] [PubMed]
  80. Gao, L.; Chen, P.; Yu, S. NbOx based oscillation neuron for neuromorphic computing. Appl. Phys. Lett. 2017, 111, 103503. [Google Scholar] [CrossRef]
  81. Yuan, R.; Tiw, P.J.; Cai, L.; Yang, Z.; Liu, C.; Zhang, T.; Ge, C.; Huang, R.; Yang, Y. A neuromorphic physiological signal processing system based on VO2 memristor for next-generation human-machine interface. Nat. Commun. 2023, 14, 3695. [Google Scholar] [CrossRef]
  82. Wang, K.; Hu, Q.; Gao, B.; Lin, Q.; Zhuge, F.; Zhang, D.-Y.; Wang, L.; He, Y.; Scheicher, R.; Tong, H.; et al. Threshold switching memristor-based stochastic neurons for probabilistic computing. Mater. Horiz. 2021, 8, 619–629. [Google Scholar] [CrossRef]
  83. Mao, H.; He, Y.; Chen, C.; Zhu, L.; Zhu, Y.; Zhu, Y.; Ke, S.; Wang, X.; Wan, C.; Wan, Q. A Spiking Stochastic Neuron Based on Stacked InGaZnO Memristors. Adv. Electron. Mater. 2022, 8, 2100918. [Google Scholar] [CrossRef]
  84. Kim, G.; In, J.H.; Lee, Y.; Rhee, H.; Park, W.; Song, H.; Park, J.; Jeon, J.B.; Brown, T.D.; Talin, A.A.; et al. Mott neurons with dual thermal dynamics for spatiotemporal computing. Nat. Mater. 2024, 23, 1237–1244. [Google Scholar] [CrossRef]
  85. Xiao, Y.; Liu, Y.; Zhang, B.; Chen, P.; Zhu, H.; He, E.; Zhao, J.; Huo, W.; Jin, X.; Zhang, X.; et al. Bio-plausible reconfigurable spiking neuron for neuromorphic computing. Sci. Adv. 2025, 11, eadr6733. [Google Scholar] [CrossRef]
  86. Wu, Q.; Dang, B.; Lu, C.; Xu, G.; Yang, G.; Wang, J.; Chuai, X.; Lu, N.; Geng, D.; Wang, H.; et al. Spike Encoding with Optic Sensory Neurons Enable a Pulse Coupled Neural Network for Ultraviolet Image Segmentation. Nano Lett. 2020, 20, 8015–8023. [Google Scholar] [CrossRef]
  87. Li, Z.; Li, Z.; Tang, W.; Yao, J.; Dou, Z.; Gong, J.; Li, Y.; Zhang, B.; Dong, Y.; Xia, J.; et al. Crossmodal sensory neurons based on high-performance flexible memristors for human-machine in-sensor computing system. Nat. Commun. 2024, 15, 7275. [Google Scholar] [CrossRef]
  88. Sillar, K.T.; Roberts, A. A neuronal mechanism for sensory gating during locomotion in a vertebrate. Nature 1988, 331, 262–265. [Google Scholar] [CrossRef]
  89. Tan, H.; Zhou, Y.; Tao, Q.; Rosen, J.; van Dijken, S. Bioinspired multisensory neural network with crossmodal integration and recognition. Nat. Commun. 2021, 12, 1120. [Google Scholar] [CrossRef] [PubMed]
  90. Merolla, P.A.; Arthur, J.V.; Alvarez-Icaza, R.; Cassidy, A.S.; Sawada, J.; Akopyan, F.; Jackson, B.L.; Imam, N.; Guo, C.; Nakamura, Y.; et al. A million spiking-neuron integrated circuit with a scalable communication network and interface. Science 2014, 345, 668–673. [Google Scholar] [CrossRef] [PubMed]
  91. Wan, C.; Cai, P.; Wang, M.; Qian, Y.; Huang, W.; Chen, X. Artificial Sensory Memory. Adv. Mater. 2020, 32, 1902434. [Google Scholar] [CrossRef] [PubMed]
  92. Wan, C.; Pei, M.; Shi, K.; Cui, H.; Long, H.; Qiao, L.; Xing, Q.; Wan, Q. Toward a Brain–Neuromorphics Interface. Adv. Mater. 2024, 36, 2311288. [Google Scholar] [CrossRef]
  93. Fu, T.; Liu, X.; Fu, S.; Woodard, T.; Gao, H.; Lovley, D.R.; Yao, J. Self-sustained green neuromorphic interfaces. Nat. Commun. 2021, 12, 3351. [Google Scholar] [CrossRef]
  94. Yang, S.H.; Kim, K.B.; Kim, E.J.; Baek, K.H.; Kim, S. An ultra low power CMOS motion detector. IEEE Trans. Consum. Electron. 2009, 55, 2425–2430. [Google Scholar] [CrossRef]
  95. Chen, T.; Lu, S. Object-Level Motion Detection From Moving Cameras. IEEE Trans. Circuits Syst. Video Technol. 2017, 27, 2333–2343. [Google Scholar] [CrossRef]
  96. Huang, H.; Liang, X.; Wang, Y.; Tang, J.; Li, Y.; Du, Y.; Sun, W.; Zhang, J.; Yao, P.; Mou, X.; et al. Fully integrated multi-mode optoelectronic memristor array for diversified in-sensor computing. Nat. Nanotechnol. 2025, 20, 93–103. [Google Scholar] [CrossRef]
  97. Lee, D.; Park, M.; Baek, Y.; Bae, B.; Heo, J.; Lee, K. In-sensor image memorization and encoding via optical neurons for bio-stimulus domain reduction toward visual cognitive processing. Nat. Commun. 2022, 13, 5223. [Google Scholar] [CrossRef] [PubMed]
  98. King, T. Human color perception, cognition, and culture: Why “Red” is always red. Proc. SPIE Int. Soc. Opt. Eng. 2004, 5667, 234–242. [Google Scholar] [CrossRef]
  99. Lee, L.P.; Szema, R. Inspirations from Biological Optics for Advanced Photonic Systems. Science 2005, 310, 1148–1150. [Google Scholar] [CrossRef] [PubMed]
  100. Zhou, Y.; Fu, J.; Chen, Z.; Zhuge, F.; Wang, Y.; Yan, J.; Ma, S.; Xu, L.; Yuan, H.; Chan, M.; et al. Computational event-driven vision sensors for in-sensor spiking neural networks. Nat. Electron. 2023, 6, 870–878. [Google Scholar] [CrossRef]
  101. Li, D.; Liu, G.; Li, F.; Ren, H.; Tang, Y.; Chen, Y.; Wang, Y.; Wang, R.; Wang, S.; Xing, L.; et al. Double-opponent spiking neuron array with orientation selectivity for encoding and spatial-chromatic processing. Sci. Adv. 2025, 11, eadt3584. [Google Scholar] [CrossRef]
  102. Zhang, Y.; Zhong, S.; Song, L.; Ji, X.; Zhao, R. Emulating dynamic synaptic plasticity over broad timescales with memristive device. Appl. Phys. Lett. 2018, 113, 203102. [Google Scholar] [CrossRef]
  103. Wang, Y.-F.; Lin, Y.-C.; Wang, I.T.; Lin, T.-P.; Hou, T.-H. Characterization and Modeling of Nonfilamentary Ta/TaOx/TiO2/Ti Analog Synaptic Device. Sci. Rep. 2015, 5, 10150. [Google Scholar] [CrossRef]
  104. Zhang, X.; Zhuo, Y.; Luo, Q.; Wu, Z.; Midya, R.; Wang, Z.; Song, W.; Wang, R.; Upadhyay, N.K.; Fang, Y.; et al. An artificial spiking afferent nerve based on Mott memristors for neurorobotics. Nat. Commun. 2020, 11, 51. [Google Scholar] [CrossRef]
  105. Li, F.; Wang, R.; Song, C.; Zhao, M.; Ren, H.; Wang, S.; Liang, K.; Li, D.; Ma, X.; Zhu, B.; et al. A Skin-Inspired Artificial Mechanoreceptor for Tactile Enhancement and Integration. ACS Nano 2021, 15, 16422–16431. [Google Scholar] [CrossRef]
  106. Yang, K.; Wang, Y.; Tiw, P.J.; Wang, C.; Zou, X.; Yuan, R.; Liu, C.; Li, G.; Ge, C.; Wu, S.; et al. High-order sensory processing nanocircuit based on coupled VO2 oscillators. Nat. Commun. 2024, 15, 1693. [Google Scholar] [CrossRef]
  107. Wu, X.; Zhu, J.; Evans, J.W.; Lu, C.; Arias, A.C. A Potentiometric Electronic Skin for Thermosensation and Mechanosensation. Adv. Funct. Mater. 2021, 31, 2010824. [Google Scholar] [CrossRef]
  108. Bhatnagar, P.; Hong, J.; Patel, M.; Kim, J. Transparent photovoltaic skin for artificial thermoreceptor and nociceptor memory. Nano Energy 2022, 91, 106676. [Google Scholar] [CrossRef]
  109. Pei, Y.; Yang, B.; Zhang, X.; He, H.; Sun, Y.; Zhao, J.; Chen, P.; Wang, Z.; Sun, N.; Liang, S.; et al. Ultra robust negative differential resistance memristor for hardware neuron circuit implementation. Nat. Commun. 2025, 16, 48. [Google Scholar] [CrossRef] [PubMed]
  110. Wang, T.; Wang, X.-X.; Wen, J.; Shao, Z.-Y.; Huang, H.-M.; Guo, X. A Bio-Inspired Neuromorphic Sensory System. Adv. Intell. Syst. 2022, 4, 2200047. [Google Scholar] [CrossRef]
  111. Stein, B.E.; Stanford, T.R.; Rowland, B.A. Development of multisensory integration from the perspective of the individual neuron. Nat. Rev. Neurosci. 2014, 15, 520–535. [Google Scholar] [CrossRef]
  112. Pearson, J. The human imagination: The cognitive neuroscience of visual mental imagery. Nat. Rev. Neurosci. 2019, 20, 624–634. [Google Scholar] [CrossRef]
  113. Churchland, A.K. Normalizing relations between the senses. Nat. Neurosci. 2011, 14, 672–673. [Google Scholar] [CrossRef]
  114. Yuan, R.; Duan, Q.; Tiw, P.J.; Li, G.; Xiao, Z.; Jing, Z.; Yang, K.; Liu, C.; Ge, C.; Huang, R.; et al. A calibratable sensory neuron based on epitaxial VO2 for spike-based neuromorphic multisensory system. Nat. Commun. 2022, 13, 3973. [Google Scholar] [CrossRef]
  115. Zhu, J.; Zhang, X.; Wang, R.; Wang, M.; Chen, P.; Cheng, L.; Wu, Z.; Wang, Y.; Liu, Q.; Liu, M. A Heterogeneously Integrated Spiking Neuron Array for Multimode-Fused Perception and Object Classification. Adv. Mater. 2022, 34, 2200481. [Google Scholar] [CrossRef]
  116. Yang, Y.; Zhu, F.; Zhang, X.; Chen, P.; Wang, Y.; Zhu, J.; Ding, Y.; Cheng, L.; Li, C.; Jiang, H.; et al. Firing feature-driven neural circuits with scalable memristive neurons for robotic obstacle avoidance. Nat. Commun. 2024, 15, 4318. [Google Scholar] [CrossRef]
Figure 1. (a) Schematic illustration of the connectivity between neurons. (b) Schematic of action potential integration within a neuron.
Figure 2. (a) Redox-type switching mechanism. (b) Typical I–V characteristics of a redox-type volatile threshold-switching memristor (TSM) under different current compliance levels, with each color corresponding to a specific compliance setting. (c) Mott-transition mechanism. (d) Typical I–V characteristics of a VO2 Mott TSM. Image (c) is reproduced with permission [65]. Copyright 2020, Wiley-VCH. Image (d) is reproduced with permission [63]. Copyright 2018, Springer Nature.
Figure 3. TSM-based neuron circuits. (a) Circuit diagrams of oscillatory neurons driven by current and (b) voltage, respectively. (c) Leaky integrate-and-fire (LIF) neuron circuit employing a TSM device and resistive elements. (d) Experimental response of the LIF circuit showing clear integrate-and-fire behavior. (e) Equivalent circuit of the Hodgkin–Huxley (H-H) neuron model. (f) All-or-none firing behavior of the neuron. Images (c,d) are reproduced with permission [79]. Copyright 2022, Wiley-VCH. Images (e,f) are reproduced with permission [64]. Copyright 2013, Springer Nature.
Figure 4. (a) Schematic illustration of various ultraviolet (UV) signals (left) and the artificial sensory neuron. (b) Experimentally observed neuronal spiking at four distinct frequencies under different UV stimuli in the oscillatory neuron. (c) Structural diagram of the artificial spiking cone photoreceptor (VISCP) array based on an ITO/TaOX/Ag/IGZO/ITO stack. (d) Spike frequency response curves of the VISCP array showing wavelength-dependent selectivity. (e) Schematic of an artificial visual spiking neuron composed of a synaptic phototransistor (BP RPT) and an NbOX memristor. (f) VGS tuned spiking neuron array configured as a directional NIR–UV DORF module for parallel spike encoding and signal preprocessing. Images (a,b) are reproduced with permission [86]. Copyright 2020, American Chemical Society. Images (c,d) are reproduced with permission [73]. Copyright 2023, Springer Nature. Images (e,f) are reproduced with permission [101]. Copyright 2025, Springer Nature.
Figure 5. (a) Schematic of the circuit architecture for an artificial spiking mechanoreceptor system based on NbOx memristors. (b) Output spike signals from the artificial sensing system under varying pressure intensities and (c) their corresponding magnified view. (d) Schematic illustration of multi-source tactile signal integration, showing how biological tactile inputs from different stimuli are encoded into a single high-frequency spike train, and how this process is emulated by a memristor-based artificial mechanoreceptive system. (e) The spiking frequency increases when two pressure stimuli are applied simultaneously. (f) Schematic diagram of two capacitively coupled oscillatory neurons. (g) Spiking output results showing synchronization patterns and phase differences. (h) Curve of phase difference versus resistance under various coupling capacitances. (i) Illustration of a gesture recognition system implemented using the oscillatory coupling network. Images (a–c) are reproduced with permission [104]. Copyright 2020, Springer Nature. Images (d,e) are reproduced with permission [105]. Copyright 2021, American Chemical Society. Images (f–i) are reproduced with permission [106]. Copyright 2024, Springer Nature.
Figure 6. (a) Schematic of a neuromorphic multi-sensory system composed of sensors (pressure, light, temperature, and curvature) and artificial neurons, along with an illustration of a spiking neural network (SNN). (b) Architecture of the MFSN array based on pressure sensors and NbOx memristors, and schematic of the corresponding multi-sensory neuromorphic computing system. (c) Training accuracy versus training epochs under three different modes: pressure, temperature, and multimodal. (d) Diagram of a bio-inspired and artificial CSSN-based multi-sensory feedback system. (e) Photograph of a flexible integrated sensory feedback system. (f) Multi-sensory spiking responses of the CSSN to pressure and temperature stimuli. (a) Reproduced with permission [114]. Copyright 2022, Springer Nature. Images (b,c) are reproduced with permission [115]. Copyright 2022, Wiley-VCH. Images (d–f) are reproduced with permission [87]. Copyright 2024, Springer Nature.
Table 1. Comparison of TSMs for spiking neurons.

Device | Endurance (Cycles) | Spiking Rate (Hz) | Energy per Spike (J) | Ref.
HfAlOx memristor | >200 | 1–10 | 1.6 × 10⁻¹⁴ | [68]
TaOx memristor | >500 | 0.1–1200 | ~50 × 10⁻¹² | [73]
VO2 memristor | >10⁶ | ~10⁵ | ~6.9 × 10⁻⁹ | [81]
NbOx memristor | >700 | ~10⁶ | ~10⁻³ | [86]
Flexible VO2 memristor | >10¹² | ~10⁴ | ~4 × 10⁻⁹ | [87]