Abstract
Multidimensional force sensors are key devices capable of simultaneously perceiving and analyzing force in multiple directions (normally triaxial forces). They are designed to provide intelligent systems with skin-like precision in environmental interaction, offering high sensitivity, spatial resolution, decoupling capability, and environmental adaptability. However, the inherent complexity of tactile information coupling, combined with stringent demands for miniaturization, robustness, and low cost in practical applications, makes high-performance and reliable multidimensional sensing and decoupling a major challenge. This drives ongoing innovation in sensor structural design and sensing mechanisms. Various structural strategies have demonstrated significant advantages in improving sensor performance, simplifying decoupling algorithms, and enhancing adaptability—attributes that are essential in scenarios requiring fine physical interactions. From this perspective, this article reviews recent advances in multidimensional force sensing technology, with a focus on the operating principles and performance characteristics of sensors with different structural designs. It also highlights emerging trends toward multimodal sensing and the growing integration with system architectures and artificial intelligence, which together enable higher-level intelligence. These developments support a wide range of applications, including intelligent robotic manipulation, natural human–computer interaction, wearable health monitoring, and precision automation in agriculture and industry. Finally, the article discusses remaining challenges and future opportunities in the development of multidimensional force sensors.
1. Introduction
Tactile sensing, as a fundamental sensory modality for human interaction with the environment, plays a crucial role in fine manipulation and natural human–computer interaction [1,2,3,4,5,6]. Human skin exhibits high sensitivity to various mechanical stimuli, including normal and shear forces, thereby enabling precise perception in complex environments [7,8,9,10,11]. In recent years, the rapid development of flexible technologies (e.g., electronic skin [12,13,14], electronic textiles for healthcare and printed electronics [15,16,17], as well as energy harvesting and interactive textile devices [18,19]) has led to significant advances in tactile sensing. Primary sensing mechanisms include resistive sensing based on pressure-induced contact resistance variations [15,20,21,22], and capacitive sensing utilizing changes in inter-electrode capacitance caused by distance/area variations [23,24,25], as demonstrated in the additively manufactured micro-lattice dielectrics for multiaxial capacitive sensors [26]. Resistive and capacitive sensors are passive devices that can respond to both static and dynamic stimuli, but they require an external power supply for operation. On the other hand, piezoelectric [27,28,29] and triboelectric mechanisms [30,31,32] are self-powered sensing mechanisms that can directly generate electrical output signals under applied mechanical stimuli without a power supply, but they can normally respond only to dynamic stimuli, which limits their applications in certain scenarios. The piezoelectric mechanism is based on the well-known piezoelectric effect, while the triboelectric mechanism is based on the coupling effect of contact electrification and electrostatic induction during the materials’ contact-separation processes.
Furthermore, the integration of flexible tactile sensors with artificial intelligence has propelled the realization of intelligent robotics technology (focusing on bionic tactile perception [33,34,35], softness detection [36], and adaptive grasping [37,38,39,40], etc.), as well as intelligent wearable interaction systems (encompassing gesture and force decoding [41], scalable and high-resolution tactile skins [42], tactile interfaces [43,44,45], and health monitoring [46,47], etc.). Owing to the combined advancements in flexible polymer substrates and microstructural design, the latest generation of tactile sensors overcomes the physical constraints of traditional rigid devices and enhances both wearing comfort and system integration through their stretchable, bendable, and conformal characteristics [48,49,50].
However, most current flexible tactile sensors struggle to fully achieve the synergistic sensing capability of normal and shear forces found in human skin. This deficiency in sensing dimensions leads to lower recognition accuracy of key tactile parameters such as contact direction and sliding tendency, which severely limits their application efficiency in complex scenarios such as precision grasping and dynamic interaction [51,52,53]. The key bottleneck lies in the manufacturing complexity of multidimensional force sensors. These typically require intricate multilayer structures, precise alignment of sensing elements, and integration of heterogeneous materials—factors that collectively result in high production costs and low yield rates [54]. Moreover, achieving accurate force decoupling remains a fundamental challenge. While some studies claim multidimensional force resolution capabilities, most empirical demonstrations only showcase basic multidimensional force measurement functions without rigorously quantifying crosstalk or directional interference [55,56]. This decoupling complexity stems from the inherent coupling characteristics of mechanical responses within multi-axis sensors. External forces applied in one direction induce stray signals in other directions, necessitating advanced compensation through hardware design or algorithmic strategies [57]. In addition, commercially available force sensors often face limitations such as high manufacturing costs, limited scalability for large-area applications, and insufficient durability under repeated mechanical loading. All of these problems hinder their widespread adoption in real-world scenarios. To break through this bottleneck, the multidimensional force sensing paradigm has emerged. It can simultaneously capture the vector components of normal and shear forces (Fz/Fx/Fy), thereby enabling comprehensive analysis of contact intensity and dynamic behavior.
This technique has shown marked advantages in adaptive grasping manipulators [5,58], as also supported by vision-based force estimation in compliant hands [59], high-precision human–computer interfaces [60,61], and multidimensional biomechanical feedback systems [62,63].
Decoupling the forces in each direction is an important aspect of fully reconstructing the tactile information in multidimensional force sensing. Currently, mainstream decoupling strategies can be categorized into three types. The first type is structural decoupling, which achieves mechanical separation through spatially arranged anisotropic sensing units; although this approach ensures high signal fidelity, it presents challenges related to structural complexity [64,65]. The second type is algorithmic decoupling, which relies on machine learning or matrix inversion for post-processing. Although this approach simplifies the hardware design, it is constrained by data dependency and limited model generalization capabilities. Representative studies include sensors based on dielectric elastomer, electromagnetic, or magnetic principles [66,67,68], classic strain-gauge structures with error compensation [69], and machine learning decoupling methods such as hybrid 3D printed sensors and neural network models [70,71]. The third type is physical field decoupling, in which magnetization mode regulation is innovatively employed to decouple the mechanical response in a triaxial magnetic field, thereby reducing algorithmic complexity and improving decoupling efficiency [58,68,72]. Notably, each of these strategies involves significant trade-offs between decoupling accuracy, structural complexity, and system robustness, posing a persistent scientific challenge in this field. Furthermore, the human tactile system can perceive not only the magnitude and direction of force, but also simultaneously sense various physical information such as temperature, humidity, and material properties [8,73,74,75]. This multimodal perception capability provides a critical foundation for tactile embodied intelligence—where intelligent behavior emerges from the direct physical interaction between the sensing agent and its environment—to achieve environmental understanding and interaction. 
Therefore, a major future development direction for multidimensional force sensors is to expand from a single mechanical modality to the synergistic perception of multi-physical quantities, in order to enhance their human-like characteristics and environmental adaptability [76,77].
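As a minimal illustration of the algorithmic decoupling strategy discussed above, a linear calibration matrix measured under known single-axis loads can be inverted (or pseudo-inverted) to remove inter-axis crosstalk. The matrix values below are hypothetical placeholders, not taken from any cited sensor:

```python
import numpy as np

# Linear sensor model: raw outputs s = C @ f for force vector f = [Fx, Fy, Fz].
# C is a calibration matrix obtained by applying known single-axis loads;
# off-diagonal entries model inter-axis crosstalk (all values hypothetical).
C = np.array([
    [1.00, 0.05, 0.12],
    [0.04, 0.98, 0.10],
    [0.08, 0.07, 1.05],
])

def decouple(s, C):
    """Recover the force vector from coupled readings via least squares,
    i.e., multiplication by the pseudo-inverse of C."""
    f, *_ = np.linalg.lstsq(C, s, rcond=None)
    return f

f_true = np.array([0.5, -0.2, 2.0])  # applied force (N)
s = C @ f_true                       # simulated coupled sensor readings
f_est = decouple(s, C)               # recovers f_true for this linear model
```

For over-determined layouts with more sensing channels than force axes, the same least-squares step applies unchanged, which is one reason this post-processing route simplifies hardware at the cost of calibration effort and data dependency.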
As shown in Figure 1, this paper systematically reviews the technological development of multidimensional force sensors from the perspective of structural innovation and provides an in-depth analysis of the latest research advances in multimodal multidimensional force sensors to enhance sensor integration levels and scenario adaptability. Next, it summarizes typical application scenarios for multidimensional force sensors, fully demonstrating their broad application potential in intelligent robotic manipulation and interaction, human–machine interaction and wearable health monitoring, and agricultural and industrial inspection automation. Finally, the review provides an outlook on the future development of multidimensional force sensing in multimodal fusion, high-density integration, in-sensor computing, and embodied intelligence, which emphasizes the tight coupling of perception, computation, and action within the physical embodiment of the sensor system itself.
2. Structural Design Strategies of Multidimensional Force Sensors
As the key enabler for multidimensional force sensors, the design and optimization of the sensor structure directly determine its core performance indicators, such as the number of sensing dimensions, sensitivity, range, decoupling characteristics, spatial resolution, dynamic response, robustness, and integration level. This section systematically reviews and analyzes the main implementation path in the field of multidimensional force sensors: structural design innovation. Based on the salient characteristics of the overall sensor configuration and the differences in design concepts, as shown in Figure 2, they are categorized into five representative structural paradigms: in-plane segment design with the functional structures arranged in a single planar layer [53,55,64,78,79,80], multilayer stacking design with functional sensing elements vertically stacked in multiple distinct layers [60,62,66,81,82,83], three-dimensional (3D) configuration design with sensing structures arranged in spatial architecture [8,84,85,86,87,88], split-type structure design with the force application point physically separated from the primary sensing units [58,68,89,90,91,92], and other structural designs encompassing non-typical and innovative configurations that do not fit neatly into the above categories [93,94,95,96,97,98]. The principles, advantages, and disadvantages of the above five structural designs are compared in Table 1. By providing a detailed explanation and comparative analysis of the design principles, typical examples, and performance characteristics, we aim to offer a comprehensive technical picture of how multidimensional tactile sensors achieve multidimensional force sensing.
Figure 2.
Five representative structural paradigms of multidimensional force sensors: in-plane segment, multilayer stacking, 3D configuration, split-type structure, and other structures.
2.1. In-Plane Segment Design
The in-plane design strategy enables multidimensional force detection using conventional pressure sensors equipped with an efficient force-transfer structure. This approach simplifies the structure and eliminates interlayer crosstalk. Its core principle involves arranging multiple force-sensitive elements (normally four elements) within a single planar layer and integrating a force-transfer mechanism above them. Under normal force, the load is distributed evenly across all sensing elements, resulting in identical outputs. Under shear force, however, elements at the front experience greater force and produce higher outputs. The magnitude and direction of shear forces can then be deduced from output differences. This design avoids the complexity of multilayer stacking, thereby facilitating miniaturization, flexibility, and large-scale array integration. However, since each sensing pixel requires multiple in-plane elements, achieving very high spatial resolution remains challenging.
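The sum/difference read-out described above can be sketched as a minimal model; the gain constants and element labels are illustrative placeholders rather than values from any specific device:

```python
# Four sensing elements arranged at +x, -x, +y, -y around the force-transfer
# structure. A normal force loads all four equally; a shear force shifts load
# toward the elements on the leading side. Gains k_n, k_s stand in for
# per-device calibration constants (placeholder values).
k_n, k_s = 0.25, 0.5

def decouple_triaxial(e_px, e_nx, e_py, e_ny):
    Fz = k_n * (e_px + e_nx + e_py + e_ny)  # normal: common-mode sum
    Fx = k_s * (e_px - e_nx)                # shear x: differential pair
    Fy = k_s * (e_py - e_ny)                # shear y: differential pair
    return Fx, Fy, Fz

# Pure normal load: identical outputs, so both shear components vanish
print(decouple_triaxial(1.0, 1.0, 1.0, 1.0))  # (0.0, 0.0, 1.0)
```

A real device would replace the linear gains with fitted calibration curves, since microstructured sensing elements are rarely linear over their full range; the common-mode/differential logic, however, is what makes this layout largely self-decoupling.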
Figure 3a proposes a flexible tactile e-skin based on a carbon nanotube/polydimethylsiloxane (CNTs/PDMS) porous nanocomposite [55]. This design leverages the in-plane segment design, coupled with four symmetrically distributed sensing units to decouple triaxial forces. Under normal pressure, all four elements exhibit synchronized resistance reduction, while under shear forces, the element along the force direction shows significant resistance decrease whereas its counterpart remains unchanged. It achieves sensitivities of 12.1 kPa−1 and 59.9 N−1, with a rapid response time of 3.1 ms. Also employing piezoresistive sensing but with a distinct microstructure strategy, Figure 3b features a cross-shaped configuration of microcracked channels as the core sensing layer, integrated with a top biomimetic bristle structure [99]. The microcracked channels are fabricated by pre-bending brittle conductive carbon paste and distributed along four orthogonal directions; their resistance changes directly respond to strain distribution differences. Shear forces cause specific channels to compress (resistance decrease) or stretch (resistance increase), whereas normal forces induce simultaneous microcrack separation in all four channels (resistance increase). The bristles amplify minute external forces via a cantilever effect, transmitting them to the microcracked region, thereby boosting sensitivity to 25.76 N−1 and achieving a low detection limit of 5.4 mN.
Figure 3.
In-plane segment design of multidimensional force sensors. (a) Flexible electronic skin based on double-sided rough porous CNTs/PDMS [55]. (b) Bionic tactile sensor based on a synergistic structure of planar intersecting microcrack channels and a central bristle [99]. (c) Finger-inspired rigid-flex hybrid piezoelectric tactile sensor [78]. (d) Scalable tactile sensor array based on ZnO piezoelectric thin-film transistors [79]. (e) Biomimetic electronic skin for multi-mechanical stimulus recognition [64]. (f) Dual-mode tactile sensor with piezoelectric-piezoresistive synergy [80].
Diverging from the piezoresistive mechanism above, Figure 3c, inspired by the finger bone-muscle structure, presents a rigid-flexible hybrid piezoelectric tactile sensor (RSHTS) [78]. Rigid epoxy pillars are embedded within a flexible PDMS dome top layer, resting on a soft silicone substrate. External forces applied through the pillars activate the d31 piezoelectric mode of the polyvinylidene fluoride (PVDF) sensing layer, instead of the conventional d33 mode. This design elevates sensitivity to 346.5 pC N−1 and achieves a frequency response range of 5–600 Hz. A four-electrode layout, combined with dome deformation, enables triaxial force recognition, overcoming the inherent shear forces insensitivity of single-layer piezoelectric devices. Similarly, Figure 3d leverages the integrated characteristics of zinc oxide (ZnO) thin-film transistors (TFTs) to achieve signal sensing, amplification, and multiplexing in a single step [79]. A dual-gate ZnO TFT array modulates channel current via piezoelectric charge, while a surface PDMS micropillar array converts shear forces into normal stress distribution differences. This configuration enables real-time resolution of multidimensional force inputs at a spatial resolution of 100 μm and a refresh rate of 100 Hz.
Turning to more integrated solutions, Figure 3e proposes a biomimetic e-skin that decouples multidimensional force through unique microstructural design [64]. A resistive shear forces sensor employs a suspended PDMS isolation layer and rigid polyethylene terephthalate (PET) structures, allowing four directional sensors to respond independently with a sensitivity of 0.1 N−1; a capacitive pressure sensor combines hemispherical ellipsoidal (H-E) microstructures with a wrinkled dielectric layer, achieving a sensitivity of 3.78 kPa−1. These are arranged in a mechanically isolated layout (non-overlapping shear forces array, central pressure array) to enable independent perception and directional recognition of shear and pressure within a single planar layer. Also achieving multimodal perception and high-precision decoupling, Figure 3f develops a dual-mode tactile sensor mimicking human skin’s slow-adapting (SA) and fast-adapting (FA) receptor functions, integrating piezoresistive and piezoelectric modules [80]. A top protruding structure transmits triaxial forces, while a four-electrode array decouples normal and shear forces with a decoupling accuracy of 90.5%. Texture recognition is further achieved via spectral analysis and deep learning.
In short, various tactile sensing mechanisms based on an in-plane segmentation strategy achieve efficient decoupling and perception of multidimensional force within a single planar layer through meticulously designed force transfer structures. This design not only ensures structural simplification, flexibility, miniaturization, and ease of array integration for the device, but also significantly enhances sensing performance and expands functionality.
2.2. Multilayer Stacking Design
The multilayer stacking design achieves physical isolation and collaborative decoupling of multidimensional force signals via vertical integration of multiple functional sensing layers. Its core principle relies on the distinct responses of different functional layers to normal and shear forces to separate signals. Normal forces typically produce vertical compression, resulting in synchronous responses across all sensing units or specific individual variations. In contrast, shear forces mainly cause relative interlayer displacement or localized deformation, leading only to output changes in specific sensing units or signal modulation in designated electrode pairs. This approach effectively mitigates coupling between normal and shear forces, significantly improving directional recognition accuracy and crosstalk resistance. However, it also introduces challenges such as stringent interlayer alignment, increased fabrication complexity, potential interfacial crosstalk, and greater overall thickness—which may limit applications in flexible and ultrathin devices.
For layering within a single sensor, capacitive sensing mechanisms are commonly employed. For example, as shown in Figure 4a, inspired by the skin’s multilayered structure, a hydrogel-based ionic sensor was proposed [60]. The top encapsulation layer, elastic spacer layer, and base layer are interlocked via mortise-and-tenon joints to prevent interlayer misalignment and enhance mechanical robustness. An asymmetric electrode configuration consisting of a shared top electrode and three independent bottom electrodes forms a capacitive set. Normal forces cause all capacitances to increase synchronously, while shear forces alter only specific capacitance values, enabling omnidirectional force decoupling.
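The three-bottom-electrode decoupling just described can be sketched as a common-mode/differential reconstruction, assuming an idealized layout with electrodes at 120° spacing (geometry and gains are illustrative, not taken from the cited device):

```python
import math

# Three bottom electrodes at 120-degree spacing share one top electrode.
# A normal force raises all three capacitances together (common mode);
# a shear force redistributes them, so projecting the per-electrode changes
# onto their placement angles yields the in-plane force direction (a
# Clarke-transform-style reconstruction; values are illustrative).
ANGLES = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]

def decouple(dC):
    """dC: relative capacitance change of the three bottom electrodes."""
    normal = sum(dC) / 3
    sx = sum(d * math.cos(a) for d, a in zip(dC, ANGLES)) * 2 / 3
    sy = sum(d * math.sin(a) for d, a in zip(dC, ANGLES)) * 2 / 3
    angle = math.degrees(math.atan2(sy, sx))
    return normal, (sx, sy), angle

# Shear toward electrode 0: its capacitance rises while the others fall,
# giving zero common mode and a shear vector pointing at 0 degrees
print(decouple([0.2, -0.1, -0.1]))
```

Because the common-mode and differential projections are orthogonal, a pure normal force produces no apparent shear and vice versa, which is the essence of the omnidirectional decoupling reported for this electrode arrangement.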
Figure 4.
Multilayer stacking design of multidimensional force sensors. (a) Hierarchical interlocking triaxial force sensor based on hydrogel ionic conductors [60]. (b) Porous dielectric layer flexible multi-axis tactile sensor [66]. (c) Triaxial force sensor with micro-conical dielectric layer [81]. (d) Biomimetic interlocking structure electronic skin [82]. (e) Triboelectric-capacitive coupled wireless sensor [83]. (f) Skin-muscle inspired orthogonally integrated sensor [62].
To enhance performance in such capacitive sensing, microstructured dielectric layers are typically used. For instance, the sensor with a porous PDMS dielectric layer shown in Figure 4b utilizes its microporous structure to reduce the effective modulus, achieving both a wide dynamic range (normal force: 0.05–50 N, shear forces: 0–3.3 N) and high sensitivity (normal: 3800 counts N−1) [66]. A four-electrode layout combined with an active shielding layer suppresses electromagnetic interference and the effects of humid materials. Multidimensional force detection at high sampling rates is achieved through dynamic electrode pairing. Alternatively, Figure 4c shows a sensor utilizing a micro-cone Ecoflex dielectric layer combined with a novel decoupling algorithm to resolve the coupling issue between electrode spacing and overlap area caused by shear forces [81]. Normal forces compress the micro-cones, increasing capacitance, while shear forces cause the top electrode to translate. Output signals are constructed by normalizing capacitance changes, achieving high sensitivity and isotropic response. Furthermore, Figure 4d proposes a three-layer capacitive electronic skin inspired by the microstructure of the skin’s stratum spinosum, featuring a biomimetic “hill” array at the bottom, pyramidal microstructures at the top, and a dielectric layer in between [82]. The pyramidal structures, with gradient height design and spiral arrangement, optimize sensitivity and temporal response. The hill structures induce anisotropic deformation, causing capacitance responses to differ based on the applied force direction. This design achieves a normal pressure sensitivity of 0.19 kPa−1 within 2 kPa, a shear forces sensitivity of 3.0 Pa−1, and millisecond-level response time.
Layering between different sensors is not limited to capacitive sensing. Figure 4e proposes a battery-free, multidimensional wireless tactile sensor (TC-MWTS) that converts mechanical-electrical signals through a three-layer stack [83]. The triboelectric normal force sensing layer (TNFSL) outputs time-domain voltage amplitude, while the capacitive shear forces sensing layer (CSFSL) modulates frequency-domain characteristic frequency. The middle soft elastic layer (MSEL) transmits shear displacement. A contact-discharge triboelectric nanogenerator (CD-TENG) enhances the wireless signal by 25 times, achieving a normal force sensitivity of 2.47 V kPa−1 and shear forces sensitivity of 0.28 MHz N−1. A dual-domain signal decoupling mechanism avoids cross-interference inherent in traditional signal processing. Similarly, Figure 4f mimics the skin-muscle collaborative perception mechanism, constructing an orthogonally stacked system comprising a strain-insensitive pressure subsensor and a pressure-insensitive strain subsensor [62]. The strain-insensitive layer contains MXene-embedded ZnO nanowire arrays (ZOGW) and gradient wrinkle microstructures, achieving a normal pressure sensitivity of 187.71 kPa−1 with strain interference below 3.2%. The pressure-insensitive layer utilizes a directionally aligned segmental polyimide/polyurethane conductive film (AMSPP) to achieve strain direction selectivity (directional selectivity coefficient: 10.74) and a high gauge factor of 863.7. Three independent signal outputs form an “electrical response library” for the quantitative assessment of shear forces direction and magnitude.
In short, the multilayer stacking tactile sensing mechanism achieves physical isolation and collaborative decoupling of multidimensional force signals by vertically integrating functional layers with distinct force-response characteristics and exploiting their differential responses to normal and shear forces. This design ensures high directional recognition accuracy and excellent signal crosstalk resistance, while also optimizing dynamic range, sensitivity, and response speed. In some designs, it further allows for selective perception of force direction and isotropic response.
2.3. 3D Configuration Design
3D configuration design overcomes the limitations of planar structures in force decoupling, anisotropic sensing, and spatial utilization efficiency by arranging and integrating sensing units or force transmission paths within 3D space. Its core principle lies in leveraging spatial dimensions to construct independent or differentiated force response channels. This enables natural physical or electrical decoupling of normal and shear forces. Coupled with specific algorithms, decoupling accuracy can be further enhanced. This design offers excellent force decoupling performance, high directional resolution and sensitivity, while improving spatial utilization efficiency and adaptability. However, its manufacturing processes are generally more complex, structural design presents significant challenges, and signal processing and calibration are also more demanding.
Figure 5a presents an embedded hair-elastomer structure, featuring a cross-shaped self-bending piezoresistive cantilever array embedded in an elastomer core [86]. Under normal or shear stress, the deformation at the platinum resistor at the cantilever root exhibits a linear response, as described by a theoretical model. Combined with summation and difference algorithms, it decouples normal and shear stresses with an error below 3%. The detection thresholds for normal and shear forces reach 7.2 Pa and 5.1 Pa, respectively. Subsequently, Figure 5b introduces a modular 3D micro-strain gauge based on micro-electromechanical systems (MEMS) fabrication, utilizing a silica stress layer to drive the self-assembly of nickel-chromium alloy strain gauges into a 3D configuration [87]. This design achieves linear decoupling of normal and shear forces through four orthogonally distributed strain units, with sensitivities of 8.16 × 10−3 N−1 and 1.09 × 10−2 N−1, respectively. It also integrates a planar temperature module for thermal drift calibration.
Figure 5.
3D configuration design of multidimensional force sensors. (a) Embedded hair-like elastomer cantilever 3D stress decoupling sensor [86]. (b) 3D modular tactile sensor based on MEMS microstrain gauges [87]. (c) 3D multimodal piezoresistive sensor based on buckling assembly [88]. (d) 3D structured electronic skin mimicking the distribution of skin receptors [8]. (e) Soft triaxial force sensor based on liquid metal 3D microfluidic channels [84]. (f) Biomimetic self-healing 3D electrode foam sensor skin with nerve-like distribution [85].
Another 3D structural solution transforms planar precursors into 3D units through buckling-guided assembly [8,88,100,101,102,103,104]. Figure 5c develops a silicon nanomembrane (Si-NM) piezoresistive 3D structure, converting a planar precursor into a “table-shaped” 3D unit via buckling-guided assembly [88]. Four Si-NM piezoresistive elements respond to normal pressure (sensitivity 0.1% kPa−1), shear forces (0.07% N−1), and bending strain, respectively. Temperature sensing is achieved through the temperature coefficient of resistance. This structure supports integration into a 7 × 7 array coupled with Bluetooth wireless circuitry, enabling spatiotemporal force distribution mapping. However, synchronous decoupling of multiple parameters requires additional sensing units. Inspired by bionics, a breakthrough is demonstrated in the 3D architecture electronic skin shown in Figure 5d [8]. Mimicking the spatial distribution of Merkel cells (near the epidermis) and Ruffini endings (in the dermis) in skin, it employs a heterogeneous encapsulation strategy to integrate piezoresistive force and strain sensing units into an eight-arm cage structure and an arched structure, respectively. The force sensing units decouple normal and shear forces through a static mechanical model, with sensitivities of 5 × 10−5 kPa−1 and 6 × 10−4 N−1, respectively, while the strain sensing units independently monitor skin deformation.
Beyond the structures mentioned above, Figure 5e displays a soft triaxial force sensor based on 3D microfluidic channels [84]. Physical isolation of force components is achieved through a geometrically separated microchannel layout: a single bottom channel detects normal force, while six segmented sidewall channels circumferentially measure shear forces. The microchannels are filled with eutectic gallium-indium (EGaIn) liquid metal. When external forces are transmitted through a rigid multi-segment force plate, selective compression of microchannels in different directions causes resistance variations. This design enables mechanical decoupling of normal force (0–35 N) and shear forces (0–13 N), with sensitivity approximately five times higher than planar structures. Moreover, the sensor demonstrates high repeatability (>1000 cycles) and spatial directional recognition capability under combined loading across 12 shear directions and normal forces. To further overcome encapsulation limitations, Figure 5f presents the development of self-healing artificial foam (AiFoam) [85]. This sensor embeds 3D copper electrodes (75 μm in diameter) into a low-modulus self-healing foam substrate (a PVDF-HFP/fluorosurfactant composite), creating skin-like neural network-inspired stereoscopic conductive pathways. When external forces act on the foam surface, normal forces uniformly compress the electrode spacing, reducing overall resistance. In contrast, shear forces cause asymmetric deformation at the electrode-foam interface, altering local conductive paths (piezoresistive effect) or electrode spacing (piezocapacitive effect). By designing a topological configuration of four electrode arrays paired with a common ground electrode, the arrays measure regional differences in resistance/capacitance changes. This enables force direction recognition through spatial signal distribution patterns, achieving a prediction accuracy of 88–98%.
Compared to traditional planar electrode structures, this 3D nerve-like design not only eliminates encapsulation requirements but also enables dual-mode sensing, with a piezoresistive sensitivity of 0.0982 kPa−1 and a piezocapacitive sensitivity of 0.378 kPa−1.
In summary, multiple tactile sensing mechanisms based on 3D configurations leverage spatial dimensions to construct independent or differentiated force response channels, enabling efficient and highly accurate natural decoupling of normal and shear forces as well as anisotropic perception. This design not only ensures exceptional force decoupling performance, high directional resolution/sensitivity, and spatial utilization efficiency but also significantly optimizes the sensor’s adaptability, multifunctional integration capability, and perception dimensions, overcoming the inherent limitations of planar structures.
2.4. Split-Type Structure Design
The core of the split-type structure design lies in introducing an intermediate physical field as a conversion medium to achieve functional decoupling and physical separation between the sensing and response layers. The structure typically consists of three layers: an intermediate field layer, a buffer layer, and an electrical response layer. The intermediate field layer converts multidimensional force-induced deformation into changes in the physical field, the buffer layer supports deformation while isolating and protecting other components, and the electrical response layer detects these physical field changes and converts them into electrical signals. By using an intermediate physical field as an information carrier, this design decouples mechanical deformation from electrical signal generation. Key advantages include a highly simplified structure, inherent self-decoupling of multidimensional forces through the intermediate field (reducing reliance on complex algorithms), and compatibility with thin-film device formats. However, the approach is susceptible to environmental interference, requires precise control and stability of the intermediate field, and may involve complex image processing or optical systems in certain implementations—leading to relatively higher power consumption and computational costs. Compared to multilayer stacking design, the split-type strategy uses an intermediate physical field as a conversion bridge rather than direct stacking of sensing units. It shifts decoupling complexity to the design and control of the intermediate field and its resilience to environmental factors. In short, whereas multilayer stacking emphasizes physical isolation and coordinated sensing, the split-type design focuses on field conversion and functional separation.
Magnetoelectric multidimensional sensors transform force signals into changes in the surrounding magnetic field via deformation of a magnetic film. This magnetic field variation is then converted into an electrical signal by a magnetoelectric response layer. This phenomenon is known as the magnetoelectric coupling effect, wherein applying a magnetic field can alter the material’s electric polarization state (generating voltage), and conversely, applying an electric field can also change the material’s magnetization state. For instance, Figure 6a depicts a soft magnetic skin sensor based on the magnetoelectric mechanism [105]. It employs a sandwich structure: the intermediate field layer is a sinusoidally magnetized flexible magnetic film that deforms under external force, altering the magnetic field distribution; the buffer layer is a silicone elastomer transmitting deformation; and the electrical response layer is a printed circuit board (PCB) integrated with Hall sensors, enabling force perception by detecting magnetic flux changes. Leveraging a specialized magnetization design in the magnetic film, this sensor achieves self-decoupling of normal and shear forces along two axes. The magnetic soft tactile sensor in Figure 6b further optimizes the magnetization design of the intermediate field layer based on the three-layer structure [106]. It approximates an ideal sinusoidal magnetization distribution through a folded magnetization pattern, enabling triaxial force decoupling. The sensor in Figure 6c utilizes two orthogonally overlapped, sinusoidally magnetized flexible magnetic films as the intermediate field layer, reducing calibration complexity from N³ to 3N and enhancing overall device performance and applicability [58]. The design strategy of vertically stacked, decoupled layers—magnetic intermediate field layer, buffer layer, and magnetoelectric response layer—has inspired numerous subsequent developments [107,108,109,110].
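The self-decoupling property of a sinusoidally magnetized film can be illustrated with a short numerical sketch. In the idealized model, the tangential and normal field components above the film vary as exponentially damped sinusoids of lateral position, so the phase of a single Hall reading encodes shear displacement while its amplitude encodes the film–sensor gap (and hence normal force). The code below is an illustrative reconstruction under that idealization, not the calibration procedure of [105]; the reference amplitude `b0` is an assumed calibration constant.

```python
import math

# Idealized field above a film with sinusoidal magnetization of wavelength L:
#   Bx ∝ exp(-k*z) * cos(k*x),  Bz ∝ exp(-k*z) * sin(k*x),  k = 2*pi/L
# Phase of (Bx, Bz) -> shear displacement x; amplitude -> gap z.

def decouple(bx: float, bz: float, wavelength: float, b0: float):
    """Recover (shear displacement x, gap z) from one Hall reading.

    b0 is the calibrated field amplitude at zero gap (an assumption here).
    """
    k = 2.0 * math.pi / wavelength
    x = math.atan2(bz, bx) / k            # phase term: shear displacement
    amplitude = math.hypot(bx, bz)        # amplitude term: exp(-k*z) * b0
    z = -math.log(amplitude / b0) / k
    return x, z

# Round trip with synthetic field values (x = 0.5 mm, z = 1.0 mm, L = 4 mm):
k = 2.0 * math.pi / 4.0
bx = 100.0 * math.exp(-k * 1.0) * math.cos(k * 0.5)
bz = 100.0 * math.exp(-k * 1.0) * math.sin(k * 0.5)
print(decouple(bx, bz, 4.0, 100.0))   # ≈ (0.5, 1.0)
```

Because phase and amplitude are measured independently from the same reading, shear and normal components are recovered without matrix inversion, which is the essence of the "self-decoupling" claim.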
Figure 6d modifies the relative positioning within the three-layer framework [90]. Here, the electrical response layer comprises outer metal coils, while the intermediate field is generated by an embedded NdFeB magnet. Force application induces changes in the magnetic flux through the coils, generating an induced voltage for 3D force sensing without requiring an external power supply.
Meanwhile, the development of photosensitive technology has inspired researchers to convert mechanical signals into optical signals, leading to various optical-integrated stress sensors [111,112,113,114]. One category of optical multidimensional sensors converts multidimensional force signals into changes in the light field passing through surface reflection or projection films via the deformation of these films. In the electrical response layer, cameras or optoelectronic device arrays are typically used to transform changes in optical signals into electrical signals, an arrangement similar to that of the aforementioned three-layer magnetoelectric sensors. Figure 6e illustrates a typical reflective optical tactile sensor [91]. Its intermediate field layer is a translucent gel that, upon force-induced deformation, alters image brightness distribution via changes in light reflection. The buffer layer is light-transmissive silicone rubber, and the electrical response layer consists of a wide-angle camera and a light-emitting diode (LED) backlight module. This setup reconstructs 3D objects and their applied forces by analyzing brightness variations in the gel. The sensor in Figure 6f employs porous PDMS rubber as its core intermediate field layer, exploiting force-induced pore closure to modify light scattering properties [115]. The buffer layer is integrated within this intermediate layer, yielding a combined thickness of 800 μm. The electrical response layer comprises a flexible organic photodiode (OPD) array and a flexible backlight layer, representing a second archetype of the decoupled optical sensing structure. This configuration significantly reduces the overall thickness of the optical tactile sensor to a mere 1.5 mm.
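The inversion step in such transmission-type sensors can be sketched with a simple model. Assuming, purely for illustration, that pore closure attenuates transmitted light roughly exponentially with local pressure (a Beer–Lambert-style relation, not the calibration reported in [115]), each photodiode pixel can be inverted independently; the coefficient `ALPHA` is an invented placeholder.

```python
import math

# Assumed model: I = I0 * exp(-alpha * p) per pixel, so p = -ln(I/I0)/alpha.
ALPHA = 0.05  # attenuation coefficient per kPa (illustrative placeholder)

def pressure_map(intensity, baseline):
    """Per-pixel pressure estimate from a photodiode intensity frame."""
    return [[-math.log(i / i0) / ALPHA for i, i0 in zip(row, row0)]
            for row, row0 in zip(intensity, baseline)]

base = [[100.0, 100.0], [100.0, 100.0]]   # unloaded reference frame
frame = [[100.0, 60.65], [100.0, 100.0]]  # one pressed (darkened) pixel
print(pressure_map(frame, base))          # pressed pixel ≈ 10 kPa
```

A real device would replace the single-exponential model with a measured calibration curve, but the pixel-wise inversion structure is the same.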
In summary, novel tactile sensing mechanisms, such as magnetoelectric and optical approaches, introduce intermediate fields (magnetic or optical). This decoupling of the sensing layer from the response layer ensures high performance in multidimensional perception while significantly simplifying the structural design of the sensors.
Figure 6.
Split-type structure design of multidimensional force sensors. (a) Tactile sensor decoupling normal and shear forces based on 2D magnetization [105]. (b) Triaxial force-decoupled tactile sensor based on four-direction magnetization [106]. (c) Triaxial force-decoupled tactile sensor based on multi-layer magnetization stacking [58]. (d) Magnetoinductive-based triaxial force-decoupled tactile sensing fingertip [90]. (e) Multidimensional force tactile sensor based on light reflection principle [91]. (f) Multidimensional force tactile sensor based on light transmission principle [115].
2.5. Other Structure Design
Beyond the primary sensing structure categories discussed above, several studies have developed other multidimensional sensors with unique architectures aimed at optimizing performance [116,117,118]. These designs achieve multidimensional force tactile perception by introducing innovative non-traditional architectures, often centered on optical pathways or biomimicry. Their core principle involves utilizing force-induced changes in material optical properties or geometric deformations as sensing signals. These alterations are captured via cameras or photodetectors and analyzed to extract multidimensional force information. This approach offers high resolution, excellent force-decoupling capability, and functional versatility. However, it also entails high system complexity, susceptibility to ambient light interference, relatively complex signal processing, and greater difficulties in encapsulation and calibration. Compared to the 3D configuration design, both approaches move beyond planar structures, but they differ fundamentally in decoupling mechanism and sensing philosophy. The 3D configuration design relies on the physical arrangement of sensing elements in three-dimensional space to create natural decoupling channels—whether physical or electrical—where force signals are directly measured via changes in electrical parameters. Spatial structure is central to its decoupling ability. In contrast, the designs in this category derive decoupling primarily from the analysis of optical signals rather than from the specific 3D arrangement of sensing elements. Forces are first converted into changes in optical characteristics, which undergo photoelectric conversion and algorithmic interpretation to obtain multidimensional information. Structural innovation in this category focuses more on enabling specific optical interactions or biomimetic functional units.
The sensor structure shown in Figure 7a incorporates a high-resolution camera, a variable-resistance force tactile array, red-green-blue light-emitting diodes (RGB LEDs), and an epoxy resin housing [119]. The variable-resistance array utilizes multiplexed analog switches for row/column signal isolation and is overlaid with linen canvas acting as a common sensing layer. This design enables the perception of force-induced deformation via the resistive array and multidimensional sensing through visual imaging by the camera. Moving towards bio-inspiration, the sensor structure in Figure 7b features an elastomer layer with randomly distributed color blocks, drawing inspiration from biological compound eyes [120]. Force-induced deformation alters the light reflection characteristics of this layer. A complementary metal-oxide-semiconductor (CMOS) image sensor captures localized deformations via pinhole imaging, enabling multidimensional perception. The visual units are arranged on a curved surface, with each unit optically isolated by opaque walls to prevent crosstalk. This design significantly reduces the error rate in identifying normal and shear forces. Building upon the fusion of sensing modalities and further extending the concept into multispectral imaging, the sensor in Figure 7c integrates imaging across multiple spectral bands—visible light, near-infrared, and mid-infrared—through structural design [121]. This fusion enhances the effective temperature range and accuracy for multidimensional perception and enables proximity sensing. Its structure employs a one-way vision latex film that achieves selective light transmission by controlling the light intensity on either side: under visible light (external brighter than internal), the film is transparent for proximity sensing; under near-infrared (NIR) illumination, the film becomes opaque for multidimensional force deformation detection. The mid-infrared (MIR) channel incorporates a temperature sensor for thermal perception.
Based on the pinhole camera principle, the sensor in Figure 7d offers a miniaturized solution for arrayed tactile perception [96]. A diffuser disk within a silicone pillar changes its relative position to the pinhole under force. A four-quadrant photodiode beneath the pinhole detects the projected light spot, identifying its xy-axis displacement for multi-directional shear force perception. Triaxial forces are then identified via algorithmic regression mapping [122]. Mimicking the interlocked epidermis–dermis structure, the sensor in Figure 7e features “tip-dome” units [123]. A central white tip is surrounded by four black dome tips, separated by a narrow gap of only 0.071 mm filled with incompressible gel for rapid reset. Multidimensional identification involves calculating the contact center using image moments and estimating shear forces using a dynamic friction model. Unlike traditional marker-based optical sensors, this design eliminates the need for complex marker detection, enabling direct force analysis from raw image intensity. The optical-based multipoint sensor in Figure 7f utilizes an 800 μm thick porous rubber as the force-sensitive layer [124]. Force application closes pores, altering light scattering properties. A flexible imager and a backlight layer form the detection array. Multidimensional identification is achieved by analyzing differences in light intensity attenuation across the array, enabling 3D pressure distribution mapping over a 3 cm × 4 cm area. This design achieves a large force sensing range.
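The four-quadrant readout used in the Figure 7d design follows the standard quadrant-cell estimate: the normalized imbalance between left/right and top/bottom photocurrent pairs gives the spot displacement. The sketch below illustrates that estimate; the quadrant labeling and the `gain` constant are assumptions for illustration, not details from [96].

```python
def spot_displacement(q_a, q_b, q_c, q_d, gain=1.0):
    """Estimate the light-spot centre offset from four quadrant photocurrents.

    Assumed quadrant layout (viewed from above):  A | B
                                                  -----
                                                  D | C
    Standard quadrant-cell formula; 'gain' converts the dimensionless
    imbalance into a physical displacement and must be calibrated.
    """
    total = q_a + q_b + q_c + q_d
    dx = gain * ((q_b + q_c) - (q_a + q_d)) / total   # right minus left
    dy = gain * ((q_a + q_b) - (q_c + q_d)) / total   # top minus bottom
    return dx, dy

# A centred spot illuminates all quadrants equally -> zero offset:
print(spot_displacement(1.0, 1.0, 1.0, 1.0))   # (0.0, 0.0)
# A spot shifted right brightens B and C -> positive dx, zero dy:
print(spot_displacement(1.0, 2.0, 2.0, 1.0))
```

Mapping these xy displacements (plus the total intensity) to triaxial force then requires the regression step described above, since the displacement–force relation depends on the pillar mechanics.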
These uniquely structured sensors, through innovations in optical path design and biomimicry, achieve performance enhancements over traditional optical tactile sensors in terms of resolution, response speed, and functional diversity. They provide valuable inspiration for future optimization strategies in the structural design of multidimensional force tactile sensors.
To comprehensively and intuitively evaluate and compare the performance differences among sensors based on the aforementioned five sensing mechanisms, Table 2 summarizes their core principles, typical performance metrics, and representative application scenarios.
Figure 7.
Other structure design of multidimensional force sensors. (a) Multidimensional force tactile sensor based on multi-color optical cavity [119]. (b) Multidimensional force tactile sensor designed using pinhole imaging principle [120]. (c) Multidimensional force tactile sensor based on multi-color reflection superposition method [121]. (d) Multidimensional force tactile sensor based on binocular disparity principle [96]. (e) Multidimensional force tactile sensor based on light interference principle [123]. (f) Multidimensional force tactile sensor based on multi-directional light transmission principle [124].
3. Sensing Fusion of Multidimensional Force and Other Tactile Modality
Tactile perception, as a core capability for human–machine interaction and environmental exploration, encompasses far more than mere multidimensional force detection. It integrates critical information modalities such as temperature, material properties, and surface roughness [125,126]. In biological systems, force sensing is intrinsically coupled with other modalities—for instance, mechanoreceptors and thermoreceptors in human skin work synergistically to perceive touch, temperature, and vibration, enabling nuanced interaction with the environment. To equip robots or intelligent systems with comprehensive environmental sensing capabilities approaching or even surpassing human levels, researchers are dedicated to developing multimodal sensors capable of simultaneously capturing these complex signals. In this pursuit, bio-inspired design principles—mimicking the sophisticated structure and multifunctional integration of human skin to organically fuse diverse sensing components—have been widely validated as a highly promising technological pathway [127,128,129,130,131].
Figure 8a illustrates a vertically stacked multimodal tactile sensor [132]. Its base-layer pressure-sensing array employs a MXene/lotus nanofiber/polystyrene microsphere (MXene/LNF/PS) composite microporous film. Four planar distributed units detect contact pressure distribution, achieving a linear response within a 30 kPa range and triaxial force decoupling. The top layer co-integrates a triboelectric nanogenerator (TENG) and a capacitive sensor to form dual-modal extensions. Operating in single-electrode mode (30 ms response time), the TENG captures dynamic contact behaviors through contact-separation charge transfer. The spiral-electrode capacitive sensor detects approaching objects within 5 cm via electromagnetic field perturbations, overcoming the contact-only limitation of conventional multidimensional force sensors. By fusing these three multimodal signals, the sensor achieves 95% recognition accuracy for multi-feature targets with machine learning assistance, resolving the perception constraints of single-modal systems in complex environments. Similarly focused on multimodal perception, Figure 8b presents a triboelectric-effect-based finger-shaped tactile sensor (FTS) that achieves efficient integration of multidimensional force detection and material identification [133]. The sensor adopts a biomimetic finger structure, integrating an external material identification module with an internal force sensing module. The external module embeds three single-electrode triboelectric sensors composed of distinct materials at the silicone-coated fingertip region. Upon contacting objects, material-specific voltage signatures generated by material–surface interactions are processed through a ResNet50 deep learning model, enabling high-precision recognition of 12 materials with 98.33% accuracy. Internally, the force-sensing module employs a three-dimensionally structured spatial layout. Five silver electrodes distributed on silver-coated polylactide (PLA) skeletons collaborate with microneedle arrays to resolve 3D force magnitude and direction through friction signal variations from local contact-separation events, achieving a force resolution of 0.01 N.
Figure 8.
Multidimensional force and multimodal sensors. (a) Force-material integrated sensing based on a multimodal tactile sensing system [132]. (b) Multidimensional force-material simultaneous sensing using finger-shaped triboelectric tactile sensors [133]. (c) Multimodal tactile encoding and decoupling for optoelectronic robotic skin [134]. (d) Robot multimodal tactile sensing module based on planar hybrid architecture [135]. (e) Multimodal artificial receptor driven by ion relaxation dynamics [66]. (f) Skin-integrated multimodal tactile feedback interface [136].
Distinct from the aforementioned electrical principles, Figure 8c presents an innovative optical encoding method [134]. The proposed optoelectronic “robotic flesh” integrates multidimensional mechanical information with non-mechanical modalities within a flexible structure through a novel optical encoding mechanism. This sensor adopts a multi-layer elastomer architecture. The outermost light-shielding layer isolates ambient light, while the innermost silicone rubber layer encapsulates stretchable optical waveguides to form a nerve-like network. The central functional core comprises a thermochromic gel layer and a light-transmissive silicone layer. Collaborating with the optical waveguides, the transparent layer encodes normal pressure as localized light intensity enhancement, whereas shear forces are identified through asymmetric light intensity distribution patterns across the waveguide array. Contact localization achieves 1.25 mm precision with merely 0.32 N error in normal force measurement. Embedded thermochromic dyes convert temperature into characteristic wavelengths, transmitted via waveguides and decoded by RGB sensors. This yields a 1.12 °C error within the 15–60 °C range, while remaining decoupled from mechanical signals. Combining physical optical encoding with machine learning-based decoding, this design demonstrates excellent performance in mechanical and thermal sensing. It further enables damage detection and gesture recognition through light intensity feature analysis. Unlike this physical encoding approach, Figure 8d illustrates a skin-integrated wireless tactile interface [135]. Its core structure consists of an array of multimodal touch units, with each unit integrating three independently driven functional layers. The thermoelectric layer employs a heat dissipation structure combining thermally conductive films with ceramic heat-insulating fibers to enhance thermal response speed. 
The mechanical and electrotactile layers operate jointly to overcome single-modal frequency limitations. The mechanical actuator compensates for the high-frequency perception shortcomings of electrotactile feedback, while electrotactile stimulation enhances low-frequency vibration resolution. Together, they synergistically replicate complex tactile sensations like sliding friction and transient impacts. Additionally, the array design supports spatial dynamic programming. By combining pressure signal models with thermal conduction simulations, the system can simultaneously map multidimensional force distributions and material properties (such as the cold sensation of metal or warm sensation of fabric). Ultimately, this achieves multimodal fusion of temperature-force-texture information in VR systems. This structural innovation provides a scalable framework for integrating multidimensional force feedback with thermal and electrical modalities.
Focusing on synchronous high-precision detection of triaxial forces and temperature, Figure 8e presents a planar-configuration multimodal tactile sensing module [74]. The sensor comprises a 4 × 4 array of triaxial force sensing units and a single-channel temperature sensor. The force sensing units utilize inorganic single-crystal silicon nanomembrane strain gauges, combined with a PDMS deformation layer and trapezoidal bump structures. Triaxial force measurement is achieved through strain distribution decoupling, offering a vertical force resolution of 10 mN and a crosstalk error of <4%. The temperature sensor leverages the linear resistance–temperature relationship of serpentine gold electrodes; its meandering design effectively suppresses strain interference induced by substrate deformation. This hybrid structural design simultaneously enables high-precision triaxial force detection and temperature perception. Furthermore, it integrates vibration analysis to enhance the robot’s dynamic recognition of object manipulation states. Figure 8f presents a multimodal ionic electronic skin (IEm-skin) [136]. This sensor achieves synchronous multimodal perception of multidimensional force and temperature through an innovative electrode–ion conductor–electrode sandwich structure. Its core lies in the dual-variable decoupling principle based on ionic relaxation dynamics. Ionic relaxation dynamics describes the process by which minute charged particles (ions) within a material rearrange from a disordered state and stabilize into a new state when stimulated by an external electric field. Absolute temperature detection is realized by measuring the charge relaxation time as an intrinsic variable insensitive to strain. Simultaneously, the normalized capacitance serves as an extrinsic variable insensitive to temperature to resolve strain distribution. Building on this, the design further expands multidimensional force detection capabilities through an arrayed pixel arrangement (10 × 10 AMI receptor units) and low-friction interfacial encapsulation. The in-plane strain response of the ionic conductor maps planar mechanical distributions (e.g., shear, tension), while the temperature field identifies pressure location and directional vectors via contact-point heat conduction.
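The dual-variable decoupling idea can be made concrete with a toy sketch: treat the charge relaxation time as a strain-insensitive thermometer, and the normalized capacitance as a temperature-insensitive strain gauge. The linear calibration constants below are invented placeholders for illustration only, not values from the IEm-skin work.

```python
# Placeholder calibration constants (assumed, not from the reference):
TAU_REF_S = 1.0e-3    # relaxation time at the reference temperature, s
T_REF_C = 25.0        # reference temperature, °C
TAU_SLOPE = -2.0e-5   # d(tau)/dT, s per °C (tau shortens as T rises)
GAUGE_FACTOR = 2.0    # relative capacitance change per unit strain

def decouple_temperature(tau_s: float) -> float:
    """Temperature from charge-relaxation time (treated as strain-independent)."""
    return T_REF_C + (tau_s - TAU_REF_S) / TAU_SLOPE

def decouple_strain(c: float, c0: float) -> float:
    """Strain from normalized capacitance (treated as temperature-independent)."""
    return (c / c0 - 1.0) / GAUGE_FACTOR

print(decouple_temperature(0.9e-3))   # shorter tau -> 30.0 °C
print(decouple_strain(1.1, 1.0))      # 10% capacitance rise -> 0.05 strain
```

Because each readout variable responds to only one stimulus, the two channels can be evaluated independently per pixel, which is what allows the array to map strain and temperature fields simultaneously.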
In summary, these studies, through diverse material systems, structural designs, and sensing mechanisms, have significantly enhanced the multimodal perception capabilities and integration level of tactile sensors. They can not only accurately detect physical quantities such as pressure, multidimensional force, and temperature, but also achieve advanced functions like proximity sensing, material identification, damage detection, and reproduction of complex tactile sensations. By leveraging machine learning or physical principles, they effectively decouple signals and fuse information. These advances provide crucial technological support for the next generation of intelligent robots, human–machine interfaces, and wearable devices, driving the evolution of machine touch in a more intelligent, biomimetic, and integrated direction.
4. System Integration with AI for Intelligent Applications
With synergistic advances in structural design and materials engineering, multidimensional force sensors can now not only distinguish normal and shear forces with high resolution, but also simultaneously detect external stimuli such as temperature and vibration through multimodal integration. This significantly enhances their adaptability and application potential in complex environments. Furthermore, the output signals from these sensors can be processed using artificial intelligence (AI) algorithms to support deeper data analysis for intelligent applications. Machine learning techniques include Support Vector Machines (SVM), Random Forests, and K-Nearest Neighbors (KNN), while deep learning encompasses Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), and variants such as Long Short-Term Memory (LSTM). A typical data processing workflow involves raw signal acquisition, data preprocessing, feature extraction, model training and validation, and finally, model deployment and inference. Different algorithms are suited to different data types and scenarios. For example, CNNs excel at extracting spatial features from high-dimensional tactile images for object recognition [137,138]. RNNs and LSTMs are well-suited for processing temporal sequences such as vibration or sliding signals, enabling state monitoring or texture recognition [139,140]. Traditional algorithms like SVM remain advantageous in cases with limited samples or where interpretability is essential [140,141]. By analyzing the multidimensional, multimodal data streams produced by these sensors, AI algorithms can uncover subtle patterns, correlations, and distinctive features within the data. This supports intelligent classification—such as material identification, defect detection, and gesture recognition—as well as predictive decision-making, including grasp force adjustment, anomaly warnings, and interaction intent recognition.
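The workflow above (signal acquisition, preprocessing, feature extraction, training, inference) can be made concrete with a toy end-to-end sketch. For portability it substitutes a nearest-centroid classifier for the SVM/KNN models mentioned, and the two-class "press" vs. "slide" force windows are synthetic, purely illustrative data.

```python
import math
from statistics import mean, stdev

def features(signal):
    """Feature extraction: per-window mean, spread, and peak amplitude."""
    return (mean(signal), stdev(signal), max(abs(s) for s in signal))

def train(windows, labels):
    """Training: average the feature vectors per class into one centroid."""
    sums, counts = {}, {}
    for w, y in zip(windows, labels):
        f = features(w)
        acc = sums.setdefault(y, [0.0] * len(f))
        for i, v in enumerate(f):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: tuple(v / counts[y] for v in acc) for y, acc in sums.items()}

def predict(model, window):
    """Inference: assign the label of the nearest centroid."""
    f = features(window)
    return min(model, key=lambda y: math.dist(model[y], f))

# Synthetic force windows: "press" (large amplitude) vs "slide" (small):
press = [[5.0, 5.2, 4.8, 5.1], [4.9, 5.3, 4.7, 5.0]]
slide = [[0.5, 0.6, 0.4, 0.5], [0.4, 0.5, 0.6, 0.5]]
model = train(press + slide, ["press"] * 2 + ["slide"] * 2)
print(predict(model, [5.1, 4.9, 5.2, 5.0]))   # -> press
```

A production pipeline would replace the hand-crafted features with learned ones (e.g., a CNN over tactile images) and add validation and deployment stages, but the acquire–preprocess–extract–train–infer structure is the same.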
However, it is important to acknowledge the limitations of current AI models in tactile sensing applications. Many deep learning approaches, particularly CNNs and RNNs, are data-hungry and require large amounts of labeled training data to achieve robust performance, which can be challenging and time-consuming to acquire in physical sensing environments. Furthermore, these models often face generalization issues: a model trained under specific conditions (e.g., particular sensor hardware, environmental settings, or user characteristics) may perform poorly when deployed in different contexts. Issues such as sensor drift, material variations, and changing contact conditions can further degrade model reliability. Additionally, many complex AI models, especially deep neural networks, suffer from limited interpretability, making it difficult to understand the underlying decision-making process—a significant drawback in safety-critical applications. There is also often a trade-off between model complexity and computational efficiency, which can hinder real-time deployment on embedded systems with limited resources. Despite these challenges, ongoing research in areas such as transfer learning, few-shot learning, and explainable AI is helping to mitigate these limitations. By addressing these issues, future AI-integrated tactile systems can become more efficient, adaptable, and trustworthy.
The rapid advancements in AI drive the development of more sophisticated intelligent applications. This section highlights representative applications of these AI-empowered multidimensional force sensors in intelligent robotics, human–machine systems, and smart industry and agriculture, demonstrating their critical role in enabling AI-driven precise manipulation, enhanced interactive experiences, and the advancement of automation capabilities.
4.1. Intelligent Robotic Manipulation and Cognition
As robotic operational capabilities in complex and dynamic environments continue to advance, the demand for integrated intelligent tactile systems in humanoid robots is growing rapidly. Multidimensional force sensors can accurately measure and distinguish between normal and shear forces, thereby enabling robotic systems to achieve multidimensional force perception comparable to that of human fingers. The primary applications of such multidimensional force sensing technologies in robotics include intelligent grasping [142,143], material property recognition [144,145], and omnidirectional environmental force perception [146,147].
In intelligent grasping applications, adaptive gripping and dynamic force control are achieved through the real-time analysis of spatial distributions of normal and shear forces [148,149]. The efficient decoupling capability of multidimensional force sensing plays a crucial role in practical implementation. As shown in Figure 9a, during the adaptive grasping task of an egg, this sensor achieved collaborative control against slipping and crushing through real-time monitoring of triaxial contact forces [150]. When external shear forces exceed the preset friction threshold, the gripper automatically increases clamping force. When the normal force Fz surpasses a critical value, the gripper actively releases pressure to prevent object damage. This closed-loop strategy based on force feedback resolves the conflict between sensitivity and robustness in traditional grasping, significantly enhancing robotic manipulation capabilities for flexible targets. In the coffee-making task, human operators generate six-dimensional force/torque signals through tactile demonstration, enabling the robot to learn complex operational sequences such as closing the gripper, pouring water, and stirring [58]. Signals from the teaching phase are converted into robotic motion trajectories, autonomously reproduced during the imitation phase, forming a closed loop of “hands-on” teaching and unsupervised execution. This design lowers the barrier to robot programming and provides a new paradigm for rapid task deployment in adaptive production lines.
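The closed-loop policy described for the egg-grasping task can be caricatured in a few lines: shear above the friction threshold triggers tightening to stop incipient slip, while normal force above a safety limit triggers release to avoid crushing. All constants below are invented for illustration and are not taken from [150]; a real controller would also filter the force signals and rate-limit the actuator.

```python
import math

# Illustrative thresholds (assumed, not from the cited work):
MU_THRESHOLD_N = 0.8   # shear magnitude that signals incipient slip, N
FZ_LIMIT_N = 4.0       # normal-force limit that signals crushing risk, N
STEP_N = 0.2           # grip-force adjustment per control cycle, N

def grip_update(grip_n, fx, fy, fz):
    """One control cycle: return the adjusted grip-force command."""
    shear = math.hypot(fx, fy)
    if fz > FZ_LIMIT_N:            # crushing risk: actively release
        return max(0.0, grip_n - STEP_N)
    if shear > MU_THRESHOLD_N:     # incipient slip: tighten
        return grip_n + STEP_N
    return grip_n                  # stable grasp: hold

print(grip_update(2.0, 0.9, 0.0, 3.0))   # slip detected -> 2.2
print(grip_update(2.0, 0.0, 0.0, 5.0))   # crush risk -> 1.8
```

The crush check is evaluated first so that the safety limit always overrides the slip response, mirroring the sensitivity-versus-robustness trade-off resolved by the feedback strategy.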
For object characterization, multidimensional force sensing technology has evolved from surface roughness detection toward quantitative elastic modulus evaluation [151,152]. Figure 9b shows the sensor array schematic of the 3D-architectured electronic skin (3DAE-Skin) system [8]. It is capable of simultaneously predicting the elastic modulus and principal curvature components of an object. When integrated with deep neural network (DNN) modeling, the system can rapidly identify key attributes such as fruit ripeness and bakery freshness, regardless of the target’s geometry, demonstrating a comprehensive assessment capability that closely approximates human tactile perception.
Figure 9.
Multidimensional force sensing-enabled enhanced perception and manipulation of intelligent robots. (a) Self-decoupling soft triaxial force tactile sensor enabling adaptive robotic grasping and coffee preparation [58,150]. (b) 3D-architectured electronic skin for simultaneous quantification of elastic modulus and principal curvature components of objects [8]. (c) Split-type magnetic soft tactile sensor with triaxial forces for fluid velocity measurement and navigation [106]. (d) Flexible biomimetic whisker sensor enabling multidimensional perception and autonomous obstacle avoidance [153]. (e) Multi-axis force feedback using fringing field effect capacitive sensors in robot-assisted surgery [154].
In the field of omnidirectional environmental force sensing, multidimensional force sensors enable real-time detection of spatial force variations in complex environments through the high-precision decoupling and recognition of normal and shear forces. Their integration into various flexible mechanical systems has facilitated broad applications in directional perception, obstacle detection, and dynamic fluid feedback. Figure 9c shows a magnetic soft tactile sensor inspired by the lateral line structure of a bionic fish, employing a “touch layer–buffer layer–transmission layer” architecture [106]. This sensor, when mounted on both the inner and outer surfaces of a hull, can measure the relative velocity and directional shifts of surrounding objects in the water, demonstrating significant potential for non-contact environmental navigation. Figure 9d presents a flexible bionic tactile sensor (FBTS) with a whisker-like signal amplifier and beam structure [153]. Deployed on the head of a bionic rat, spatial force direction is rapidly identified via multi-channel resistance decoupling, thereby enabling the system to autonomously navigate and avoid obstacles. Figure 9e shows a multi-axis capacitive tactile force sensor based on the fringing-field effect [154]. The sensor was integrated into the tip of a robotic surgical arm to provide tactile force feedback to the surgeon, effectively enhancing force resolution in complex surgical procedures.
4.2. Human–Machine Interaction and Wearable Health Monitoring
Compared to traditional sensors, multidimensional force sensors offer more comprehensive and realistic tactile information, demonstrating significant potential in complex human–computer interaction systems [108,155,156,157,158,159]. In basic augmented reality/virtual reality (AR/VR) interaction and orientation control, Figure 10a presents a triaxial iontronic sensor with a hierarchical interlocked structure [60]. This sensor can be integrated into a virtual platform, allowing users to control virtual characters or pointers through directional force inputs, demonstrating high real-time performance and directional responsiveness. Extending to motion capture and analysis, Figure 10b shows a biomimetic skin-muscle multi-axis force sensor [62]. This device enables real-time detection of the direction and intensity of applied wrist forces during basketball shooting, providing direct feedback to correct athletic posture. At the interface level, Figure 10c shows a multidimensional optical flexible sensor based on a U-shaped micro-nano fiber (MNF) waveguide [160]. The sensor enabled the construction of a low-power, high-precision virtual mouse interface, using shear forces for direction control and normal force for click triggering. These applications highlight the core advantages of multidimensional force sensors in enhancing interaction accuracy, enabling real-time feedback, and expanding the scope of multimodal sensing.
AI-driven digital contact tracing has become a critical component of pandemic healthcare, and wearable multidimensional haptic sensing can further enhance this AI integration. For instance, when robots perform throat- or nasal-swab sampling, multidimensional force feedback ensures safe and precise sampling while linking AI contact tracing with accurate biomechanical data [161,162]. Wearable multidimensional haptic sensing technology is increasingly becoming a critical enabler for precision healthcare and personalized medicine, owing to its ability to provide comprehensive biomechanical feedback [52,55,163,164,165]. Expanding into joint and prosthetic interface mechanics, Figure 10d presents a flexible magnetic self-decoupled triaxial e-skin employing an orthogonally magnetized pattern structure, which was used for sensing triaxial force distributions in artificial knee joints and for monitoring interaction forces at the brace-skin interface [58]. This system provides essential data for rehabilitation and prosthetic optimization. In the high-precision, multi-axis field of orthodontics, Figure 10e introduces a flexible six-axis force (x-, y-, z-axis force and x-, y-, z-axis torque) sensor inspired by a mortise-and-tenon joint structure [166]. Through its concave-convex interlocking design and integration with a deep neural network, it enables real-time monitoring of multiaxial orthodontic forces when embedded in dental braces, thereby assisting in treatment optimization. Ultimately, the integration of multidimensional force sensing enables deep pathological assessment. Figure 10f presents a bimodal tactile sensing system that fuses piezoelectric and piezoresistive modules to synergistically detect static and dynamic three-dimensional stimuli and identify material properties [80].
When integrated into a robotic manipulator, the system achieved real-time palpation and identification of pathological changes in porcine esophageal tissue with an accuracy of 98.44%, providing a reliable mechanistic basis for disease assessment.
Figure 10.
Multidimensional force perception of human–machine interaction and wearable healthcare. (a) A triaxial soft iontronic sensor mounted on the back of the subject’s hand, providing the input direction information for game control [60]. (b) Use of the multidimensional sensor for free-throw guidance [62]. (c) Triaxial force-sensing human–machine interface for real-time multifunctional cursor control [160]. (d) Knee force distribution measurement and robot teaching using the triaxial force tactile sensor [58]. (e) Flow chart of the application of the flexible six-axis force sensor in orthodontics, compared with a pressure sensor [166]. (f) Medical application of clinical feature identification by the bimodal haptic sensor and the interactive interface [80].
4.3. Agricultural and Industrial Inspection Automation
Multidimensional force sensors provide robots with rich force signals that help manipulators achieve stable, safe grasping, and they have broad application value in industry and agriculture, with typical scenarios including industrial inspection and agricultural picking.
In agricultural automation, the maturity and damage degree of crops such as vegetables and fruits are reflected in tactile cues such as hardness and surface roughness, so effective multidimensional feedback is particularly important for crop recognition. Moreover, fragile crops such as raspberries and strawberries require softer, more sensitive contact and force feedback to assist grasping [167,168,169]. Figure 11a shows a typical application in this scenario [80]. The soft-texture dual-modal tactile sensor accurately differentiates mature from rotten white strawberries through the synergy of its piezoelectric and piezoresistive layers, combined with machine learning trained on the multidimensional signals collected during picking. As shown in Figure 11b, a robot arm equipped with the developed sensor can detect the surface texture and hardness of fruits in real time [82]. During gripping, it monitors slipping and falling through multidimensional signals and dynamically adjusts the gripping force, significantly reducing the damage rate of raspberries during transportation and handling.
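The machine-learning step can be pictured with a toy classifier: tactile features extracted during picking (here a stiffness value and a roughness proxy) separate ripe, unripe, and rotten fruit. A nearest-centroid rule stands in for the trained model of [80]; all feature values below are synthetic and purely illustrative.

```python
import numpy as np

# Toy sketch: classify fruit state from two synthetic tactile features
# (stiffness, roughness proxy). Cluster centers are hypothetical.
rng = np.random.default_rng(1)

def make_samples(center, n=30):
    return center + 0.05 * rng.standard_normal((n, 2))

ripe   = make_samples(np.array([0.2, 0.3]))   # soft, smooth
unripe = make_samples(np.array([0.8, 0.4]))   # hard
rotten = make_samples(np.array([0.1, 0.8]))   # soft, rough

X = np.vstack([ripe, unripe, rotten])
y = np.array([0] * 30 + [1] * 30 + [2] * 30)
labels = ["ripe", "unripe", "rotten"]

# Nearest-centroid classifier as a stand-in for the trained model
centroids = np.array([X[y == c].mean(axis=0) for c in range(3)])

def classify(feature):
    return labels[int(np.argmin(np.linalg.norm(centroids - feature, axis=1)))]

print(classify(np.array([0.15, 0.75])))  # → rotten
```

In practice the feature vector would carry many more channels (triaxial force traces, vibration spectra), and a stronger model (SVM, KNN, or an LSTM over time series, as surveyed earlier) would replace the centroid rule.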
In the industrial field, normal pressure sensors already have mature application schemes [170,171,172,173,174]. Multidimensional force sensors, which expand force perception to three or even six dimensions, can effectively help manipulators complete demanding actions such as grasping-posture recognition and component alignment, providing data support for more general industrial applications. In typical scenarios such as the garbage classification shown in Figure 11c, a quadruple tactile sensor can distinguish 13 different material states and identify 7 common types of garbage, including recyclable and non-recyclable items [175,176]. Equipping such systems with multidimensional force sensors, optimizing the spatial distribution perception of force signals, and identifying the specific posture of garbage can reduce the interference of complex conditions such as tilted garbage stacking on recognition, further improving adaptability and stability in actual garbage classification. Figure 11d shows that in the automated operation of smart factories, a manipulator integrating multidimensional force sensors can identify the external local attributes, internal attributes, and global attributes of objects through differences in contact force signals, greatly improving the accuracy of object recognition and the efficiency of operation [177]; such operation relies heavily on perception capability and thereby enhances the reliability and efficiency of smart factories. A specific example is shown in Figure 11e, where the direction of fingertip shear force is fed back in real time through a magnetic flexible shear-force sensor to identify the movement direction of the fingertip, which can be applied to touchscreen operations and the fine handling of objects in industrial scenarios [178].
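The slip-monitoring and grip-adjustment loop mentioned for fruit handling and industrial grasping can be sketched as a simple threshold rule: a burst of high-frequency fluctuation in the shear channel signals incipient slip, and the commanded normal force is ramped up in response. The function name, threshold, and gain below are hypothetical placeholders, not the controller used in [82].

```python
# Minimal sketch of slip-triggered grip-force adjustment (all values hypothetical).
def adjust_grip(normal_force, shear_samples, slip_threshold=0.05, gain=1.2, f_max=5.0):
    """Increase grip force when shear-signal fluctuation indicates incipient slip."""
    mean = sum(shear_samples) / len(shear_samples)
    fluctuation = max(abs(s - mean) for s in shear_samples)
    if fluctuation > slip_threshold:           # slip detected: tighten, capped at f_max
        return min(normal_force * gain, f_max)
    return normal_force                        # stable grasp: keep force low

print(adjust_grip(1.0, [0.30, 0.31, 0.29, 0.30]))  # stable shear → 1.0
print(adjust_grip(1.0, [0.30, 0.45, 0.20, 0.38]))  # fluctuating shear → 1.2
```

Keeping the baseline force low and escalating only on slip is what lets a gripper handle fragile objects such as raspberries without crushing them; real controllers typically filter the shear signal and add hysteresis to avoid oscillation.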
Furthermore, in industrial production line scenarios, multidimensional force devices assist robots in completing precise alignment, fitting, and force-controlled assembly tasks of components [179,180,181], demonstrating the broad potential of multidimensional force devices and systems in industrial fine assembly scenarios.
Figure 11.
Multidimensional force sensing in multi-scenario applications of agricultural and industrial automation detection. (a) Soft-texture dual-modal tactile sensor and its application in picking white strawberries [80]. (b) Biomimetic electronic skin with layered patterns and force feedback during transporting raspberries [82]. (c) The role of quadruple tactile sensors in garbage classification and recognition [175,176]. (d) The auxiliary role of multidimensional force tactile sensors in the recognition of object-related attributes by robotic hands [177]. (e) Application of a magnetic flexible shear-force sensor in touchscreen operation [178].
5. Summary and Outlook
This review systematically explores recent advancements in multidimensional force tactile sensors—an active research area with strong scientific and engineering relevance for cutting-edge applications such as intelligent robotics, human–machine interaction, and precision manipulation. It examines the core structural strategies of five distinct sensor designs, analyzes their unique advantages in improving decoupling accuracy, sensitivity, and robustness, and reviews how multimodal fusion technologies extend sensory perception beyond force alone to enable richer environmental understanding. The application potential of these sensors is also discussed across key domains including dexterous robotic manipulation, wearable health monitoring, human–machine interaction, and precision industrial and agricultural automation. Despite these advances, current artificial multidimensional force sensing systems still lag significantly behind biological skin in terms of sensing dimensions, spatial resolution, information processing efficiency, and closed-loop intelligence. Achieving next-generation tactile sensing—with bioinspired or even skin-surpassing capabilities—presents ongoing challenges and opportunities. Future developments will likely focus on several key directions to enable a comprehensive leap from fundamental sensing to integrated system intelligence.
As shown in Figure 12, first and foremost lies deep multimodal fusion, entailing the expansion and synergization of sensing dimensions. Current multidimensional force sensors demonstrate strong capabilities in precisely resolving triaxial forces, particularly in parsing the magnitude and direction of spatial force vectors. However, the technology for accurate, robust, and low-coupling resolution and description of complete six-axis forces remains immature. Simultaneously, the inherent manufacturing complexity of multidimensional force sensors and the difficulty of force decoupling persist. Even among systems claiming multi-axis capability, many exhibit significant cross-axis interference or practical accuracy limitations [182]. Some studies achieve high-precision triaxial force resolution, while others possess only basic triaxial measurement capabilities, lacking robust decoupling mechanisms or high signal-to-noise ratios [183]. These variations typically stem from manufacturing inconsistencies, material nonlinearities, complex microstructure designs, and limitations in calibration methods—all contributing to unreliable performance in practical applications. Future efforts must not only pursue higher-level integration of sensing modes but also address fundamental issues such as manufacturability and decoupling reliability. This represents the core focus and challenge of current research. Primary future directions include consolidating and enhancing the accuracy, robustness, and decoupling performance of six-axis force perception. Building on this, research should go beyond singular mechanical perception and achieve deep multimodal fusion. While researchers have proposed design strategies integrating temperature, strain, and texture sensing [76,184,185,186,187,188], seamless integration of additional modalities—such as humidity, vibration, slip detection, and even chemical composition perception—remains necessary.
Layering this fusion onto reliable, high-precision six-axis force perception (as the core), akin to human skin—which integrates multiple sensations—will provide richer, more realistic interaction information. To achieve this, future research can focus on resolving: compatibility of heterogeneous sensing materials, signal decoupling algorithms under multiphysics coupling, spatial layout optimization under high integration, and low-power, high-reliability packaging technologies.
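Cross-axis interference of the kind discussed above is commonly reported as the off-axis response relative to the on-axis response, read off a measured calibration matrix. Below is a minimal sketch assuming a 6x6 calibration matrix C (ideally the identity after decoupling); the matrix entries are hypothetical.

```python
import numpy as np

# Hypothetical 6x6 calibration matrix of a six-axis sensor: rows are output
# channels (Fx, Fy, Fz, Mx, My, Mz), columns are applied unit loads.
C = np.eye(6)
C[0, 1] = 0.02   # 2% of an applied Fy leaks into the Fx channel
C[3, 2] = 0.05   # 5% of an applied Fz leaks into the Mx channel

def crosstalk_percent(C):
    """Per-channel sum of off-axis responses, as % of the on-axis response."""
    C = np.asarray(C, dtype=float)
    off_axis = np.abs(C - np.diag(np.diag(C)))
    return 100.0 * off_axis.sum(axis=1) / np.abs(np.diag(C))

print(crosstalk_percent(C))  # [2. 0. 0. 5. 0. 0.]
```

A metric like this makes the "significant cross-axis interference" of [182] comparable across designs: a well-decoupled six-axis sensor keeps every entry of this vector in the low single-digit percent range across its full load span, not just at the calibration points.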
Next is high-density spatial integration, which enables an upgrade in sensing resolution. Emulating the high-density distribution of receptors in human skin involves not merely increasing the number of sensing units per unit area but also optimizing array structural design, achieving breakthroughs in ultra-flexible/stretchable substrates and interconnection technologies, enhancing micro/nano-fabrication precision, and developing efficient, low-cost manufacturing processes [82,189,190,191,192,193,194]. The core challenges lie in overcoming signal crosstalk under high-density configurations, reducing the power consumption of large-scale arrays, improving data readout efficiency, and ensuring mechanical stability and electrical reliability under large deformations.
Then comes in-sensor computing, which enables a revolutionary leap in information processing efficiency. Faced with the massive data generated by high-density, multimodal sensors, the traditional “sensing-transmission-centralized processing” model encounters bottlenecks in bandwidth, latency, and power consumption. The future breakthrough lies in the exploration and implementation of in-sensor computing architectures. The core idea is to preprocess raw data, extract features, and even make preliminary decisions at or near the sensor, significantly reducing redundant data transmission and lowering system latency and energy consumption. This requires the deep integration of novel neuromorphic electronic devices, analog computing circuits, in-memory computing, and advanced edge AI algorithms with sensing hardware [195,196,197,198,199]. Edge AI algorithms bring AI data processing from distant cloud servers down to the device itself or to local computing units near the device; their core advantages are efficiency, low latency, and privacy protection.
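The near-sensor preprocessing idea can be illustrated with a toy front end that condenses a raw sample window into a handful of features before anything is transmitted, so only the features (and an event flag) leave the sensor node. The feature set and contact threshold below are hypothetical.

```python
# Sketch of near-sensor preprocessing: compress a raw sample window into a few
# features before transmission (a stand-in for an edge-AI front end; values hypothetical).
def extract_features(window):
    n = len(window)
    mean = sum(window) / n
    peak = max(window)
    energy = sum(x * x for x in window) / n
    contact = peak > 0.5                  # simple contact-event flag
    return {"mean": mean, "peak": peak, "energy": energy, "contact": contact}

raw = [0.1, 0.2, 0.9, 0.8, 0.3, 0.1, 0.0, 0.1]   # one window of raw samples
feats = extract_features(raw)
print(feats["contact"])  # True

# Only 4 values leave the sensor instead of 8 raw samples; with realistic
# window sizes (hundreds of samples per taxel) the reduction is far larger.
```

The same principle extends upward: an on-node classifier can emit only a decision ("slip", "contact", "texture class") rather than features, which is where neuromorphic and in-memory computing hardware becomes attractive.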
The ultimate goal of system intelligence is embodied intelligence: providing an indispensable physical-interaction perception loop for embodied agents is the long-term vision of multidimensional force sensing technology. Embodied intelligence emphasizes that agents learn, adapt, and evolve through sustained environmental interaction. Within this framework, multidimensional force sensors must be deeply integrated into the agent’s perception-decision-action loop, serving not merely as passive information collectors but as the sensory foundation for agents to comprehend physical laws, master manipulation skills, achieve safe human–robot collaboration, and proactively explore environments. Realizing this vision demands the convergence of multidisciplinary expertise across robotics, neuroscience, materials science, and AI.
Author Contributions
Conceptualization: J.C., M.X., P.C. and Q.S.; methodology: J.C., M.X., P.C. and Q.S.; validation: J.C., M.X., P.C., B.C. and H.C.; formal analysis: B.C., H.C. and X.X.; investigation: J.C., M.X., P.C., B.C. and H.C.; resources: J.C., M.X., P.C., B.C. and H.C.; data curation: J.C., M.X., P.C., B.C. and H.C.; writing—original draft preparation: J.C., M.X., P.C., B.C. and H.C.; writing—review and editing: J.C., M.X., P.C., B.C. and H.C.; visualization: J.C., M.X., P.C., B.C. and H.C.; supervision: X.X., J.W. and Q.S.; project administration: X.X., J.W. and Q.S.; funding acquisition: J.W. and Q.S. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the National Natural Science Foundation of China (62301150 and 62075040); the National Key R&D Program of China (2022YFB3603403 and 2021YFB3600502); the Southeast University Interdisciplinary Research Program for Young Scholars (2024FGC1007); the Fundamental Research Funds for the Central Universities (2242025F10007); and the Start-up Research Fund of Southeast University (RF1028623164).
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
No new data were created or analyzed in this study. Data sharing is not applicable to this article.
Conflicts of Interest
The authors declare no conflicts of interest.
Abbreviations
The following abbreviations are used in this manuscript:
3D | Three-Dimensional |
CNTs/PDMS | Carbon Nanotube/Polydimethylsiloxane |
RSHTS | Rigid-Flexible Hybrid Piezoelectric Tactile Sensor |
PVDF | Polyvinylidene Fluoride |
ZnO | Zinc Oxide |
TFTs | Thin-Film Transistors |
PET | Polyethylene Terephthalate |
H-E | Hemispherical Ellipsoidal |
SA | Slow-Adapting |
FA | Fast-Adapting |
TC-MWTS | Multidimensional Wireless Tactile Sensor |
TNFSL | Triboelectric Normal Force Sensing Layer |
CSFSL | Capacitive Shear Forces Sensing Layer |
MSEL | Middle Soft Elastic Layer |
CD-TENG | Contact-Discharge Triboelectric Nanogenerator |
ZOGW | MXene-Embedded ZnO Nanowire Arrays |
AMSPP | Aligned Segmental Polyimide/Polyurethane Conductive Film |
MEMS | Micro-Electromechanical Systems |
Si-NM | Silicon Nanomembrane |
EGaIn | Eutectic Gallium-Indium |
AiFoam | Artificial Foam |
PCB | Printed Circuit Board |
LED | Light-Emitting Diode |
OPD | Organic Photodiode |
RGB LEDs | Red Green Blue Light-Emitting Diodes |
CMOS | Complementary Metal-Oxide-Semiconductor |
NIR | Near-Infrared |
MIR | Mid-Infrared |
MXene/LNF/PS | MXene/Lotus Nanofiber/Polystyrene Microsphere |
TENG | Triboelectric Nanogenerator |
FTS | Finger-shaped Tactile Sensor |
PLA | Polylactide |
IEm-skin | Ionic Electronic Skin |
AI | Artificial Intelligence |
SVM | Support Vector Machines |
KNN | K-Nearest Neighbors |
LSTM | Long Short-Term Memory |
3DAE-Skin | 3D-Architectured Electronic Skin |
FBTS | Flexible Bionic Tactile Sensor |
AR/VR | Augmented Reality/Virtual Reality |
MNF | Micro-Nano Fiber |
References
- Hong, J.; Xiao, Y.; Chen, Y.; Duan, S.; Xiang, S.; Wei, X.; Zhang, H.; Liu, L.; Xia, J.; Lei, W.; et al. Body-Coupled-Driven Object-Oriented Natural Interactive Interface. Adv. Mater. 2025, 07067. [Google Scholar] [CrossRef]
- Ratschat, A.L.; van Rooij, B.M.; Luijten, J.; Marchal-Crespo, L. Evaluating tactile feedback in addition to kinesthetic feedback for haptic shape rendering: A pilot study. Front. Robot. AI 2024, 11, 1298537. [Google Scholar] [CrossRef]
- Wei, Y.; Marshall, A.G.; McGlone, F.P.; Makdani, A.; Zhu, Y.; Yan, L.; Ren, L.; Wei, G. Human tactile sensing and sensorimotor mechanism: From afferent tactile signals to efferent motor control. Nat. Commun. 2024, 15, 6857. [Google Scholar] [CrossRef] [PubMed]
- Zhang, N.; Ren, J.; Dong, Y.; Yang, X.; Bian, R.; Li, J.; Gu, G.; Zhu, X. Soft robotic hand with tactile palm-finger coordination. Nat. Commun. 2025, 16, 2395. [Google Scholar] [CrossRef] [PubMed]
- Iskandar, M.; Albu-Schäffer, A.; Dietrich, A. Intrinsic sense of touch for intuitive physical human-robot interaction. Sci. Robot. 2024, 9, 4008. [Google Scholar] [CrossRef] [PubMed]
- Xu, J.; Pan, J.; Cui, T.; Zhang, S.; Yang, Y.; Ren, T.L. Recent progress of tactile and force sensors for human–machine interaction. Sensors 2023, 23, 1868. [Google Scholar] [CrossRef]
- Flavin, M.T.; Ha, K.H.; Guo, Z.; Li, S.; Kim, J.T.; Saxena, T.; Simatos, D.; Al-Najjar, F.; Mao, Y.; Bandapalli, S.; et al. Bioelastic state recovery for haptic sensory substitution. Nature 2024, 635, 345–352. [Google Scholar] [CrossRef]
- Liu, Z.; Hu, X.; Bo, R.; Yang, Y.; Cheng, X.; Pang, W.; Liu, Q.; Wang, Y.; Wang, S.; Xu, S.; et al. A three-dimensionally architected electronic skin mimicking human mechanosensation. Science 2024, 384, 987–994. [Google Scholar] [CrossRef]
- Shi, J.; Dai, Y.; Cheng, Y.; Xie, S.; Li, G.; Liu, Y.; Wang, J.; Zhang, R.; Bai, N.; Cai, M.; et al. Embedment of sensing elements for robust, highly sensitive, and cross-talk–free iontronic skins for robotics applications. Sci. Adv. 2023, 9, 8831. [Google Scholar] [CrossRef]
- Yu, Y.; Li, J.; Solomon, S.A.; Min, J.; Tu, J.; Guo, W.; Xu, C.; Song, Y.; Gao, W. All-printed soft human-machine interface for robotic physicochemical sensing. Sci. Robot. 2022, 7, 0495. [Google Scholar] [CrossRef]
- Zhu, P.; Du, H.; Hou, X.; Lu, P.; Wang, L.; Huang, J.; Bai, N.; Wu, Z.; Fang, N.X.; Guo, C.F. Skin-electrode iontronic interface for mechanosensing. Nat. Commun. 2021, 12, 4731. [Google Scholar] [CrossRef]
- Luo, Y.; Abidian, M.R.; Ahn, J.H.; Akinwande, D.; Andrews, A.M.; Antonietti, M.; Bao, Z.; Berggren, M.; Berkey, C.A.; Bettinger, C.J.; et al. Technology roadmap for flexible sensors. ACS Nano 2023, 17, 5211–5295. [Google Scholar] [CrossRef]
- Wang, W.; Jiang, Y.; Zhong, D.; Zhang, Z.; Choudhury, S.; Lai, J.C.; Gong, H.; Niu, S.; Yan, X.; Zheng, Y.; et al. Neuromorphic sensorimotor loop embodied by monolithically integrated, low-voltage, soft e-skin. Science 2023, 380, 735–742. [Google Scholar] [CrossRef]
- Zhang, H.; Hong, J.; Zhu, J.; Duan, S.; Xia, M.; Chen, J.; Sun, B.; Xi, M.; Gao, F.; Xiao, Y.; et al. Humanoid electronic-skin technology for the era of Artificial Intelligence of Things. Matter 2025, 8, 45. [Google Scholar] [CrossRef]
- Wei, X.; Xiang, S.; Meng, C.; Chen, Z.; Cao, S.; Hong, J.; Duan, S.; Liu, L.; Zhang, H.; Shi, Q.; et al. Sensory Fiber-Based Electronic Device as Intelligent and Natural User Interface. Adv. Fiber Mater. 2025, 7, 827–840. [Google Scholar] [CrossRef]
- Libanori, A.; Chen, G.; Zhao, X.; Zhou, Y.; Chen, J. Smart textiles for personalized healthcare. Nat. Electron. 2022, 5, 142–156. [Google Scholar] [CrossRef]
- Islam, M.R.; Afroj, S.; Yin, J.; Novoselov, K.S.; Chen, J.; Karim, N. Advances in printed electronic textiles. Adv. Sci. 2024, 11, 2304140. [Google Scholar] [CrossRef]
- Liu, G.; Fan, B.; Qi, Y.; Han, K.; Cao, J.; Fu, X.; Wang, Z.; Bu, T.; Zeng, J.; Dong, S.; et al. Ultrahigh-current-density tribovoltaic nanogenerators based on hydrogen bond-activated flexible organic semiconductor textiles. ACS Nano 2025, 19, 6771–6783. [Google Scholar] [CrossRef] [PubMed]
- Jiao, H.; Lin, X.; Xiong, Y.; Han, J.; Liu, Y.; Yang, J.; Wu, S.; Jiang, T.; Wang, Z.L.; Sun, Q. Thermal insulating textile based triboelectric nanogenerator for outdoor wearable sensing and interaction. Nano Energy 2024, 120, 109134. [Google Scholar] [CrossRef]
- Ma, J.; Wen, B.; Zhang, Y.; Mao, R.; Wu, Q.; Diao, D.; Xu, K.; Zhang, X. Ultra-broad-range pressure sensing enabled by synchronous-compression mechanism based on microvilli-microstructures sensor. Adv. Funct. Mater. 2025, 35, 2425774. [Google Scholar] [CrossRef]
- Pu, J.; Zhang, Y.; Ning, H.; Tian, Y.; Xiang, C.; Zhao, H.; Liu, Y.; Lee, A.; Gong, X.; Hu, N.; et al. Dual-dielectric-layer-based iontronic pressure sensor coupling ultrahigh sensitivity and wide-Range detection for temperature/pressure dual-mode sensing. Adv. Mater. 2025, 03926. [Google Scholar] [CrossRef]
- Tian, L.; Gao, F.L.; Li, Y.X.; Yang, Z.Y.; Xu, X.; Yu, Z.Z.; Shang, J.; Li, R.W.; Li, X. High-performance bimodal temperature/pressure tactile sensor based on lamellar CNT/MXene/Cellulose nanofibers aerogel with enhanced multifunctionality. Adv. Funct. Mater. 2024, 35, 2418988. [Google Scholar] [CrossRef]
- Yang, D.; Zhao, K.; Yang, R.; Zhou, S.W.; Chen, M.; Tian, H.; Qu, D.H. A rational design of bio-derived disulfide CANs for wearable capacitive pressure sensor. Adv. Mater. 2024, 36, 2403880. [Google Scholar] [CrossRef] [PubMed]
- Niu, H.; Li, H.; Li, N.; Niu, H.; Gao, S.; Yue, W.; Li, Y. Morphological-engineering-based capacitive tactile sensors. Appl. Phys. Rev. 2025, 12, 011319. [Google Scholar] [CrossRef]
- Hu, Y.; Li, P.; Lai, G.; Lu, B.; Wang, H.; Cheng, H.; Wu, M.; Liu, F.; Dang, Z.M.; Qu, L. Separator with high ionic conductivity enables electrochemical capacitors to line-filter at high power. Nat. Commun. 2025, 16, 2772. [Google Scholar] [CrossRef]
- Berman, A.; Hsiao, K.; Root, S.E.; Choi, H.; Ilyn, D.; Xu, C.; Stein, E.; Cutkosky, M.; DeSimone, J.M.; Bao, Z. Additively manufactured micro-lattice dielectrics for multiaxial capacitive sensors. Sci. Adv. 2024, 10, 11. [Google Scholar] [CrossRef]
- Li, Y.; Zhang, S.; Gu, H.; Li, Y.; Zhao, H.; Huang, S.; Feng, X.; Zhai, C.; Xu, M. Ultrasensitive piezoelectric-like film with designed cross-scale pores. Sci. Adv. 2025, 11, 9. [Google Scholar] [CrossRef]
- Yin, H.; Li, Y.; Tian, Z.; Li, Q.; Jiang, C. Ultra-high sensitivity anisotropic piezoelectric sensors for structural health monitoring and robotic perception. Nano-Micro Lett. 2024, 17, 42. [Google Scholar] [CrossRef] [PubMed]
- Wu, Y.; Tang, C.Y.; Wang, S.; Guo, J.; Jing, Q.; Liu, J.; Ke, K.; Wang, Y.; Yang, W. Biomimetic heteromodulus all-fluoropolymer piezoelectric nanofiber mats for highly sensitive acoustic detection. ACS Appl. Mater. Interfaces 2025, 17, 21808–21818. [Google Scholar] [CrossRef]
- Shi, Q.; Sun, Z.; Zhang, Z.; Lee, C. Triboelectric nanogenerators and hybridized systems for enabling next-generation IoT applications. Research 2021, 2021, 6849171. [Google Scholar] [CrossRef]
- Hong, J.; Wei, X.; Zhang, H.; Xiao, Y.; Meng, C.; Chen, Y.; Li, J.; Li, L.; Lee, S.; Shi, Q.; et al. Advances of triboelectric and piezoelectric nanogenerators toward continuous monitoring and multimodal applications in the new era. Int. J. Extrem. Manuf. 2025, 7, 012007. [Google Scholar] [CrossRef]
- Yao, C.; Liu, S.; Liu, Z.; Huang, S.; Sun, T.; He, M.; Xiao, G.; Ouyang, H.; Tao, Y.; Qiao, Y.; et al. Deep learning-enhanced anti-noise triboelectric acoustic sensor for human-machine collaboration in noisy environments. Nat. Commun. 2025, 16, 4276. [Google Scholar] [CrossRef]
- Lin, W.; Xu, Y.; Yu, S.; Wang, H.; Huang, Z.; Cao, Z.; Wei, C.; Chen, Z.; Zhang, Z.; Zhao, Z.; et al. Highly programmable haptic decoding and self-adaptive spatiotemporal feedback toward embodied intelligence. Adv. Funct. Mater. 2025, 35, 2500633. [Google Scholar] [CrossRef]
- Qiao, H.; Sun, S.; Wu, P. Non-equilibrium-Growing Aesthetic Ionic Skin for Fingertip-Like Strain-Undisturbed Tactile Sensation and Texture Recognition. Adv. Mater. 2023, 35, 2300593. [Google Scholar] [CrossRef] [PubMed]
- Rodriguez, A. The unstable queen: Uncertainty, mechanics, and tactile feedback. Sci. Robot. 2021, 6, 4667. [Google Scholar] [CrossRef] [PubMed]
- Wang, S.; Fan, X.; Zhang, Z.; Su, Z.; Ding, Y.; Yang, H.; Zhang, X.; Wang, J.; Zhang, J.; Hu, P. A skin-inspired high-performance tactile sensor for accurate recognition of object softness. ACS Nano 2024, 18, 17175–17184. [Google Scholar] [CrossRef] [PubMed]
- Guo, C.; Chen, X.; Zeng, Z.; Guo, Z.; Li, Y. Grasp like humans: Learning generalizable multi-fingered grasping from human proprioceptive sensorimotor integration. IEEE Trans. Robot. 2025, 36, 1–12. [Google Scholar] [CrossRef]
- Zhao, Z.; Li, W.; Li, Y.; Liu, T.; Li, B.; Wang, M.; Du, K.; Liu, H.; Zhu, Y.; Wang, Q.; et al. Embedding high-resolution touch across robotic hands enables adaptive human-like grasping. Nat. Mach. Intell. 2025, 9, 889–900. [Google Scholar] [CrossRef]
- Duan, S.; Shi, Q.; Wu, J. Multimodal sensors and ML-based data fusion for advanced robots. Adv. Intell. Syst. 2022, 4, 2200213. [Google Scholar] [CrossRef]
- Shi, Q.; Sun, Z.; Le, X.; Xie, J.; Lee, C. Soft robotic perception system with ultrasonic auto-positioning and multimodal sensory intelligence. ACS Nano 2023, 17, 4985–4998. [Google Scholar] [CrossRef]
- Tang, Y.; Li, G.; Zhang, T.; Ren, H.; Yang, X.; Yang, L.; Guo, D.; Shen, Y. Digital channel–enabled distributed force decoding via small datasets for hand-centric interactions. Sci. Adv. 2025, 11, 2641. [Google Scholar] [CrossRef] [PubMed]
- Xu, Q.; Yang, Z.; Wang, Z.; Wang, R.; Zhang, B.; Cheung, Y.; Jiao, R.; Shi, F.; Hong, W.; Yu, H. Sandwich miura-ori enabled large area, super resolution tactile skin for human-machine interactions. Adv. Sci. 2025, 12, 2414580. [Google Scholar] [CrossRef] [PubMed]
- Xu, G.; Wang, H.; Zhao, G.; Fu, J.; Yao, K.; Jia, S.; Shi, R.; Huang, X.; Wu, P.; Li, J.; et al. Self- powered electrotactile textile haptic glove for enhanced human- machine interface. Sci. Adv. 2025, 11, 0318. [Google Scholar] [CrossRef]
- Luo, Y.; Li, Y.; Sharma, P.; Shou, W.; Wu, K.; Foshey, M.; Li, B.; Palacios, T.; Torralba, A.; Matusik, W. Learning human–environment interactions using conformal tactile textiles. Nat. Electron. 2021, 4, 193–201. [Google Scholar] [CrossRef]
- Jung, Y.H.; Yoo, J.Y.; Vázquez-Guardado, A.; Kim, J.H.; Kim, J.T.; Luan, H.; Park, M.; Lim, J.; Shin, H.S.; Su, C.J.; et al. A wireless haptic interface for programmable patterns of touch across large areas of the skin. Nat. Electron. 2022, 5, 374–385. [Google Scholar] [CrossRef]
- Lai, Q.T.; Zhao, X.H.; Sun, Q.J.; Tang, Z.; Tang, X.G.; Roy, V.A. Emerging MXene-based flexible tactile sensors for health monitoring and haptic perception. Small 2023, 19, 2300283. [Google Scholar] [CrossRef]
- Duan, S.; Zhang, H.; Liu, L.; Lin, Y.; Zhao, F.; Chen, P.; Cao, S.; Zhou, K.; Gao, C.; Liu, Z.; et al. A comprehensive review on triboelectric sensors and AI-integrated systems. Mater. Today 2024, 80, 450–480. [Google Scholar] [CrossRef]
- Pyo, S.; Lee, J.; Bae, K.; Sim, S.; Kim, J. Recent progress in flexible tactile sensors for human-interactive systems: From sensors to advanced applications. Adv. Mater. 2021, 33, 2005902. [Google Scholar] [CrossRef]
- Wu, G.; Li, X.; Bao, R.; Pan, C. Innovations in tactile sensing: Microstructural designs for superior flexible sensor performance. Adv. Funct. Mater. 2024, 34, 2405722. [Google Scholar] [CrossRef]
- Gerald, A.; Russo, S. Soft sensing and haptics for medical procedures. Nat. Rev. Mater. 2024, 9, 86–88. [Google Scholar] [CrossRef]
- Nagi, S.S.; Marshall, A.G.; Makdani, A.; Jarocka, E.; Liljencrantz, J.; Ridderström, M.; Shaikh, S.; O’Neill, F.; Saade, D.; Donkervoort, S.; et al. An ultrafast system for signaling mechanical pain in human skin. Sci. Adv. 2019, 5, 1297. [Google Scholar] [CrossRef] [PubMed]
- Peng, Y.; Yang, N.; Xu, Q.; Dai, Y.; Wang, Z. Recent advances in flexible tactile sensors for intelligent systems. Sensors 2021, 21, 5392. [Google Scholar] [CrossRef]
- Han, C.; Cao, Z.; Hu, Y.; Zhang, Z.; Li, C.; Wang, Z.L.; Wu, Z. Flexible tactile sensors for 3D force detection. Nano Lett. 2024, 24, 277–283. [Google Scholar] [CrossRef]
- Wei, C.; Yu, S.; Meng, Y.; Xu, Y.; Hu, Y.; Cao, Z.; Huang, Z.; Liu, L.; Luo, Y.; Chen, H.; et al. Octopus Tentacle-Inspired In-Sensor Adaptive Integral for Edge-Intelligent Touch Intention Recognition. Adv. Mater. 2025, 28, 2420501. [Google Scholar] [CrossRef]
- Sun, X.; Sun, J.; Li, T.; Zheng, S.; Wang, C.; Tan, W.; Zhang, J.; Liu, C.; Ma, T.; Qi, Z.; et al. Flexible tactile electronic skin sensor with 3D force detection based on porous CNTs/PDMS nanocomposites. Nano-Micro Lett. 2019, 11, 57. [Google Scholar] [CrossRef]
- Mao, Q.; Liao, Z.; Liu, S.; Yuan, J.; Zhu, R. An ultralight, tiny, flexible six-axis force/torque sensor enables dexterous fingertip manipulations. Nat. Commun. 2025, 16, 5693. [Google Scholar] [CrossRef]
- Huang, Z.; Yu, S.; Xu, Y.; Cao, Z.; Zhang, J.; Guo, Z.; Wu, T.; Liao, Q.; Zheng, Y.; Chen, Z.; et al. In-Sensor Tactile Fusion and Logic for Accurate Intention Recognition. Adv. Mater. 2024, 36, 2407329. [Google Scholar] [CrossRef]
- Yan, Y.; Zermane, A.; Pan, J.; Kheddar, A. A soft skin with self-decoupled three-axis force-sensing taxels. Nat. Mach. Intell. 2024, 6, 1284–1295. [Google Scholar] [CrossRef]
- Zhu, Y.; Hao, M.; Zhu, X.; Bateux, Q.; Wong, A.; Dollar, A.M. Forces for free: Vision-based contact force estimation with a compliant hand. Sci. Robot. 2025, 10, 11. [Google Scholar] [CrossRef] [PubMed]
- Shen, Z.; Ren, J.; Zhang, N.; Li, J.; Gu, G. Hierarchically-interlocked, three-axis soft iontronic sensor for omnidirectional shear and normal forces. Adv. Mater. Technol. 2025, 10, 2401626. [Google Scholar] [CrossRef]
- Yang, Y.; Zhou, X.; Yang, R.; Zhang, Q.; Sun, J.; Li, G.; Zhao, Y.; Liu, Z. Real-time 3-D force measurements using vision-based flexible force sensor. IEEE Trans. Instrum. Meas. 2023, 73, 1–10. [Google Scholar] [CrossRef]
- Lei, P.; Bao, Y.; Gao, L.; Zhang, W.; Zhu, X.; Liu, C.; Ma, J. Bioinspired integrated multidimensional sensor for adaptive grasping by robotic hands and physical movement guidance. Adv. Funct. Mater. 2024, 34, 2313787. [Google Scholar] [CrossRef]
- Kang, B.; Zavanelli, N.; Sue, G.N.; Patel, D.K.; Oh, S.; Oh, S.; Vinciguerra, M.R.; Wieland, J.; Wang, W.D.; Majidi, C. A flexible skin-mounted haptic interface for multimodal cutaneous feedback. Nat. Electron. 2025, 8, 818–830. [Google Scholar] [CrossRef]
- Zeng, X.; Liu, Y.; Liu, F.; Wang, W.; Liu, X.; Wei, X.; Hu, Y. A bioinspired three-dimensional integrated e-skin for multiple mechanical stimuli recognition. Nano Energy 2022, 92, 106777. [Google Scholar] [CrossRef]
- Kim, G.; Hwang, D. BaroTac: Barometric three-axis tactile sensor with slip detection capability. Sensors 2022, 23, 428. [Google Scholar] [CrossRef] [PubMed]
- Ham, J.; Huh, T.M.; Kim, J.; Kim, J.O.; Park, S.; Cutkosky, M.R.; Bao, Z. Porous dielectric elastomer based flexible multiaxial tactile sensor for dexterous robotic or prosthetic hands. Adv. Mater. Technol. 2023, 8, 2200903. [Google Scholar] [CrossRef]
- Wang, Y.; Duan, S.; Liu, J.; Zhao, F.; Chen, P.; Shi, Q.; Wu, J. Highly-sensitive expandable microsphere-based flexible pressure sensor for human–machine interaction. J. Micromech. Microeng. 2023, 33, 115009. [Google Scholar] [CrossRef]
- Rehan, M.; Saleem, M.M.; Tiwana, M.I.; Shakoor, R.I.; Cheung, R. A soft multi-axis high force range magnetic tactile sensor for force feedback in robotic surgical systems. Sensors 2022, 22, 3500. [Google Scholar] [CrossRef]
- Kebede, G.A.; Ahmad, A.R.; Lee, S.C.; Lin, C.Y. Decoupled six-axis force–moment sensor with a novel strain gauge arrangement and error reduction techniques. Sensors 2019, 19, 3012. [Google Scholar] [CrossRef]
- Liu, G.; Yu, P.; Tao, Y.; Liu, T.; Liu, H.; Zhao, J. Hybrid 3D printed three-axis force sensor aided by machine learning decoupling. Int. J. Smart Nano Mater. 2024, 15, 261–278. [Google Scholar] [CrossRef]
- Wang, S.; Liu, H. Research on decoupling model of six-component force sensor based on artificial neural network and polynomial regression. Sensors 2024, 24, 2698. [Google Scholar] [CrossRef]
- Dai, H.; Wu, Z.; Meng, C.; Zhang, C.; Zhao, P. A magnet splicing method for constructing a three-dimensional self-decoupled magnetic tactile sensor. Magnetochemistry 2024, 10, 6. [Google Scholar] [CrossRef]
- Chun, S.; Kim, J.-S.; Yoo, Y.; Choi, Y.; Jung, S.J.; Jang, D.; Lee, G.; Song, K.-I.; Nam, K.S.; Youn, I.; et al. An artificial neural tactile sensing system. Nat. Electron. 2021, 4, 429–438. [Google Scholar] [CrossRef]
- Bok, B.G.; Jang, J.S.; Kim, M.S. A highly sensitive multimodal tactile sensing module with planar structure for dexterous manipulation of robots. Adv. Intell. Syst. 2023, 5, 2200381. [Google Scholar] [CrossRef]
- Qu, X.; Liu, Z.; Tan, P.; Wang, C.; Liu, Y. Artificial tactile perception smart finger for material identification based on triboelectric sensing. Sci. Adv. 2022, 8, 11. [Google Scholar] [CrossRef]
- Peng, S.; Wu, S.; Yu, Y.; Xia, B.; Lovell, N.H.; Wang, C.H. Multimodal capacitive and piezoresistive sensor for simultaneous measurement of multiple forces. ACS Appl. Mater. Interfaces 2020, 12, 22179–22190. [Google Scholar] [CrossRef]
- Kong, H.; Li, W.; Song, Z.; Niu, L. Recent advances in multimodal sensing integration and decoupling strategies for tactile perception. Mater. Futures 2024, 3, 022501. [Google Scholar] [CrossRef]
- Zhang, J.; Yao, H.; Mo, J.; Chen, S.; Xie, Y.; Ma, S.; Chen, R.; Luo, T.; Ling, W.; Qin, L.; et al. Finger-inspired rigid-soft hybrid tactile sensor with superior sensitivity at high frequency. Nat. Commun. 2022, 13, 5076. [Google Scholar] [CrossRef] [PubMed]
- Oh, H.; Yi, G.C.; Yip, M.; Dayeh, S.A. Scalable tactile sensor arrays on flexible substrates with high spatiotemporal resolution enabling slip and grip for closed-loop robotics. Sci. Adv. 2020, 6, 14. [Google Scholar] [CrossRef] [PubMed]
- Qiu, Y.; Wang, F.; Zhang, Z.; Shi, K.; Song, Y.; Lu, J.; Xu, M.; Qian, M.; Zhang, W.; Wu, J.; et al. Quantitative softness and texture bimodal haptic sensors for robotic clinical feature identification and intelligent picking. Sci. Adv. 2024, 10, 14. [Google Scholar] [CrossRef]
- Gu, Y.; Zhang, T.; Li, J.; Zheng, C.; Yang, M.; Li, S. A new force-decoupling triaxial tactile sensor based on elastic microcones for accurately grasping feedback. Adv. Intell. Syst. 2023, 5, 2200321. [Google Scholar] [CrossRef]
- Boutry, C.M.; Negre, M.; Jorda, M.; Vardoulis, O.; Chortos, A.; Khatib, O.; Bao, Z. A hierarchically patterned, bioinspired e-skin able to detect the direction of applied pressure for robotics. Sci. Robot. 2018, 3, 9. [Google Scholar] [CrossRef]
- Gu, H.; Lu, B.; Gao, Z.; Wu, S.; Zhang, L.; Xie, L.; Yi, J.; Liu, Y.; Nie, B.; Wen, Z.; et al. A battery-free wireless tactile sensor for multimodal force perception. Adv. Funct. Mater. 2024, 34, 2410661. [Google Scholar] [CrossRef]
- Kim, T.; Park, Y.L. A soft three-axis load cell using liquid-filled three-dimensional microchannels in a highly deformable elastomer. IEEE Robot. Autom. Lett. 2018, 3, 881–887. [Google Scholar] [CrossRef]
- Guo, H.; Tan, Y.J.; Chen, G.; Wang, Z.; Susanto, G.J.; See, H.H.; Yang, Z.; Lim, Z.W.; Yang, L.; Tee, B.C. Artificially innervated self-healing foams as synthetic piezo-impedance sensor skins. Nat. Commun. 2020, 11, 5747. [Google Scholar] [CrossRef]
- Cao, Y.; Li, J.; Dong, Z.; Sheng, T.; Zhang, D.; Cai, J.; Jiang, Y. Flexible tactile sensor with an embedded-hair-in-elastomer structure for normal and shear stress sensing. Soft Sci. 2023, 3, 32. [Google Scholar] [CrossRef]
- Xu, C.; Wang, Y.; Zhang, J.; Wan, J.; Xiang, Z.; Nie, Z. Three-dimensional micro strain gauges as flexible, modular tactile sensors for versatile integration with micro- and macroelectronics. Sci. Adv. 2024, 10, 13. [Google Scholar] [CrossRef]
- Won, S.M.; Wang, H.; Kim, B.H.; Lee, K.; Jang, H.; Kwon, K.; Han, M.; Crawford, K.E.; Li, H.; Lee, Y.; et al. Multimodal sensing with a three-dimensional piezoresistive structure. ACS Nano 2019, 13, 10972–10979. [Google Scholar] [CrossRef]
- Wang, X.; Tan, B.; Long, H.; Huang, J.; Li, E.; Qin, Y. Material and structural innovations for high-performance flexible triaxial force sensors: A Review. IEEE Sens. J. 2025, 25, 30291–30312. [Google Scholar] [CrossRef]
- Xu, Y.; Zhang, S.; Li, S.; Wu, Z.; Li, Y.; Li, Z.; Chen, X.; Shi, C.; Chen, P.; Zhang, P.; et al. A soft magnetoelectric finger for robots’ multidirectional tactile perception in non-visual recognition environments. npj Flex. Electron. 2024, 8, 2. [Google Scholar] [CrossRef]
- Lin, C.; Zhang, H.; Xu, J.; Wu, L.; Xu, H. A compact vision-based tactile sensor for accurate 3d shape reconstruction and generalizable 6d force estimation. IEEE Robot. Autom. Lett. 2023, 9, 923–930. [Google Scholar] [CrossRef]
- Heo, M.; Kang, S.R.; Yu, M.; Kwon, T.K. The development of split-treadmill with a fall prevention training function. Technol. Health Care 2023, 31, 1189–1201. [Google Scholar] [CrossRef]
- Mittendorfer, P.; Cheng, G. Humanoid multimodal tactile-sensing modules. IEEE Trans. Robot. 2011, 27, 401–410. [Google Scholar] [CrossRef]
- De Oliveira, T.E.A.; Cretu, A.M.; Petriu, E.M. Multimodal bio-inspired tactile sensing module. IEEE Sens. J. 2017, 17, 3231–3243. [Google Scholar] [CrossRef]
- Pu, M.; Zhao, T.; Zhang, L.; Han, C.; Chai, Z.; Zhou, Y.; Ding, H.; Wu, Z. An AI-Enabled All-In-One Visual, Proximity, and Tactile Perception Multimodal Sensor. Adv. Robot. Res. 2025, 12, 202500062. [Google Scholar] [CrossRef]
- Khamis, H.; Xia, B.; Redmond, S.J. A novel optical 3D force and displacement sensor—Towards instrumenting the papillarray tactile sensor. Sens. Actuators A Phys. 2019, 291, 174–187. [Google Scholar] [CrossRef]
- Ma, X.; Wang, C.; Wei, R.; He, J.; Li, J.; Liu, X.; Huang, F.; Ge, S.; Tao, J.; Yuan, Z.; et al. Bimodal tactile sensor without signal fusion for user-interactive applications. ACS Nano 2022, 16, 2789–2797. [Google Scholar] [CrossRef] [PubMed]
- Liu, H.; Yu, Y.; Sun, F.; Gu, J. Visual–tactile fusion for object recognition. IEEE Trans. Autom. Sci. Eng. 2016, 14, 96–108. [Google Scholar] [CrossRef]
- Zhang, Y.; Liu, Q.; Ren, W.; Song, Y.; Luo, H.; Han, Y.; He, L.; Wu, X.; Wang, Z. Bioinspired tactile sensation based on synergistic microcrack-bristle structure design towards high mechanical sensitivity and direction-resolving capability. Research 2023, 6, 0172. [Google Scholar] [CrossRef]
- Bo, R.; Xu, S.; Yang, Y.; Zhang, Y. Mechanically-guided 3D assembly for architected flexible electronics. Chem. Rev. 2023, 123, 11137–11189. [Google Scholar] [CrossRef]
- Cheng, X.; Fan, Z.; Yao, S.; Jin, T.; Lv, Z.; Lan, Y.; Bo, R.; Chen, Y.; Zhang, F.; Shen, Z.; et al. Programming 3D curved mesosurfaces using microlattice designs. Science 2023, 379, 1225–1232. [Google Scholar] [CrossRef]
- Shuai, Y.; Zhao, J.; Bo, R.; Lan, Y.; Lv, Z.; Zhang, Y. A wrinkling-assisted strategy for controlled interface delamination in mechanically-guided 3D assembly. J. Mech. Phys. Solids 2023, 173, 105203. [Google Scholar] [CrossRef]
- Xu, S.; Yan, Z.; Jang, K.I.; Huang, W.; Fu, H.; Kim, J.; Wei, Z.; Flavin, M.; McCracken, J.; Wang, R.; et al. Assembly of micro/nanomaterials into complex, three-dimensional architectures by compressive buckling. Science 2015, 347, 154–159. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Y.; Zhang, F.; Yan, Z.; Ma, Q.; Li, X.; Huang, Y.; Rogers, J.A. Printing, folding and assembly methods for forming 3D mesostructures in advanced materials. Nat. Rev. Mater. 2017, 2, 17019. [Google Scholar] [CrossRef]
- Yan, Y.; Hu, Z.; Yang, Z.; Yuan, W.; Song, C.; Pan, J.; Shen, Y. Soft magnetic skin for super-resolution tactile sensing with force self-decoupling. Sci. Robot. 2021, 6, 8801. [Google Scholar] [CrossRef]
- Dai, H.; Zhang, C.; Pan, C.; Hu, H.; Ji, K.; Sun, H.; Lyu, C.; Tang, D.; Li, T.; Fu, J.; et al. Split-type magnetic soft tactile sensor with 3D force decoupling. Adv. Mater. 2024, 36, 2310145. [Google Scholar] [CrossRef] [PubMed]
- Dai, H.; Zhang, C.; Hu, H.; Hu, Z.; Sun, H.; Liu, K.; Li, T.; Fu, J.; Zhao, P.; Yang, H. Biomimetic hydrodynamic sensor with whisker array architecture and multidirectional perception ability. Adv. Sci. 2024, 11, 2405276. [Google Scholar] [CrossRef]
- Hu, H.; Zhang, C.; Pan, C.; Dai, H.; Sun, H.; Pan, Y.; Lai, X.; Lyu, C.; Tang, D.; Fu, J.; et al. Wireless flexible magnetic tactile sensor with super-resolution in large-areas. ACS Nano 2022, 16, 19271–19280. [Google Scholar] [CrossRef]
- Zhou, Y.; Zhao, X.; Xu, J.; Chen, G.; Tat, T.; Li, J.; Chen, J. A multimodal magnetoelastic artificial skin for underwater haptic sensing. Sci. Adv. 2024, 10, 8567. [Google Scholar] [CrossRef] [PubMed]
- Jones, D.; Wang, L.; Ghanbari, A.; Vardakastani, V.; Kedgley, A.E.; Gardiner, M.D.; Vincent, T.L.; Culmer, P.R.; Alazmani, A. Design and evaluation of magnetic hall effect tactile sensors for use in sensorized splints. Sensors 2020, 20, 1123. [Google Scholar] [CrossRef]
- Li, G.; Zhang, T.; Tang, J. Decoding chemo-mechanical failure mechanisms of solid-state lithium metal battery under low stack pressure via optical fiber sensors. Adv. Mater. 2025, 37, 12. [Google Scholar] [CrossRef]
- Takeshita, T.; Harisaki, K.; Ando, H.; Higurashi, E.; Nogami, H.; Sawada, R. Development and evaluation of a two-axial shearing force sensor consisting of an optical sensor chip and elastic gum frame. Precis. Eng. 2016, 45, 136–142. [Google Scholar] [CrossRef]
- Wang, W.; De Souza, M.M.; Ghannam, R.; Li, W.J.; Roy, V.A. A novel micro-scaled multi-layered optical stress sensor for force sensing. J. Comput. Electron. 2023, 22, 768–782. [Google Scholar] [CrossRef]
- Wang, W.; Yiu, H.H.; Li, W.; Roy, V.A. The principle and architectures of optical stress sensors and the progress on the development of microbend optical sensors. Adv. Opt. Mater. 2021, 9, 2001693. [Google Scholar] [CrossRef]
- Wang, H.; Wang, W.; Kim, J.J.; Wang, C.; Wang, Y.; Wang, B.; Lee, S.; Yokota, T.; Someya, T. An optical-based multipoint 3-axis pressure sensor with a flexible thin-film form. Sci. Adv. 2023, 9, 2445. [Google Scholar] [CrossRef] [PubMed]
- Yang, H.; Fu, J.; Cao, R.; Liu, J.; Wang, L. A liquid lens-based optical sensor for tactile sensing. Smart Mater. Struct. 2022, 31, 035011. [Google Scholar] [CrossRef]
- Guo, J.; Shang, C.; Gao, S.; Zhang, Y.; Fu, B.; Xu, L. Flexible plasmonic optical tactile sensor for health monitoring and artificial haptic perception. Adv. Mater. Technol. 2023, 8, 2201506. [Google Scholar] [CrossRef]
- Bai, H.; Li, S.; Barreiros, J.; Tu, Y.; Pollock, C.R.; Shepherd, R.F. Stretchable distributed fiber-optic sensors. Science 2020, 370, 848–852. [Google Scholar] [CrossRef]
- Xiong, P.; Huang, Y.; Yin, Y.; Zhang, Y.; Song, A. A novel tactile sensor with multimodal vision and tactile units for multifunctional robot interaction. Robotica 2024, 42, 1420–1435. [Google Scholar] [CrossRef]
- Zhang, Y.; Chen, X.; Wang, M.; Yu, H. Multidimensional tactile sensor with a thin compound eye-inspired imaging system. Soft Robot. 2022, 9, 861–870. [Google Scholar] [CrossRef]
- Li, S.; Yu, H.; Pan, G.; Tang, H.; Zhang, J.; Ye, L.; Zhang, X.P.; Ding, W. M3Tac: A multispectral multimodal visuotactile sensor with beyond-human sensory capabilities. IEEE Trans. Robot. 2024, 40, 4484–4503. [Google Scholar] [CrossRef]
- Leslie, O.; Bulens, D.C.; Ulloa, P.M.; Redmond, S.J. A tactile sensing concept for 3D displacement and 3D force measurement using light angle and intensity sensing. IEEE Sens. J. 2023, 23, 21172–21188. [Google Scholar] [CrossRef]
- Li, H.; Nam, S.; Lu, Z.; Yang, C.; Psomopoulou, E.; Lepora, N.F. BioTacTip: A soft biomimetic optical tactile sensor for efficient 3D contact localization and 3D force estimation. IEEE Robot. Autom. Lett. 2024, 9, 5314–5321. [Google Scholar] [CrossRef]
- Li, Z.; Cheng, L.; Liu, Z.; Wei, J.; Wang, Y. An Ultrasensitive and Robust Soft Optical 3D Tactile Sensor. Soft Robot. 2025, 12, 445–454. [Google Scholar] [CrossRef]
- Chen, Y.; Hong, J.; Xiao, Y.; Zhang, H.; Wu, J.; Shi, Q. Multimodal intelligent flooring system for advanced smart-building monitoring and interactions. Adv. Sci. 2024, 11, 2406190. [Google Scholar] [CrossRef]
- Duan, S.; Chen, P.; Xiong, Y.A.; Zhao, F.; Jing, Z.; Du, G.; Wei, X.; Xiang, S.; Hong, J.; Shi, Q.; et al. Flexible mechano-optical dual-responsive perovskite molecular ferroelectric composites for advanced anticounterfeiting and encryption. Sci. Adv. 2024, 10, 11. [Google Scholar] [CrossRef]
- Yu, P.; Chen, F.; Long, J. A three-dimensional force/temperature composite flexible sensor. Sens. Actuators A Phys. 2024, 365, 114891. [Google Scholar] [CrossRef]
- Jin, K.; Li, Z.; Nan, P.; Xin, G.; Lim, K.S.; Ahmad, H.; Yang, H. Fiber Bragg grating-based fingertip tactile sensors for normal/shear forces and temperature detection. Sens. Actuators A Phys. 2023, 357, 114368. [Google Scholar] [CrossRef]
- Chen, C.; Wang, P.; Hong, W.; Zhu, X.; Hao, J.; He, C.; Hou, H.; Kong, D.; Liu, T.; Zhao, Y.; et al. Arch-inspired flexible dual-mode sensor with ultra-high linearity based on carbon nanomaterials/conducting polymer composites for bioelectronic monitoring and thermal perception. Compos. Sci. Technol. 2025, 267, 111182. [Google Scholar] [CrossRef]
- Ikejima, T.; Mizukoshi, K.; Nonomura, Y. Predicting sensory and affective tactile perception from physical parameters obtained by using a biomimetic multimodal tactile sensor. Sensors 2025, 25, 147. [Google Scholar] [CrossRef] [PubMed]
- Duan, S.; Wei, X.; Zhao, F.; Yang, H.; Wang, Y.; Chen, P.; Hong, J.; Xiang, S.; Luo, M.; Shi, Q.; et al. Bioinspired young’s modulus-hierarchical e-skin with decoupling multimodality and neuromorphic encoding outputs to biosystems. Adv. Sci. 2023, 10, 2304121. [Google Scholar] [CrossRef]
- Jiang, Y.; Fan, L.; Sun, X.; Luo, Z.; Wang, H.; Lai, R.; Wang, J.; Gan, Q.; Li, N.; Tian, J. A multifunctional tactile sensory system for robotic intelligent identification and manipulation perception. Adv. Sci. 2024, 11, 2402705. [Google Scholar] [CrossRef]
- Han, C.; Cao, Z.; An, Z.; Zhang, Z.; Wang, Z.L.; Wu, Z. Multimodal finger-shaped tactile sensor for multi-directional force and material identification. Adv. Mater. 2025, 37, 2414096. [Google Scholar] [CrossRef]
- Barreiros, J.A.; Xu, A.; Pugach, S.; Iyengar, N.; Troxell, G.; Cornwell, A.; Hong, S.; Selman, B.; Shepherd, R.F. Haptic perception using optoelectronic robotic flesh for embodied artificially intelligent agents. Sci. Robot. 2022, 7, 6745. [Google Scholar] [CrossRef]
- Huang, Y.; Zhou, J.; Ke, P.; Guo, X.; Yiu, C.K.; Yao, K.; Cai, S.; Li, D.; Zhou, Y.; Li, J.; et al. A skin-integrated multimodal haptic interface for immersive tactile feedback. Nat. Electron. 2023, 6, 1020–1031. [Google Scholar] [CrossRef]
- You, I.; Mackanic, D.G.; Matsuhisa, N.; Kang, J.; Kwon, J.; Beker, L.; Mun, J.; Suh, W.; Kim, T.Y.; Tok, J.B.H.; et al. Artificial multimodal receptors based on ion relaxation dynamics. Science 2020, 370, 961. [Google Scholar] [CrossRef] [PubMed]
- Liu, X.; Li, Y.; Li, Y.A.; Zheng, X.; Guo, J.; Sun, T.; Sung, H.K.; Chernogor, L.; Cao, M.; Xu, T.; et al. Bionic fingerprint tactile sensor with deep learning-decoupled multimodal perception for simultaneous pressure-friction mapping. Adv. Funct. Mater. 2025, e06158. [Google Scholar] [CrossRef]
- Xu, W.; Zhou, G.; Zhou, Y.; Zou, Z.; Wang, J.; Wu, W.; Li, X. A vision-based tactile sensing system for multimodal contact information perception via neural network. IEEE Trans. Instrum. Meas. 2024, 73, 5026411. [Google Scholar] [CrossRef]
- Li, F.; Dai, Z.; Jiang, L.; Song, C.; Zhong, C.; Chen, Y. Prediction of the remaining useful life of bearings through CNN-Bi-LSTM-based domain adaptation model. Sensors 2024, 24, 6906. [Google Scholar] [CrossRef] [PubMed]
- Piramoon, S.; Ayoubi, M. Neural-network-based active vibration control of rotary machines. IEEE Access 2024, 12, 107552–107569. [Google Scholar] [CrossRef]
- Kou, R.; Wang, C.; Liu, J.; Wan, R.; Jin, Z.; Zhao, L.; Liu, Y.; Guo, J.; Li, F.; Wang, H.; et al. Construction and interpretation of tobacco leaf position discrimination model based on interpretable machine learning. Front. Plant Sci. 2025, 16, 1619380. [Google Scholar] [CrossRef]
- Yu, L.; Xiao, W.; Wang, Q.; Liu, D. Soft microtubular sensors as artificial fingerprints for incipient slip detection. Measurement 2025, 253, 117729. [Google Scholar] [CrossRef]
- Zhang, H.; Liu, Z.; Wu, J.; Shi, Q. Self-Powered Sensing and Wireless Communication Synergic Systems Enabled by Triboelectric Nanogenerators. Nanoenergy Adv. 2024, 4, 367–398. [Google Scholar] [CrossRef]
- Huang, F.; Sun, X.; Shi, Y.; Pan, L. Flexible ionic-gel synapse devices and their applications in neuromorphic system. FlexMat 2025, 2, 30–54. [Google Scholar] [CrossRef]
- Zhang, C.; Zhang, R.; Ji, C.; Pei, Z.; Fu, Z.; Liu, Y.; Sang, S.; Hao, R.; Zhang, Q. Bioinspired crocodile skin-based flexible piezoelectric sensor for three-dimensional force detection. IEEE Sens. J. 2023, 23, 21050–21060. [Google Scholar] [CrossRef]
- Liu, J.; Zhao, W.; Ma, Z.; Zhao, H.; Ren, L. Self-powered flexible electronic skin tactile sensor with 3D force detection. Mater. Today 2024, 81, 84–94. [Google Scholar] [CrossRef]
- Sun, H.; Kuchenbecker, K.J.; Martius, G. A soft thumb-sized vision-based sensor with accurate all-round force perception. Nat. Mach. Intell. 2022, 4, 135–145. [Google Scholar] [CrossRef]
- Liu, F.; Deswal, S.; Christou, A.; Sandamirskaya, Y.; Kaboli, M.; Dahiya, R. Neuro-inspired electronic skin for robots. Sci. Robot. 2022, 7, l7344. [Google Scholar] [CrossRef]
- Zhang, W.; Xi, Y.; Wang, E.; Qu, X.; Yang, Y.; Fan, Y.; Shi, B.; Li, Z. Self-powered force sensors for multidimensional tactile sensing. ACS Appl. Mater. Interfaces 2022, 14, 20122–20131. [Google Scholar] [CrossRef]
- Yao, K.; Zhuang, Q. Self-decoupling three-axis forces in a simple sensor. Nat. Mach. Intell. 2024, 6, 1431–1432. [Google Scholar] [CrossRef]
- Nie, B.; Geng, J.; Yao, T.; Miao, Y.; Zhang, Y.; Chen, X.; Liu, J. Sensing arbitrary contact forces with a flexible porous dielectric elastomer. Mater. Horiz. 2021, 8, 962–971. [Google Scholar] [CrossRef] [PubMed]
- Jiang, C.; Li, Y.; Yin, H.; Li, Y.; Bao, Y.; Li, Q.; Guo, Y. Multiscale interconnected and anisotropic morphology genetic piezoceramic skeleton based flexible self-powered 3D force sensor. Adv. Funct. Mater. 2025, 15, 2503120. [Google Scholar] [CrossRef]
- Liu, X.; Li, K.; Qian, S.; Niu, L.; Chen, W.; Wu, H.; Song, X.; Zhang, J.; Bi, X.; Yu, J.; et al. A high-sensitivity flexible bionic tentacle sensor for multidimensional force sensing and autonomous obstacle avoidance applications. Microsyst. Nanoeng. 2024, 10, 149. [Google Scholar] [CrossRef] [PubMed]
- Arshad, A.; Saleem, M.M.; Tiwana, M.I.; ur Rahman, H.; Iqbal, S.; Cheung, R. A high sensitivity and multi-axis fringing electric field based capacitive tactile force sensor for robot assisted surgery. Sens. Actuators A Phys. 2023, 354, 114272. [Google Scholar] [CrossRef]
- Zhu, Y.; Li, Y.; Xie, D.; Yan, B.; Wu, Y.; Zhang, Y.; Wang, G.; Lai, L.; Sun, Y.; Yang, Z.; et al. High-performance flexible tactile sensor enabled by multi-contact mechanism for normal and shear force measurement. Nano Energy 2023, 117, 108862. [Google Scholar] [CrossRef]
- Lv, Z.; Song, Z.; Ruan, D.; Wu, H.; Liu, A. Flexible capacitive three-dimensional force sensor for hand motion capture and handwriting recognition. Funct. Mater. Lett. 2022, 15, 2250026. [Google Scholar] [CrossRef]
- Zhang, J.; Hou, X.; Qian, S.; Huo, J.; Yuan, M.; Duan, Z.; Song, X.; Wu, H.; Shi, S.; Geng, W.; et al. Flexible wide-range multidimensional force sensors inspired by bones embedded in muscle. Microsyst. Nanoeng. 2024, 10, 64. [Google Scholar] [CrossRef]
- Wang, Y.; Ruan, X.; Xing, C.; Zhao, H.; Luo, M.; Chen, Y. Highly sensitive and flexible three-dimensional force tactile sensor based on inverted pyramidal structure. Smart Mater. Struct. 2022, 31, 095013. [Google Scholar] [CrossRef]
- Ruan, D.; Chen, G.; Luo, X.; Cheng, L.; Wu, H.; Liu, A. Bionic octopus-like flexible three-dimensional force sensor for meticulous handwriting recognition in human-computer interactions. Nano Energy 2024, 123, 109357. [Google Scholar] [CrossRef]
- Xie, Y.; Pan, J.; Yu, L.; Fang, H.; Yu, S.; Zhou, N.; Tong, L.; Zhang, L. Optical micro/nanofiber enabled multiaxial force sensor for tactile visualization and human–machine interface. Adv. Sci. 2024, 11, 2404343. [Google Scholar] [CrossRef] [PubMed]
- Fei, Z.; Ryeznik, Y.; Sverdlov, O.; Tan, C.W.; Wong, W.K. An overview of healthcare data analytics with applications to the COVID-19 pandemic. IEEE Trans. Big Data 2021, 8, 1463–1480. [Google Scholar] [CrossRef]
- Tan, C.W.; Yu, P.D.; Chen, S.; Poor, H.V. DeepTrace: Learning to optimize contact tracing in epidemic networks with graph neural networks. IEEE Trans. Signal Inf. Process. Over Netw. 2025, 11, 97–113. [Google Scholar] [CrossRef]
- Alotaibi, A. Flexible 3D force sensor based on polymer nanocomposite for soft robotics and medical applications. Sensors 2024, 24, 1859. [Google Scholar] [CrossRef] [PubMed]
- Dong, T.; Wang, J.; Chen, Y.; Liu, L.; You, H.; Li, T. Research progress on flexible 3-D force sensors: A review. IEEE Sens. J. 2024, 24, 15706–15726. [Google Scholar] [CrossRef]
- Zhang, Q.; Yang, R.; Duan, Q.; Zhao, Y.; Qian, Z.; Luo, D.; Liu, Z.; Wang, R. A wearable three-axis force sensor based on deep learning technology for plantar measurement. Chem. Eng. J. 2024, 482, 148491. [Google Scholar] [CrossRef]
- Hu, J.; Qiu, Y.; Wang, X.; Jiang, L.; Lu, X.; Li, M.; Wang, Z.; Pang, K.; Tian, Y.; Zhang, W.; et al. Flexible six-dimensional force sensor inspired by the tenon-and-mortise structure of ancient Chinese architecture for orthodontics. Nano Energy 2022, 96, 107073. [Google Scholar] [CrossRef]
- Zhang, D.; Zhang, W.; Yang, H.; Yang, H. Application of soft grippers in the field of agricultural harvesting: A review. Machines 2025, 13, 55. [Google Scholar] [CrossRef]
- Visentin, F.; Castellini, F.; Muradore, R. A soft, sensorized gripper for delicate harvesting of small fruits. Comput. Electron. Agric. 2023, 213, 108202. [Google Scholar] [CrossRef]
- Navas, E.; Shamshiri, R.R.; Dworak, V.; Weltzien, C.; Fernández, R. Soft gripper for small fruits harvesting and pick and place operations. Front. Robot. AI 2024, 10, 1330496. [Google Scholar] [CrossRef]
- Javed, Y.; Mansoor, M.; Shah, I.A. A review of principles of MEMS pressure sensing with its aerospace applications. Sens. Rev. 2019, 39, 652–664. [Google Scholar] [CrossRef]
- Zhu, L.; Wang, Y.; Mei, D.; Jiang, C. Development of fully flexible tactile pressure sensor with bilayer interlaced bumps for robotic grasping applications. Micromachines 2020, 11, 770. [Google Scholar] [CrossRef]
- Takeda, Y.; Wang, Y.F.; Yoshida, A.; Sekine, T.; Kumaki, D.; Tokito, S. Advancing robotic gripper control with the integration of flexible printed pressure sensors. Adv. Eng. Mater. 2024, 26, 2302031. [Google Scholar] [CrossRef]
- Watanabe, Y.; Sekine, T.; Miura, R.; Abe, M.; Shouji, Y.; Ito, K.; Wang, Y.F.; Hong, J.; Takeda, Y.; Kumaki, D.; et al. Optimization of a soft pressure sensor in terms of the molecular weight of the ferroelectric-polymer sensing layer. Adv. Funct. Mater. 2022, 32, 2107434. [Google Scholar] [CrossRef]
- Hong, W.; Guo, X.; Zhang, T.; Liu, Y.; Yan, Z.; Zhang, A.; Qian, Z.; Wang, J.; Zhang, X.; Jin, C.; et al. Bioinspired engineering of fillable gradient structure into flexible capacitive pressure sensor toward ultra-high sensitivity and wide working range. Macromol. Rapid Commun. 2023, 44, 2300420. [Google Scholar] [CrossRef] [PubMed]
- Li, G.; Liu, S.; Wang, L.; Zhu, R. Skin-inspired quadruple tactile sensors integrated on a robot hand enable object recognition. Sci. Robot. 2020, 5, 8134. [Google Scholar] [CrossRef]
- Guo, D.; Liu, H.; Fang, B.; Sun, F.; Yang, W. Visual affordance guided tactile material recognition for waste recycling. IEEE Trans. Autom Sci. Eng. 2022, 19, 2656–2664. [Google Scholar] [CrossRef]
- Jin, J.; Wang, S.; Zhang, Z.; Mei, D.; Wang, Y. Progress on flexible tactile sensors in robotic applications on objects properties recognition, manipulation and human-machine interactions. Soft Sci. 2023, 3, 8. [Google Scholar] [CrossRef]
- Zhang, Z.; Wang, Y.; Zhang, C.; Zhan, W.; Zhang, Q.; Xue, L.; Xu, Z.; Peng, N.; Jiang, Z.; Ye, Z.; et al. Cilia-inspired magnetic flexible shear force sensors for tactile and fluid monitoring. ACS Appl. Mater. Interfaces 2024, 16, 50524–50533. [Google Scholar] [CrossRef]
- Yun, G.; Hu, Z. Triaxial tactile sensing for next-gen robotics and wearable devices. Smart Mater. Devices 2025, 1, 202518. [Google Scholar] [CrossRef]
- Wang, D.; Zhao, N.; Yang, Z.; Yuan, Y.; Xu, H.; Wu, G.; Zheng, W.; Ji, X.; Bai, N.; Wang, W.; et al. Iontronic capacitance-enhanced flexible three-dimensional force sensor with ultrahigh sensitivity for machine-sensing interface. IEEE Electron Device Lett. 2023, 44, 2023. [Google Scholar] [CrossRef]
- Yuan, X.; Zhou, J.; Huang, B.; Wang, Y.; Yang, C.; Gui, W. Hierarchical quality-relevant feature representation for soft sensor modeling: A novel deep learning strategy. IEEE Trans. Ind. Inform. 2019, 16, 3721–3730. [Google Scholar] [CrossRef]
- Hu, X.; Liu, Z.; Zhang, Y. Three-dimensionally architected tactile electronic skins. ACS Nano 2025, 19, 14523–14539. [Google Scholar] [CrossRef] [PubMed]
- Wang, S.; Wang, C.; Lin, Q.; Zhang, Y.; Zhang, Y.; Liu, Z.; Luo, Y.; Xu, X.; Han, F.; Jiang, Z. Flexible three-dimensional force sensor of high sensing stability with bonding and supporting composite structure for smart devices. Smart Mater. Struct. 2021, 30, 105004. [Google Scholar] [CrossRef]
- Pang, Y.; Xu, X.; Chen, S.; Fang, Y.; Shi, X.; Deng, Y.; Wang, Z.L.; Cao, C. Skin-inspired textile-based tactile sensors enable multifunctional sensing of wearables and soft robots. Nano Energy 2022, 96, 107137. [Google Scholar] [CrossRef]
- Gao, S.; Dai, Y.; Nathan, A. Tactile and vision perception for intelligent humanoids. Adv. Intell. Syst. 2022, 4, 2100074. [Google Scholar] [CrossRef]
- Patel, S.; Rao, Z.; Yang, M.; Yu, C. Wearable haptic feedback interfaces for augmenting human touch. Adv. Funct. Mater. 2025, 23, 2417906. [Google Scholar] [CrossRef]
- Feng, K.; Lei, M.; Wang, X.; Zhou, B.; Xu, Q. A flexible bidirectional interface with integrated multimodal sensing and haptic feedback for closed-loop human–machine interaction. Adv. Intell. Syst. 2023, 5, 2300291. [Google Scholar] [CrossRef]
- Ge, R.; Yu, Q.; Zhou, F.; Liu, S.; Qin, Y. Dual-modal piezotronic transistor for highly sensitive vertical force sensing and lateral strain sensing. Nat. Commun. 2023, 14, 6315. [Google Scholar] [CrossRef]
- Alsadik, B.; Spreeuwers, L.; Dadrass Javan, F.; Manterola, N. Mathematical camera array optimization for face 3D modeling application. Sensors 2023, 23, 9776. [Google Scholar] [CrossRef]
- Jiang, Y.; Ji, S.; Sun, J.; Huang, J.; Li, Y.; Zou, G.; Salim, T.; Wang, C.; Li, W.; Jin, H.; et al. A universal interface for plug-and-play assembly of stretchable devices. Nature 2023, 614, 456–462. [Google Scholar] [CrossRef]
- Baruah, R.K.; Yoo, H.; Lee, E.K. Interconnection technologies for flexible electronics: Materials, fabrications, and applications. Micromachines 2023, 14, 1131. [Google Scholar] [CrossRef]
- Zhang, X.; Ericksen, O.; Lee, S.; Akl, M.; Song, M.K.; Lan, H.; Pal, P.; Suh, J.M.; Lindemann, S.; Ryu, J.E.; et al. Atomic lift-off of epitaxial membranes for cooling-free infrared detection. Nature 2025, 641, 98–105. [Google Scholar] [CrossRef]
- Zhang, L.; Mo, Y.; Ma, W.; Wang, R.; Wan, Y.; Bao, R.; Pan, C. High-resolution spatial mapping of pressure distribution by a flexible and piezotronics transistor array. ACS Appl. Electron. Mater. 2023, 5, 5823–5830. [Google Scholar] [CrossRef]
- Sun, Q.J.; Lai, Q.T.; Tang, Z.; Tang, X.G.; Zhao, X.H.; Roy, V.A. Advanced Functional Composite Materials toward E-Skin for Health Monitoring and Artificial Intelligence. Adv. Mater. Technol. 2022, 8, 2201088. [Google Scholar] [CrossRef]
- Xiao, Y.; Liu, Y.; Zhang, B.; Chen, P.; Zhu, H.; He, E.; Zhao, J.; Huo, W.; Jin, X.; Zhang, X.; et al. Bio-plausible reconfigurable spiking neuron for neuromorphic computing. Sci. Adv. 2025, 11, 8. [Google Scholar] [CrossRef] [PubMed]
- Yue, W.; Wu, K.; Li, Z.; Zhou, J.; Wang, Z.; Zhang, T.; Yang, Y.; Ye, L.; Wu, Y.; Bu, W.; et al. Physical unclonable in-memory computing for simultaneous protecting private data and deep learning models. Nat. Commun. 2025, 16, 1031. [Google Scholar] [CrossRef]
- Wu, B.; Li, K.; Wang, L.; Yin, K.; Nie, M.; Sun, L. Revolutionizing sensing technologies: A comprehensive review of flexible acceleration sensors. FlexMat 2025, 2, 55–81. [Google Scholar] [CrossRef]
- Zhao, B.; Xin, Z.; Wang, Y.; Wu, C.; Wang, W.; Shi, R.; Peng, R.; Wu, Y.; Xu, L.; Pan, T.; et al. Bioinspired gas-receptor synergistic interaction for high-performance two-dimensional neuromorphic devices. Matter 2025, 8, 13. [Google Scholar] [CrossRef]
- Shao, L.; Zhang, J.; Chen, X.; Xu, D.; Gu, H.; Mu, Q.; Yu, F.; Liu, S.; Shi, X.; Sun, J.; et al. Artificial intelligence-driven distributed acoustic sensing technology and engineering application. PhotoniX 2025, 6, 4. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).