Search Results (481)

Search Parameters:
Keywords = tactile sensing

20 pages, 4688 KB  
Article
Neutral-Axis Ti3C2Tx/GO Sandwich Sensor with Bending Immunity and Deep Learning Tactile Recognition
by Jiahao Qi, Tianshun Gong and Debo Wang
Sensors 2026, 26(8), 2471; https://doi.org/10.3390/s26082471 - 17 Apr 2026
Abstract
Flexible piezoresistive sensors are often vulnerable to modal ambiguity and bending-induced drift, both of which can obscure true pressure and strain signals under practical operation. Here, we address these limitations by suppressing bending sensitivity at the device level and disambiguating tactile modes at the algorithmic level. We propose and fabricate a Ti3C2Tx/graphene oxide (GO) sandwich sensor in which the conductive network is positioned near the neutral axis, thereby ensuring that bending induces negligible axial strain in the active layer. In contrast, out-of-plane pressing enlarges microcontacts, while in-plane stretching disrupts percolation pathways. We develop a composite-beam model to quantify neutral-axis alignment and the resultant bending immunity, realize the device via a straightforward casting process, and systematically characterize its electromechanical response under bending, pressing, nail pressing, and stretching. To further reduce modal ambiguity and improve tactile recognition, a lightweight one-dimensional convolutional neural network (1D-CNN) was introduced to classify temporal resistance signals from the sensor. Experimental results showed that the 1D-CNN achieved a high classification accuracy of 98.52% under flat-state training and testing conditions, and maintained 96.67% accuracy when evaluated on bending-state samples, demonstrating strong robustness against bending-induced interference. Together, the neutral-axis device architecture and the learning-based inference pipeline deliver high sensitivity to pressing and stretching while markedly suppressing the response to bending, thereby enabling wrist-worn pulse monitoring, soft-robotic joint sensing, and plantar pressure insoles. Full article
(This article belongs to the Section Physical Sensors)
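The 1D-CNN tactile classifier described above reduces a temporal resistance signal to features via convolution, nonlinearity, and pooling. A minimal pure-Python sketch of that building block is shown below; the kernel and signals are illustrative placeholders, not the authors' trained parameters.

```python
# Sketch: one conv channel of a lightweight 1D-CNN over a resistance
# time series -- convolution, ReLU, and global max pooling.
# Kernel weights here are illustrative, not the paper's trained values.

def conv1d(signal, kernel):
    """Valid-mode 1D cross-correlation."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    return [max(0.0, x) for x in xs]

def extract_feature(signal, kernel):
    """One conv channel reduced to a single scalar via global max pooling."""
    return max(relu(conv1d(signal, kernel)))

# A rising-edge kernel responds strongly to a sharp pressing event and
# weakly to slow, bending-like drift -- the kind of modal separation the
# abstract describes.
press = [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]   # sharp pulse
drift = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]   # slow ramp
edge_kernel = [-1.0, 1.0]

print(extract_feature(press, edge_kernel))  # strong response (1.0)
print(extract_feature(drift, edge_kernel))  # weak response (~0.1)
```

A real classifier would stack several such channels and feed the pooled features to a small dense layer; this sketch only shows why an edge-sensitive kernel discriminates pressing from drift.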

21 pages, 1821 KB  
Review
Tactile and Visual Artificial Synaptic Devices: Progress and Challenges
by Zhifeng Chen, Chengying Chen and Yufei Huang
Electron. Mater. 2026, 7(2), 8; https://doi.org/10.3390/electronicmat7020008 - 15 Apr 2026
Abstract
The von Neumann architecture faces a “memory wall” problem due to the physical separation of memory and processor, posing major challenges to energy efficiency and latency in the era of artificial intelligence. To overcome these bottlenecks, artificial synaptic devices inspired by biological systems have emerged as an important research direction. By integrating sensing and computing functions at the device level, these architectures provide a promising approach for the efficient processing of natural physical signals. Supported by advances in functional materials and artificial neural network (ANN) algorithms, artificial synaptic devices are capable of perceiving and processing various external stimuli, showing strong potential for applications in intelligent electronic skins, robotics, and edge computing. This review provides a comprehensive overview of recent advances in artificial synaptic devices, with particular emphasis on tactile and visual sensing applications. We discuss representative device types and operating mechanisms, analyze critical challenges from the perspectives of material engineering and functional integration, and further summarize potential solutions and future trends toward multimodal sensory–memory–computing systems. Full article
(This article belongs to the Special Issue Emerging Trends in Electronic Materials and Functional Nanostructures)

27 pages, 6782 KB  
Article
Development and Evaluation of a Data Glove-Based System for Assisting Puzzle Solving
by Shashank Srikanth Bharadwaj, Kazuma Sato and Lei Jing
Sensors 2026, 26(8), 2341; https://doi.org/10.3390/s26082341 - 10 Apr 2026
Abstract
Many hands-on tasks remain difficult to fully automate because they require human dexterity and flexible object handling. Data gloves offer a promising interface for sensing hand–object interactions, but most prior systems focus on gesture recognition or object classification rather than closed-loop, step-by-step task guidance. In this work, we develop and evaluate a tactile-sensing operation support system using an e-textile data glove with 88 pressure sensors, a tactile pressure sheet for placement verification, and a GUI that provides step-by-step instructions. As a core component, a CNN classifies the grasped state as bare hand or one of four discs with 93.3% accuracy using 16,175 training samples collected from five participants. In a user study on the Tower of Hanoi task as a controlled proxy for multi-step manipulation, the system reduced mean solving time by 51.5% (from 242.6 s to 117.8 s), reduced the number of disc movements (35.4 to 15, about 20 fewer moves on average), and lowered perceived workload (NASA-TLX) by 53.1% (from 68.5 to 32.1), while achieving a SUS score of 75. These results demonstrate the feasibility of tactile-based step verification and guidance in a controlled multi-step task; broader generalization requires evaluation with larger and more diverse participant groups and tasks. Full article
(This article belongs to the Section Intelligent Sensors)
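The Tower of Hanoi proxy task above has a known optimum of 2^n − 1 moves, which puts the reported reduction to 15 moves for four discs in context (15 is exactly optimal). A short recursive sketch:

```python
# Sketch: optimal Tower of Hanoi move sequence. For the four-disc setup
# used in the study, the optimum is 2**4 - 1 = 15 moves, matching the
# average move count reported with system guidance.

def hanoi_moves(n, src="A", dst="C", aux="B"):
    """Return the optimal (src, dst) move sequence for n discs."""
    if n == 0:
        return []
    return (hanoi_moves(n - 1, src, aux, dst)   # park n-1 discs on aux
            + [(src, dst)]                       # move the largest disc
            + hanoi_moves(n - 1, aux, dst, src)) # restack on top of it

print(len(hanoi_moves(4)))  # 15
```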

17 pages, 2659 KB  
Article
Estimation of Fingertip Contact Angle from Tactile Pressure Contours
by Qianqian Tian, Jixiao Liu, Funing Hou and Shijie Guo
Appl. Sci. 2026, 16(7), 3172; https://doi.org/10.3390/app16073172 - 25 Mar 2026
Abstract
Tactile sensing is an important perceptual modality that enables robots to understand human contact behaviors. Estimating the fingertip contact angle based on tactile pressure distribution provides a simplified representation of the finger’s contact configuration and supports tactile-based perception in human–robot interaction. However, the relationship between tactile pressure distributions and fingertip contact configuration remains insufficiently understood. In this study, a simplified contact mechanics model was employed to investigate the relationship between tactile pressure characteristics and fingertip contact conditions. Theoretical analysis indicates that both the contact area and the contour dimensions of the pressure distribution are influenced by the contact angle and contact force, with varying sensitivities in different directions to these factors. Based on this theory, simplified finite element modeling of the fingertip and multi-subject experiments were conducted. The deformation behavior of the contact region under different contact angles and contact forces was analyzed. The experimental results were generally consistent with the theoretical analysis. Furthermore, contour descriptors were extracted from the tactile pressure distribution to establish a relationship model for estimating the fingertip contact angle, and the model’s accuracy was analyzed. The experimental results indicate that the extracted contour features exhibit systematic variations with contact angle, and the proposed method achieves a mean absolute error (MAE) of 2.73° and a root mean square error (RMSE) of 7.25°. These results demonstrate that tactile pressure contours provide an effective and computationally efficient cue for estimating fingertip contact configuration. This approach may help robots understand human behavior and has potential applications in human–robot interaction and robotic grasping. Full article
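The MAE (2.73°) and RMSE (7.25°) quoted above are standard error metrics; the gap between them indicates a few large outlier errors alongside mostly small ones. A minimal sketch with hypothetical angle data (not the authors' measurements):

```python
# Sketch: MAE and RMSE as used to report contact-angle estimation error.
# The angle values below are made up for illustration.
import math

def mae(true, pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(true, pred)) / len(true)

def rmse(true, pred):
    """Root mean square error -- penalises large errors more than MAE."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(true, pred)) / len(true))

true_angles = [10.0, 20.0, 30.0, 40.0]   # hypothetical ground truth (deg)
pred_angles = [12.0, 19.0, 27.0, 41.0]   # hypothetical estimates (deg)

print(mae(true_angles, pred_angles))    # 1.75
print(rmse(true_angles, pred_angles))   # ~1.94 (RMSE >= MAE always)
```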

14 pages, 3030 KB  
Article
Universal High-Resolution Copper Patterning on Diverse Substrates via Sequential Laser-Induced Transfer and Electroless Plating
by Yaqiang Ji, Juexuan Xu, Weibin Yin, Yuhao Huang, Ru Pan and Yiming Chen
Micromachines 2026, 17(4), 391; https://doi.org/10.3390/mi17040391 - 24 Mar 2026
Abstract
The fabrication of high-resolution and mechanically robust copper patterns remains a critical challenge in flexible electronics. Here, we present a universal metallization strategy that combines sequential two-step laser transfer, including laser-induced backward transfer and laser-induced forward transfer, with subsequent electroless copper plating. In this approach, laser-induced backward transfer first generates a transferable copper particle donor layer; subsequently, laser-induced forward transfer selectively embeds these catalytic copper particles into the surface of target substrates, constructing spatially confined activation networks while minimizing direct thermal exposure. These embedded seeds are then amplified into continuous copper conductors via electroless copper plating, achieving a high-resolution pattern (average minimum linewidth of approximately 20 μm) with robust interfacial integrity. Benefiting from laser-induced mechanical interlocking, the resulting copper patterns exhibit a low electrical resistivity of ~2.0 × 10⁻⁸ Ω·m (comparable to bulk copper) and maintain stable electromechanical performance even after 8000 bending cycles across a radius range of 3 to 6 mm. Furthermore, the fabricated versatile electrodes are successfully integrated into a triboelectric nanogenerator for tactile sensing and Morse code transmission. With its inherent substrate universality (e.g., polyimide, wood, fabric, and paper) and process scalability, this strategy provides a versatile route for manufacturing reliable copper electrodes in next-generation flexible electronic systems. Full article
(This article belongs to the Special Issue Optical and Laser Material Processing, 2nd Edition)

25 pages, 2904 KB  
Article
Modeling and Design of a Soft Capacitive Slip Sensor with Fluid Dielectric Interlayer
by Elia Landi, Tommaso Lisini Baldi, Michele Pallaoro, Federico Micheletti, Federico Carli and Ada Fort
Micromachines 2026, 17(3), 349; https://doi.org/10.3390/mi17030349 - 12 Mar 2026
Abstract
This paper presents the design, modeling, and experimental validation of a capacitive tactile sensor specifically conceived to sense shear-driven contact dynamics in robotic manipulation. The proposed device is a layered flexible capacitive structure, in which controlled tangential interactions are induced. The electrode design maximizes sensitivity to shear motion and promotes an isotropic response with respect to slip direction, thereby addressing two key limitations that affect the majority of existing slip-sensing technologies. An analytical model was developed to describe the essential relationship between shear-induced displacements and the electrical response, providing insight into the design parameters and supporting the selection of geometry and materials. To test the sensor in real conditions, a dedicated capacitive readout circuit based on high-frequency excitation and synchronous demodulation was developed to robustly acquire capacitance variations while rejecting static offsets and parasitic effects. Several formulations for the interposed dielectric layer material were investigated, including viscous fluids and composite mixtures with high-permittivity nanoparticles, with the aim of improving electrical sensitivity while preserving mechanical stability. Experimental results obtained under controlled loading and sliding conditions demonstrate that the sensor is highly sensitive to changes in contact state and tangential interaction dynamics. The sensor responded consistently to both load-induced shear and slip-related phenomena, enabling the reliable monitoring of contact dynamics rather than binary slip detection. A proof-of-concept integration into a robotic finger confirms the suitability of the proposed approach for grasp monitoring. Full article
(This article belongs to the Special Issue Emerging Trends in Soft Robotics and Bioinspired Technologies)

23 pages, 5494 KB  
Article
A Hybrid-Frequency Sampling Tactile Sensing System Based on a Flexible Piezoresistive Sensor Array: Design and Dynamic Loading Validation
by Zhenxing Wang and Xuan Dou
Sensors 2026, 26(5), 1559; https://doi.org/10.3390/s26051559 - 2 Mar 2026
Abstract
A hybrid-frequency sampling tactile sensing system based on a flexible piezoresistive sensor array is presented for reliable and real-time tactile perception under dynamic loading conditions. While recent studies have developed multi-channel tactile arrays, most systems remain limited by time-dependent drift in channel responses, inconsistent dynamic behavior, or insufficient temporal resolution under simultaneous loading. In this work, a system-level design integrating a flexible piezoresistive sensor array with a real-time data acquisition module is developed, incorporating a hybrid-frequency sampling strategy to reduce system complexity while preserving reliable dynamic response in key sensing channels. Register-Transfer Level (RTL) simulation verified that the hardware scheduler rigorously executed the deterministic scanning logic, demonstrating a strict one-to-one correspondence with the physical hardware signals. The array consists of 34 piezoresistive sensing nodes embedded in an elastomeric substrate. Under the implemented hybrid-frequency sampling scheme, the system achieves an overall effective acquisition bandwidth of approximately 36.9 kHz, while maintaining a repeatability better than 4.9% and robust mechanical durability under cyclic bending deformation. Dynamic loading validation was performed using a self-developed pressure comparison platform for measuring the normal contact force applied on the tactile surface, serving as ground-truth data to verify that the voltages acquired by the proposed system accurately correspond to the actual applied force. Quantitative analysis shows a strong linear correlation (R² ≈ 0.98) between the e-skin outputs and the reference forces. The recorded responses exhibit clear intensity-dependent trends and good temporal correspondence among sensing nodes, successfully distinguishing tactile stimuli such as gentle tapping, moderate pressing, and firm contact. The system also captures dynamic tactile responses during finger stroking, showing characteristic multi-unit activation patterns under spatiotemporally varying contact conditions. Compared with previously reported tactile systems typically operating below 100 Hz, the proposed design achieves an approximately 10× enhancement in effective sampling capability while significantly reducing system complexity through hybrid-frequency sampling, thereby supporting reliable dynamic tactile sensing in multi-unit arrays. These results demonstrate that the proposed system provides a practical and scalable hardware platform for dynamic tactile sensing in robotics, human–machine interaction, and wearable tactile systems. Full article
(This article belongs to the Special Issue Advanced Flexible Electronics for Sensing Application)
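One plausible reading of the hybrid-frequency strategy above is that key channels are read in every scan slot while the remaining channels are polled round-robin, keeping fast channels at full temporal resolution without scanning all 34 nodes at the top rate. The channel counts and interleave ratio below are illustrative assumptions, not the paper's RTL design.

```python
# Sketch: a hybrid-frequency scan schedule. Fast channels appear in every
# slot; slow channels share slots round-robin, so each slow channel is
# sampled at 1/len(slow) of the slot rate. Illustrative only.

def hybrid_schedule(fast, slow, n_slots):
    """Return the list of channels read in each scan slot."""
    schedule = []
    for slot in range(n_slots):
        slow_ch = slow[slot % len(slow)]   # rotate through slow channels
        schedule.append(fast + [slow_ch])
    return schedule

fast_channels = ["F0", "F1"]                # sampled every slot
slow_channels = ["S0", "S1", "S2", "S3"]    # each sampled every 4th slot

for slot, chans in enumerate(hybrid_schedule(fast_channels, slow_channels, 8)):
    print(slot, chans)
```

The design choice this illustrates: total readout work per slot stays constant (len(fast) + 1 reads), so the acquisition hardware can run a fixed, deterministic schedule, which is consistent with the deterministic scanning logic the abstract verifies at RTL.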

14 pages, 22807 KB  
Article
A 3D-Force and Torsion Sensor Using Patterned Color Encoding
by Tak Nok Douglas Yu, Hao Ren and Yajing Shen
Sensors 2026, 26(5), 1534; https://doi.org/10.3390/s26051534 - 28 Feb 2026
Abstract
Current multi-axis force sensors often rely on complex mechanical structures or arrays of discrete transducers, resulting in larger footprints, higher complexity, and limited scalability for compact applications such as robotic fingertips or wearable tactile interfaces. To address these limitations, this paper introduces a novel optical sensing approach that uses a top-layer patterned color surface and an array of color sensors to decouple and measure normal, shear, and torsional forces within a highly compact 15 × 15 mm footprint. The patterned surface functions as a visual encoding layer, where applied forces induce measurable, direction-dependent shifts in reflected color distribution. With multiple color sensors deployed in an array, each sensor captures localized color variations, enabling spatial reconstruction of both magnitude and direction of applied loads through differential color analysis. The sensor’s performance was validated through robotic gripper integration, where it successfully provided multi-axis force feedback and enabled adaptive gripping force adjustment to achieve robust and stable object manipulation. The experimental results confirm the system’s ability to effectively sense 3D forces and torsion, and support closed-loop control in adaptive robotic grasping. This design presents a scalable, low-profile alternative to conventional multi-axis force sensors, suitable for integration into space-constrained robotic and haptic systems. Full article
(This article belongs to the Special Issue Recent Development of Flexible Tactile Sensors and Their Applications)

15 pages, 4240 KB  
Article
A Sliding-Gated Tactile Interface for Smartphone Side-Key Interaction
by Fengyuan Yang, Wenqiang Yin, Chongxiang Pan, Jia Meng, Panpan Zhang and Xiong Pu
Sensors 2026, 26(5), 1436; https://doi.org/10.3390/s26051436 - 25 Feb 2026
Abstract
Achieving precise sliding perception is crucial for enhancing human–machine interactions. Despite the extensive investigation of tactile sensors for static pressure detection, they still face challenges in detecting dynamic information such as sliding direction, speed, pressure and position in interactive touch scenarios. Herein, we propose a self-powered tactile interface that realizes motion-to-electricity generation by electrostatically regulating the carrier concentration and transport in the semiconductive layer with a top gate in sliding movement. This tactile sliding interface can distinguish various dynamic mechanical information by generating voltage signals related to the sliding direction, speed, pressure, and touch position without external bias voltage. By combining machine-learning algorithms, electrical signals of six representative sliding-touch interactions were accurately classified with a recognition accuracy of 98.33%. Furthermore, by integrating sensors into the smartphone’s side button, customizable functions such as volume control, screen unlocking, and music switching were achieved. This work provides an innovative mechanism for sliding sensing in interactive electronic and intelligent control systems. Full article
(This article belongs to the Section Electronic Sensors)

19 pages, 5129 KB  
Article
High-Resolution Contact Localization and Three-Axis Force Estimation with a Sparse Strain-Node Tactile Interface Device
by Yanyan Wu, Hanhan Wu, Yifei Han, Yi Ding, Bosheng Cao and Chongkun Xia
Sensors 2026, 26(4), 1378; https://doi.org/10.3390/s26041378 - 22 Feb 2026
Abstract
High-resolution contact localization and three-axis force estimation are crucial for human–robot interaction and precision manipulation, yet the sensing area is limited by channel density and wiring cost. Sparse strain readout makes joint estimation of location and three-axis force challenging due to cross-axis coupling and nonlinear responses, while dense arrays or extensive calibration increase complexity. We present a sparse strain-node tactile interface device (SSTID) whose three-module layout is optimized via particle swarm optimization to maximize informative response overlap, enabling contact localization (x,y) and three-axis force (Fx,Fy,Fz) estimation using only nine strain channels. We further propose a strain-node contact-state decoding framework (SCDF) implemented with a lightweight multilayer perceptron and trained via a two-stage sim-to-real strategy, including FEM pretraining followed by few-shot real-data adaptation. Experiments demonstrate accurate contact-state decoding with full-workspace characterization, supporting low-cost and scalable deployment of sparse tactile interfaces. Full article
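The SCDF decoder above maps nine strain channels to five outputs (x, y, Fx, Fy, Fz) with a lightweight multilayer perceptron. A forward-pass sketch with the matching input/output shapes is shown below; the weights are zero-effort placeholders, not the paper's FEM-pretrained network.

```python
# Sketch: forward pass of a small MLP with 9 strain inputs and 5 outputs
# (x, y, Fx, Fy, Fz), the shape described in the abstract. Placeholder
# weights only -- the real model is trained sim-to-real (FEM + few-shot).
import math

def mlp_forward(x, layers):
    """layers: list of (weight_matrix, bias) pairs; tanh hidden, linear out."""
    for i, (W, b) in enumerate(layers):
        x = [sum(w * xi for w, xi in zip(row, x)) + bi
             for row, bi in zip(W, b)]
        if i < len(layers) - 1:            # nonlinearity on hidden layers only
            x = [math.tanh(v) for v in x]
    return x

n_in, n_hidden, n_out = 9, 4, 5
W1 = [[0.1] * n_in for _ in range(n_hidden)]   # placeholder weights
b1 = [0.0] * n_hidden
W2 = [[0.1] * n_hidden for _ in range(n_out)]
b2 = [0.0] * n_out

strain_reading = [0.5] * 9                      # nine strain channels
out = mlp_forward(strain_reading, [(W1, b1), (W2, b2)])
print(len(out))  # 5
```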

21 pages, 7792 KB  
Article
Optimization of Magnetic Filler Loading and Interstitial Dielectric Percolation for Tunable Triboelectric–Electromagnetic Hybrid Generators
by Geunchul Kim, Jonghwan Lee, Yuseob Lee, Jihwon Keum, Inkyum Kim and Daewon Kim
Micromachines 2026, 17(2), 231; https://doi.org/10.3390/mi17020231 - 11 Feb 2026
Cited by 1
Abstract
In this study, a material-driven strategy is presented to realize tunable triboelectric–electromagnetic hybrid generators while overcoming the form-factor limitations of conventional magnet-assisted systems. A magneto-dielectric hybrid generator (MDHG) was constructed using a soft magnetized dielectric composite, where NdFeB microparticles were embedded in an Ecoflex matrix and activated by pulse magnetization, allowing a single compliant layer to operate simultaneously as a triboelectric contact medium and a magnetic flux source coupled to a coil. The magnetic filler loading was systematically optimized to elucidate the trade-off between enhanced electromagnetic induction and a non-monotonic triboelectric response governed by dielectric polarization, surface potential, and interfacial energetics. To selectively strengthen the triboelectric branch without sacrificing electromagnetic output, nanoscale BaTiO3 was introduced as an interstitial dielectric phase to promote polarization-active pathways and suppress screening-driven charge-utilization loss. Under contact–separation operation, the optimized MDHG produced triboelectric outputs up to an open-circuit voltage (V_OC) of 400.40 V and a short-circuit current (I_SC) of 56.95 μA, while the electromagnetic branch delivered up to a V_OC of 260.04 mV and an I_SC of 0.89 mA, corresponding to 2.87- and 2.62-fold increases in triboelectric V_OC and I_SC over pristine Ecoflex. Finally, the hybrid signatures enabled a wearable smart-skin interface capable of decoupling touch occurrence, intensity, and counter-material identity. Full article
(This article belongs to the Special Issue Piezoelectric Microdevices for Energy Harvesting)

23 pages, 4117 KB  
Perspective
Haptic and Palpation Sensing for Robotic Surgery: Engineering Perspectives on Design and Integration
by Michael H. Friebe
Sensors 2026, 26(4), 1126; https://doi.org/10.3390/s26041126 - 10 Feb 2026
Abstract
Robotic-assisted surgery (RAS) provides enhanced dexterity and visualisation but remains constrained by the absence of clinically meaningful palpation and haptic feedback. This perspective examines palpation sensing in RAS from an engineering and system-integration standpoint, identifying the lack of tactile information as a major contributor to increased cognitive load, prolonged training, and risk of tissue injury. Recent advances in force, tactile, vibroacoustic, audio, and optical sensor technologies enable quantitative assessment of tissue mechanical properties and often exceed human tactile sensitivity. However, clinical translation is limited by challenges in sensor miniaturisation, sterilisation, robustness and integration and the absence of standardised evaluation metrics. The integration of artificial intelligence and multimodal sensor fusion with intra-operative imaging and augmented visualisation is highlighted as a key strategy to compensate for sensor limitations and biological variability. Dedicated robotic palpation devices and wireless or magnetically coupled probes are discussed as promising transitional solutions. Overall, the restoration of palpation sensing is presented as a prerequisite for improving safety and efficiency and enabling higher levels of autonomy in future RAS platforms. Full article
(This article belongs to the Special Issue Intelligent Optical Sensors in Biomedicine and Robotics)

22 pages, 2447 KB  
Article
Word-Level Motion Learning for Contactless QWERTY Typing with a Single Camera
by Sung-Sic Yoo and Heung-Shik Lee
Sensors 2026, 26(4), 1087; https://doi.org/10.3390/s26041087 - 7 Feb 2026
Abstract
Contactless text entry is increasingly important in immersive and constrained computing environments, yet most vision-based approaches rely on character-level recognition or key localization, which are fragile under monocular sensing. This study investigates the feasibility of recognizing natural QWERTY typing motions directly at the word level using only a single RGB camera, under a fixed single-user and single-camera configuration. We propose a word-level contactless typing framework that models each word as a distinctive spatiotemporal finger motion pattern derived from hand joint trajectories. Typing motions are temporally segmented, and direction-aware finger displacements are accumulated to construct compact motion representations that are relatively insensitive to absolute hand position and typing duration within the evaluated setup. Each word is represented by multiple motion prototypes that are incrementally updated through online learning with a trial-delayed adaptation protocol. Experiments with vocabularies of up to 200 words show that the proposed approach progressively learns and recalls word-level motion patterns through repeated interaction, achieving stable recognition performance within the tested configuration at realistic typing speeds. Additional evaluations demonstrate that learned motion representations can transfer from physical keyboards to flat-surface typing within the same experimental setting, even when tactile feedback and visual layout cues are reduced. These results support the feasibility of reframing contactless typing as a word-level motion recall problem, and suggest its potential role as a complementary component to character-centric camera-based input methods under constrained monocular sensing. Full article
(This article belongs to the Topic AI Sensors and Transducers)

34 pages, 11602 KB  
Article
Embodied Sensory Experience and Spatial Mapping in Damascene Courtyard Domestic Architecture
by Rasil Sahlabji and Afet Coşkun
Buildings 2026, 16(3), 555; https://doi.org/10.3390/buildings16030555 - 29 Jan 2026
Abstract
Sensory mapping in architecture lacks a guiding theoretical model, leaving practitioners without a clear way to relate spatial design to embodied experience. This study introduces a structured methodology that links phenomenological observation with affordance theory and sensory semiotics, framing sensory data within architectural contexts. Fieldwork in fourteen courtyard houses of Damascus had residents trace their movements on simplified floor plans, switching colors as sight, sound, touch, smell and taste became dominant. The analysis reveals that visitors pass through a narrow entry corridor, enter the courtyard, and converge at the central fountain, which emerges as a focal point for multiple senses. Residents consistently trace tactile interactions along the fountain’s stone rim and at raised benches in the liwan (open space). Gustatory (taste-related, food-linked) markers appear along the route from kitchen thresholds toward the fountain, suggesting how food preparation and communal gathering overlap. Using 28 sensory maps and a three-level analytical process (comparison, synthesis, and spatial interpretation), the study produced a unified sensory map of the Damascene courtyard house. This pattern highlights how sequential spatial arrangements shape sensory engagement and suggests conservation strategies that preserve these experiential pathways. Architects and conservators can reinforce welcome gestures at thresholds and design water features and planting schemes that invite lingering. The proposed methodology fills the theoretical gap and offers clear guidelines for crafting spaces that respond to human perception. Full article
(This article belongs to the Section Architectural Design, Urban Science, and Real Estate)

14 pages, 8570 KB  
Article
Enhancing Robotic Grasping Detection Using Visual–Tactile Fusion Perception
by Dongyuan Zheng and Yahong Chen
Sensors 2026, 26(2), 724; https://doi.org/10.3390/s26020724 - 21 Jan 2026
Abstract
With the advancement of tactile sensors, researchers increasingly integrate tactile perception into robotics, but only for tasks such as object reconstruction, classification, recognition, and grasp state assessment. In this paper, we rethink the relationship between visual and tactile perception and propose a novel robotic grasping detection method based on visual–tactile perception. Initially, we construct a visual–tactile dataset containing the grasp stability for each potential grasping position. Next, we introduce a novel Grasp Stability Prediction Module (GSPM) to generate a grasp stability probability map, providing prior knowledge regarding grasp stability to the grasp detection network for each possible grasp position. Finally, the map is multiplied element-wise with the corresponding colored image and inputted into the grasp detection network. Experimental results demonstrate that our novel visual–tactile fusion method significantly enhances robotic grasping detection accuracy. Full article
(This article belongs to the Section Sensors and Robotics)
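The stability-prior step above, multiplying the grasp-stability probability map element-wise with the image before grasp detection, is a simple Hadamard product. A minimal single-channel sketch (a 2×2 toy grid, not the authors' data or network input format):

```python
# Sketch: element-wise (Hadamard) product of an image with a grasp-
# stability probability map, so higher-stability regions are emphasised
# before the result goes to the grasp detection network. Toy 2x2
# single-channel example; real inputs would be full-resolution colour images.

def apply_stability_prior(image, stability_map):
    """Multiply each pixel by its grasp-stability probability."""
    return [[px * p for px, p in zip(img_row, map_row)]
            for img_row, map_row in zip(image, stability_map)]

image = [[200, 100],
         [ 50, 255]]
stability = [[0.5, 0.25],     # probabilities in [0, 1]
             [0.5, 1.0 ]]

print(apply_stability_prior(image, stability))  # [[100.0, 25.0], [25.0, 255.0]]
```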
