
Human–Computer Interaction in Sensor Systems

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 25 September 2026 | Viewed by 2998

Special Issue Editors


Guest Editor
Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, 2000 Maribor, Slovenia
Interests: empirical research methods; operations research; behavioral operations research; sensor-based process optimization

Guest Editor
Faculty of Electrical Engineering and Computer Science, University of Maribor, Koroška cesta 46, 2000 Maribor, Slovenia
Interests: computer–human interaction; user experience; IoT; web technology; intelligent user interfaces; accessibility; technologies for touchless HCI

Special Issue Information

Dear Colleagues,

Advances in sensor technologies are reshaping how humans experience and interact with digital systems. Embedded in mobile devices, wearables, smart environments, and industrial platforms, sensors enable continuous monitoring and create new opportunities for intuitive, adaptive, and context-aware interaction. This Special Issue focuses on the human-centered design, usability, accessibility, and inclusivity of sensor-driven systems. Our aim is to bridge sensor data with user experience (UX) research, highlighting how sensing can empower more inclusive, trustworthy, and practical applications in everyday life. While artificial intelligence may support data processing and adaptation, the core emphasis is on interaction design, evaluation, and real-world adoption.

We invite research articles, reviews, and case studies on the following topics:

  • Usability, accessibility, and inclusion in sensor-based systems, including wearables, smart glasses, and immersive XR (VR/AR/MR) environments.
  • Digital interactive solutions that foster participation in the digital society for all users, including those with impairments.
  • Development and testing of advanced assistive technologies using sensors and intelligent software (e.g., mobility support, communication aids, and adaptive educational tools).
  • Adaptive and context-aware interfaces informed by sensor data.
  • Multimodal interaction using vision, speech, gestures, or physiological signals.
  • Applications in healthcare, education, smart cities, and industry.
  • Participatory and co-design methods for human-centered sensor systems.
  • Social and ethical challenges such as privacy, inclusivity, transparency, and trust in sensor-driven interaction.

The goal is to provide a multidisciplinary platform for researchers and practitioners to share advances that bridge sensing technologies with human-centered and inclusive design, ultimately fostering more equitable, usable, and responsible interactive systems that improve everyday user experience and support participation in the digital society.

Dr. Maja Pušnik
Dr. Boštjan Šumak
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • human–computer interaction (HCI)
  • sensor-based systems
  • usability, accessibility, and user experience (UX)
  • digital inclusion and assistive technologies
  • smart glasses and immersive XR (VR/AR/MR)
  • multimodal interaction
  • context-aware systems
  • wearable and mobile sensors
  • smart environments
  • participatory design and co-creation
  • privacy, trust, and ethics

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research

18 pages, 9168 KB  
Article
Shared-Control HMI for Tactile-First Traversal: Offline Counterfactual Evaluation with Haptic Safety Projection
by Adam Mark Mazurick and Alex Ferworn
Sensors 2026, 26(9), 2719; https://doi.org/10.3390/s26092719 - 28 Apr 2026
Viewed by 277
Abstract
Supervising tactile-first robotic traversal in confined, uncertain spaces poses a challenge: operators must be able to intervene without continuous micromanagement. We present a human–machine interface (HMI) that blends operator commands with safety-constrained autonomy and surfaces risk through synthesized predictive haptic alerts. Using offline, log-driven replay of 660 trials, we counterfactually evaluate this HMI without new user studies. Results show consistent improvements: predicted collisions decrease, minimum clearance increases, traversal time and path length improve, and the traversability certificate margin rises. Operator–autonomy disagreement is reduced, with smoother control and fewer heading reversals, particularly under algorithms M2 and M3. Importantly, the synthesized haptic alerts anticipate safety-critical events with positive lead time, achieving high precision and recall as objective measures of informativeness. Together, these findings indicate that shared-control blending with tactile-first autonomy can enhance safety, efficiency, and assurance while reducing conflict between operator intent and autonomy. Contributions include the method (counterfactual shared control with safety projection), metrics for safety/efficiency/assurance/conflict, empirical results across 660 trials, and release of replay and haptic-synthesis artifacts. This positions tactile-first HMI as a practical pathway for safe, low-overhead operator supervision in vision-denied, contact-rich environments.
(This article belongs to the Special Issue Human–Computer Interaction in Sensor Systems)

45 pages, 8329 KB  
Article
HRV-Based Multimodal Physiological Signal Monitoring Using Wearable Biosensors in Human–Computer Interaction: Cognitive Load in Real-Time Strategy Games
by Yunlong Shi, Muyesaier Kuerban, Yiyang Jin, Chaoyue Wang and Lu Chen
Sensors 2026, 26(7), 2181; https://doi.org/10.3390/s26072181 - 1 Apr 2026
Viewed by 916
Abstract
Real-time strategy (RTS) games provide a cognitively demanding and ecologically valid context for investigating workload dynamics in human–computer interaction (HCI). This multimodal study (HRV, NASA-TLX, behavior, interviews) examined multitasking, visual complexity, and decision pressure in 36 novice RTS players. High multitasking significantly increased subjective workload (total raw-TLX: from 22.50 ± 14.65 to 36.47 ± 20.19, p < 0.001) and prolonged completion time (from 317.17 ± 37.26 s to 354.92 ± 50.70 s, p < 0.001). Decision pressure elevated subjective workload (total raw-TLX: from 20 to 28, p = 0.008) without affecting performance. Although HRV did not consistently differentiate experimental conditions at the group level, it showed stable individual-level associations with perceived workload—both in expected directions (e.g., LF power positively correlated with total raw-TLX across four experiments, r = 0.28–0.53, all p < 0.05) and in inverse relationships that deviate from conventional stress models (e.g., stress index negatively correlated with total raw-TLX, r = −0.34 to −0.40, all p < 0.01). These findings suggest that autonomic responses in complex interactive environments may reflect dynamic engagement processes rather than uniform stress activation, supporting multimodal cognitive load assessment and offering transferable insights for interface design and workload evaluation in demanding HCI contexts.
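The "total raw-TLX" figures quoted in this abstract follow the common Raw TLX (RTLX) convention, in which the six NASA-TLX subscale ratings are combined without the pairwise weighting step. A minimal sketch of that conventional scoring rule follows; this is not code from the paper, and the ratings in the example are purely illustrative:

```python
from statistics import mean

def raw_tlx(ratings):
    """Raw NASA-TLX (RTLX): unweighted mean of the six subscale ratings.

    Subscales: mental demand, physical demand, temporal demand,
    performance, effort, and frustration, each rated on a 0-100 scale.
    """
    if len(ratings) != 6:
        raise ValueError("raw TLX needs exactly six subscale ratings")
    return mean(ratings)

# Illustrative ratings only (not data from the study):
print(raw_tlx([40, 10, 35, 20, 45, 30]))  # 30
```

Note that some studies instead report the sum of the six subscales; the abstract's score range is consistent with either convention, so the averaging shown here is an assumption.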
(This article belongs to the Special Issue Human–Computer Interaction in Sensor Systems)

31 pages, 3407 KB  
Article
Usability Testing and the System Usability Scale Effectiveness Assessment on Different Sensing Devices of Prototype and Live Web System Counterpart
by Josip Lorincz, Katarina Barišić and Vjeran Vlahović
Sensors 2026, 26(2), 679; https://doi.org/10.3390/s26020679 - 20 Jan 2026
Cited by 1 | Viewed by 1298
Abstract
During the process of digital-system development from prototype to live implementation, differences in user interactions, perceived usability, and overall satisfaction can emerge. These differences often arise from factors such as the fidelity of the software prototype, the limitations of the prototyping tool, and the complexity of the live digital system. Recognizing these potential usability discrepancies between prototypes and live digital systems, assessing how well user experience (UX) test approaches, such as usability testing and the System Usability Scale (SUS), reflect the UX of a digital-system prototype and its counterpart live deployed system emerged as an important research gap. To address this gap, this study compares usability testing and SUS results between a Figma web prototype and its counterpart live web system, using the telecom service-extension process as a representative digital-system case study. The study involved a total of 10 participants across the Figma prototype and live-web-system test environments, using different sensing devices, including various types of mobile phones. The study reports the overlap in perceived usability issues found for the same digital product in both testing environments, as experienced on different types of mobile sensing devices. Usability testing results are reported as the frequency of occurrence of web-system usability issues and their corresponding severity levels. The results demonstrate that prototype testing is highly effective for detecting a wide range of usability issues early in the digital-product development phase. The paper also evaluates the predictive capability of SUS assessment for the Figma web prototype and its counterpart live web system during digital-product development. The results show that SUS evaluation applied to digital-system prototype testing can provide, early in the development process, a reliable indication of the perceived usability of the counterpart digital system once it is developed and deployed. These findings offer valuable guidance for software designers and developers seeking to build prototypes and their counterpart deployed digital products with improved overall user experience.
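For readers unfamiliar with the SUS scores this abstract refers to, the System Usability Scale is computed from ten 1–5 Likert items with a fixed rescaling to a 0–100 range. A minimal sketch of the standard scoring rule (not code from the paper):

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are multiplied by 2.5, giving a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Best-case responses: "agree" on positive items, "disagree" on negative ones
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

Neutral responses (all 3s) yield a score of 50; published benchmarks often treat scores around 68 as average, though interpretation thresholds vary by source.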
(This article belongs to the Special Issue Human–Computer Interaction in Sensor Systems)
