
Meta-User Interfaces for Ambient Environments

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Internet of Things".

Deadline for manuscript submissions: 20 May 2024 | Viewed by 19273

Special Issue Editors


Prof. Dr. Marco Romano
Guest Editor
Faculty of Political Science and Sociopsychological Dynamics, Università degli Studi Internazionali di Roma, Via Cristoforo Colombo 200, 00147 Rome, Italy
Interests: UX; interaction design; learning experience design; mobile applications; smart community; smart city; robotics; IoT; AI; AR

Prof. Dr. Phillip C-Y. Sheu
Guest Editor
Electrical Engineering and Computer Science, University of California, Irvine, USA
Interests: semantic computing; robotic computing; artificial intelligence; biomedical computing; multimedia computing

Prof. Dr. Giuliana Vitiello
Guest Editor
Department of Computer Science, University of Salerno, Fisciano, Italy

Special Issue Information

Dear Colleagues,

Sensor-driven systems enable new forms of interaction, such as cross-device interaction, that can improve user engagement in ubicomp spaces. This drives research toward the design of new meta-user interfaces that let users interact with the surrounding space through different devices and in different forms, depending on the characteristics of the ambient environment. The main usability issues concern the spontaneous discoverability of the meta-user interface, the feedforward of the interaction, and the kinds of feedback provided.

Contributions to this Special Issue are expected to push the boundaries of user interaction within the ambient environment, exploring how sensor technologies can facilitate the design of cross-device interaction and how they can improve the usability of meta-user interfaces so as to gradually engage users in the surrounding space.

Topics of interest include (but are not limited to) the following:

  • Predictive interaction
  • Interaction in IoT
  • Cross-device interaction
  • Ubiquitous interaction
  • Mixed reality in the ambient environment
  • Adaptive and context-aware interfaces
  • Full-body interaction
  • Multitouch interaction
  • Haptic feedback
  • Gestural interaction
  • User engagement

These topics apply to smart application domains such as the following:

  • Domotics
  • Healthcare
  • Cultural heritage
  • Smart communities and smart cities
  • Smart industry
  • Smart farming
  • Technology-enhanced learning
  • Vehicle and environment interaction
  • Human-vehicle interaction

Prof. Dr. Marco Romano
Prof. Dr. Phillip C-Y. Sheu
Prof. Dr. Giuliana Vitiello
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Internet of Things (IoT)
  • Human-computer interaction
  • Artificial intelligence (AI)
  • Smart ecosystems
  • Smart/intelligent sensors
  • Action recognition
  • Wearable sensors, devices, and electronics
  • Mixed and augmented reality
  • User engagement
  • Cross-device interaction

Published Papers (7 papers)


Research

15 pages, 13421 KiB  
Article
TactCube: An Intelligent Device to ‘converse’ with Smart Environments
by Pietro Battistoni, Marianna Di Gregorio, Marco Romano, Monica Sebillo and Giuliana Vitiello
Sensors 2022, 22(14), 5235; https://doi.org/10.3390/s22145235 - 13 Jul 2022
Viewed by 1353
Abstract
Ambient Intelligence is a vision of daily life in which intelligent devices interact with humans to make their lives easier while the technology remains invisible. Artificial Intelligence (AI) governs this smart environment and must interact with humans to best meet their needs and demands. Although voice assistants are very popular and efficient as conversational AI, under some conditions they cannot be used. This work therefore proposes a complementary tactile and tangible interface for conversing with AI, creating new Tactile Signs. A prototype of TactCube, a wireless cube-shaped device that can interact with AI using only the tactile sense, is presented. The hypothesis is that TactCube can be manipulated with one hand to generate sequences of numbers that a neural network can interpret as a new tactile language. The paper describes the initial research conducted to define how these sequences can be generated and demonstrates how TactCube is able to do so.
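
To make the interpretation step concrete, below is a minimal, hypothetical Python sketch: a small neural-network classifier trained to map toy digit sequences, standing in for TactCube manipulation traces, to "tactile sign" classes. The sequence length, encoding, and number of classes are illustrative assumptions, not the authors' actual design.

```python
# Hypothetical sketch: classify fixed-length digit sequences produced by a
# cube-shaped tangible device into "tactile sign" classes. The encoding
# (one digit per manipulation step) and the label set are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
SEQ_LEN, N_CLASSES = 8, 5  # assumed sequence length and sign vocabulary size

# Toy data: each "sign" is a noisy variation of a prototype digit sequence.
prototypes = rng.integers(0, 10, size=(N_CLASSES, SEQ_LEN))
X = np.vstack([p + rng.integers(-1, 2, size=(200, SEQ_LEN)) for p in prototypes])
y = np.repeat(np.arange(N_CLASSES), 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```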

17 pages, 2876 KiB  
Article
Designing Interfaces to Display Sensor Data: A Case Study in the Human-Building Interaction Field Targeting a University Community
by Chiara Ceccarini, Silvia Mirri and Catia Prandi
Sensors 2022, 22(9), 3361; https://doi.org/10.3390/s22093361 - 27 Apr 2022
Cited by 3 | Viewed by 2200
Abstract
The spread of smart buildings equipped with Building Information Modeling (BIM) and Building Management Systems (BMS) has created a large amount of data, including data coming from sensors. These data are intended for the monitoring of building conditions by authorized personnel and are not available to all building occupants. In this paper, we evaluate, from a qualitative point of view, whether a user interface designed for a specific community can increase occupants' context awareness of environmental conditions within a building, supporting them in making more informed decisions that best suit their needs. We designed a user interface addressed to the student community of a smart campus, adopting an iterative design cycle methodology, and engaged 48 students by means of structured interviews with the aim of collecting their feedback and conducting a qualitative analysis. The results show the interest of this community in having access to information about environmental data within smart campus buildings. For example, students were more interested in data about temperature and brightness than about humidity. As a further result of this study, we extrapolated a series of design recommendations to support the creation of map-based user interfaces, which we found to be effective in such contexts.

19 pages, 43029 KiB  
Article
Interaction Design Patterns for Augmented Reality Fitting Rooms
by Pietro Battistoni, Marianna Di Gregorio, Marco Romano, Monica Sebillo, Giuliana Vitiello and Alessandro Brancaccio
Sensors 2022, 22(3), 982; https://doi.org/10.3390/s22030982 - 27 Jan 2022
Cited by 6 | Viewed by 3953
Abstract
In this work, we explore the role of augmented reality as a meta-user interface, with particular reference to its applications for interactive fitting room systems and its impact on the related shopping experience. Starting from the literature and existing systems, we synthesized a set of nine interaction design patterns for developing AR fitting rooms and supporting the shopping experience. The patterns were evaluated through a focus group with potential stakeholders, with the aim of assessing and envisioning their effects on the shopping experience. The focus group analysis shows that the shopping experience of an AR fitting room based on the proposed patterns is influenced by three main factors: the perceived utility, the ability to generate interest and curiosity, and the perceived comfort of the interaction and of the environment in which the system is installed. As a further result, the study shows that the patterns can successfully support these factors, although some elements that emerged from the focus group should be investigated further and taken into consideration by designers.

21 pages, 1855 KiB  
Article
Patterns for Visual Management in Industry 4.0
by Giuseppe Fenza, Vincenzo Loia and Giancarlo Nota
Sensors 2021, 21(19), 6440; https://doi.org/10.3390/s21196440 - 27 Sep 2021
Cited by 7 | Viewed by 2691
Abstract
The technologies of Industry 4.0 provide an opportunity to improve the effectiveness of Visual Management in manufacturing. The opportunity for improvement is twofold: on the one hand, Visual Management theory and practice can inspire the design of new software tools suitable for Industry 4.0; on the other, Industry 4.0 technology can be used to increase the effectiveness of visual software tools. The paper first explores how theoretical results on Visual Management can be used as guidelines to improve human-computer interaction; a methodology is then proposed for the design of visual patterns for manufacturing. Four visual patterns are presented that contribute to the solution of problems frequently encountered in discrete manufacturing industries; these patterns help to solve planning and control problems, thus providing support to various management functions. Positive implications of this research concern people engagement and empowerment, as well as improved problem solving, decision-making, and management of manufacturing processes.

15 pages, 2283 KiB  
Article
EXecutive-Functions Innovative Tool (EXIT 360°): A Usability and User Experience Study of an Original 360°-Based Assessment Instrument
by Francesca Borgnis, Francesca Baglio, Elisa Pedroli, Federica Rossetto, Sara Isernia, Lidia Uccellatore, Giuseppe Riva and Pietro Cipresso
Sensors 2021, 21(17), 5867; https://doi.org/10.3390/s21175867 - 31 Aug 2021
Cited by 7 | Viewed by 2208
Abstract
Over the last few decades, several studies have shown the feasibility, acceptability, and efficacy of VR-based instruments in the early evaluation of executive dysfunction (ED) in psychiatric and neurologic conditions. Given the negative impact of ED on everyday functioning, identifying innovative strategies for evaluating ED allows clinicians to detect executive impairment early and minimize its effects. This work aimed to test the usability and user experience (UX) of the EXecutive-functions Innovative Tool 360° (EXIT 360°), a 360°-based tool for assessing ED. Seventy-six healthy subjects underwent an evaluation that involved (1) a usability assessment using the System Usability Scale and (2) an evaluation of UX using the ICT-Sense of Presence and UX Questionnaires. Results showed a satisfactory level of usability (mean = 75.9 ± 12.8), with good scores for usability and learnability. As regards UX, EXIT 360° showed an absence of negative effects (mean = 1.79 ± 0.95) and high scores in ecological validity (mean = 4.32 ± 0.54) and engagement (mean = 3.76 ± 0.56). Moreover, it obtained good scores in efficiency (mean = 1.84 ± 0.84), originality (mean = 2.49 ± 0.71), and attractiveness (mean = 1.93 ± 0.98). Interestingly, demographic characteristics and technological expertise had no impact on performance (p > 0.05). Overall, EXIT 360° appeared to be a usable, easy-to-learn, engaging, and creative tool with negligible negative effects. Further studies will be conducted to evaluate these aspects in clinical populations.
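
For context on the reported usability score, the System Usability Scale is computed with a fixed rule: each odd-numbered item contributes its score minus 1, each even-numbered item contributes 5 minus its score, and the summed contributions are scaled by 2.5 onto a 0-100 range. A short Python sketch of this standard scoring follows; the participant responses are invented for illustration.

```python
# Standard SUS scoring (0-100); responses are ten Likert answers in 1-5.
from statistics import mean, stdev

def sus_score(responses):
    """Score one participant's ten SUS item responses (items 1..10)."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten answers in the range 1-5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # indices 0,2,4,... are odd items
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

# Invented responses for three hypothetical participants.
participants = [
    [4, 2, 5, 1, 4, 2, 5, 2, 4, 1],
    [5, 1, 4, 2, 5, 1, 4, 1, 5, 2],
    [3, 3, 4, 2, 3, 2, 4, 3, 3, 2],
]
scores = [sus_score(p) for p in participants]
print(f"mean SUS = {mean(scores):.1f} ± {stdev(scores):.1f}")
```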

25 pages, 1797 KiB  
Article
A Smartphone-Based Cursor Position System in Cross-Device Interaction Using Machine Learning Techniques
by Juechen Yang, Jun Kong and Chunying Zhao
Sensors 2021, 21(5), 1665; https://doi.org/10.3390/s21051665 - 28 Feb 2021
Cited by 2 | Viewed by 1841
Abstract
The use of mobile devices, especially smartphones, has become popular in recent years. There is an increasing need for cross-device interaction techniques that seamlessly integrate mobile devices and large display devices. This paper develops a novel cross-device cursor position system that maps a mobile device's movement on a flat surface to a cursor's movement on a large display. The system allows a user to directly manipulate objects on a large display device through a mobile device and supports seamless cross-device data sharing without physical distance restrictions. To achieve this, we use sound localization to initialize the mobile device's position as the starting location of a cursor on the large screen. The mobile device's movement is then detected through an accelerometer and translated to the cursor's movement on the large display using machine learning models. In total, 63 features and 10 classifiers were employed to construct the machine learning models for movement detection. The evaluation results demonstrate that three classifiers in particular, gradient boosting, linear discriminant analysis (LDA), and naïve Bayes, are suitable for detecting the movement of a mobile device.
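
As an illustration of the classification step described above, the following sketch trains the three classifier families the paper found suitable on features extracted from accelerometer windows. The feature set (per-axis means and standard deviations plus magnitude statistics) and the synthetic data are stand-ins for the paper's 63 features and real recordings.

```python
# Illustrative sketch (not the authors' code): movement-direction
# classification from accelerometer windows with the three classifiers
# reported as suitable: gradient boosting, LDA, and naive Bayes.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

def window_features(acc):  # acc: (n_samples, 3) x/y/z accelerometer window
    mag = np.linalg.norm(acc, axis=1)
    return np.concatenate([acc.mean(0), acc.std(0), [mag.mean(), mag.std()]])

rng = np.random.default_rng(1)
# Toy dataset: windows labelled with a movement class (e.g. left/right/up/down).
X = np.array([window_features(rng.normal(c, 1.0, size=(50, 3)))
              for c in range(4) for _ in range(100)])
y = np.repeat(np.arange(4), 100)

for clf in (GradientBoostingClassifier(), LinearDiscriminantAnalysis(), GaussianNB()):
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{type(clf).__name__}: {score:.2f}")
```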

34 pages, 17178 KiB  
Article
Analysing Touchscreen Gestures: A Study Based on Individuals with Down Syndrome Centred on Design for All
by Jorge Martin-Gutierrez and Marta Sylvia Del Rio Guerra
Sensors 2021, 21(4), 1328; https://doi.org/10.3390/s21041328 - 13 Feb 2021
Cited by 3 | Viewed by 3179
Abstract
There has been a conscious shift towards developing increasingly inclusive applications. Despite this, most research has focused on supporting those with visual or hearing impairments, and less attention has been paid to cognitive impairments. The purpose of this study is to analyse touch gestures used on touchscreens and identify which gestures are suitable for individuals living with Down syndrome (DS) or other forms of physical or cognitive impairment. With this information, app developers can satisfy Design for All (DfA) requirements by selecting adequate gestures from existing gesture sets. Twenty touch gestures were defined for this study, and a sample group of eighteen individuals with Down syndrome was used. A tool was developed to measure the performance of touch gestures, and participants were asked to perform simple tasks that involved the repeated use of these twenty gestures. Three variables were analysed to establish whether they influence the success rates or completion times of gestures, as they could have a collateral effect on the skill with which gestures are performed: Gender, Type of Down syndrome, and Socioeconomic Status. The analysis reveals significant differences in pairwise comparisons, meaning that individuals with DS cannot perform all gestures with the same ease. Gender and Socioeconomic Status do not influence success rates or completion times, but Type of DS does.
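
The abstract does not name the statistical tests used, but group comparisons of this kind are commonly run non-parametrically. The sketch below shows one plausible form in Python with SciPy: a Kruskal-Wallis test across Type-of-DS groups followed by pairwise Mann-Whitney comparisons, using invented completion-time data rather than the study's dataset.

```python
# Hedged illustration: comparing gesture completion times across groups
# (here, type of Down syndrome) with Kruskal-Wallis and pairwise
# Mann-Whitney tests. Group names are the standard DS types; data invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
groups = {
    "trisomy 21":    rng.lognormal(0.9, 0.3, 10),  # completion times (s)
    "mosaicism":     rng.lognormal(0.7, 0.3, 5),
    "translocation": rng.lognormal(1.1, 0.3, 3),
}

h, p = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis: H={h:.2f}, p={p:.3f}")

names = list(groups)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        u, pv = stats.mannwhitneyu(groups[names[i]], groups[names[j]])
        print(f"{names[i]} vs {names[j]}: U={u:.1f}, p={pv:.3f}")
```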
