Special Issue "Human Computer Interaction and Its Future"

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Systems & Control Engineering".

Deadline for manuscript submissions: 30 September 2020.

Special Issue Editor

Dr. Michael Wehner
Guest Editor
Electrical and Computer Engineering, University of California, Santa Cruz, USA
Interests: robotics, human-machine interaction, soft systems, wearables, rehabilitation

Special Issue Information

Dear Colleagues,

Humans and computers have had a complex relationship throughout history. From can-openers to online shopping, electromechanical devices assist us in countless areas of everyday life. However, our interaction with these systems has often been fraught with difficulty. From repetitive stress injuries to the 2008 mortgage meltdown, our relationship with computers has proven complex and multifaceted. As computers and mechatronic systems become more sophisticated, our interactions, the problems they create, and the solutions to those problems grow in complexity.

This myriad of technical, personal, and societal difficulties has given rise to an equally broad range of solutions and areas of research, which in turn often produce entirely novel approaches and even new fields of study.

The main aim of this Special Issue is to gather high-quality submissions that highlight emerging methods for identifying the nature of human–machine interaction, for evaluating its risks and merits both quantitatively and qualitatively, and for studying possible solutions.

Topics of interest include, but are not limited to, the following:

- Direct human–machine interface: ergonomics, safety, and emerging solutions

- Assistive technologies

- Augmentative technologies

- Robotics: companion robots, workplace robots, healthcare robots, and soft robots

- Wearables: active and passive orthotics, prosthetics, exoskeletons, and wearable sensors

- Gesture recognition and virtual reality

- Social issues in human–computer interactions

Dr. Michael Wehner
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (2 papers)


Research

Open Access Feature Paper Article
Hand Movement Activity-Based Character Input System on a Virtual Keyboard
Electronics 2020, 9(5), 774; https://doi.org/10.3390/electronics9050774 - 08 May 2020
Abstract
Gesture-based technology is transforming the way people live and interact, offering users more comfortable ways to meet everyday needs in communication, information security, day-to-day operations, and more. Hand movement information, in particular, offers an alternative way for users to interact with people, machines, or robots. This paper therefore presents a character input system using a virtual keyboard based on the analysis of hand movements. We analyzed accelerometer, gyroscope, and electromyography (EMG) signals of movement activity, removing noise from the input signals with a wavelet denoising technique. The envelope spectrum is used for analysis of the accelerometer and gyroscope signals, and the cepstrum for the EMG signal. A support vector machine (SVM) is then trained to detect each signal and perform the character input. To validate the proposed model, signal information was obtained from the predefined gestures "double-tap", "hold-fist", "wave-left", "wave-right", and "spread-finger", performed by different respondents for input actions such as "input a character", "change character", "delete a character", "line break", and "space character". The experimental results show superior hand gesture recognition and character input accuracy compared with state-of-the-art systems.
(This article belongs to the Special Issue Human Computer Interaction and Its Future)
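
As a rough illustration of the pipeline described in the abstract, the sketch below strings together wavelet denoising, an envelope spectrum for the accelerometer and gyroscope channels, a cepstrum for the EMG channel, and an SVM over the concatenated coefficients. The function names, the "db4" wavelet, the universal-threshold rule, and the 32-coefficient feature cut are illustrative assumptions, not the authors' exact settings.

```python
# Minimal sketch of the gesture pipeline from the abstract. All parameter
# choices (wavelet, threshold rule, feature length) are assumptions.
import numpy as np
import pywt                              # PyWavelets
from scipy.signal import hilbert
from sklearn.svm import SVC

def wavelet_denoise(x, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients and reconstruct the signal."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # MAD noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(x)))     # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def envelope_spectrum(x):
    """FFT magnitude of the Hilbert envelope (accelerometer/gyroscope)."""
    envelope = np.abs(hilbert(x))
    return np.abs(np.fft.rfft(envelope - envelope.mean()))

def real_cepstrum(x):
    """Inverse FFT of the log magnitude spectrum (EMG channel)."""
    spectrum = np.abs(np.fft.fft(x)) + 1e-12
    return np.real(np.fft.ifft(np.log(spectrum)))

def features(acc, gyro, emg, n=32):
    """Concatenate the first n coefficients of each representation."""
    return np.concatenate([
        envelope_spectrum(wavelet_denoise(acc))[:n],
        envelope_spectrum(wavelet_denoise(gyro))[:n],
        real_cepstrum(wavelet_denoise(emg))[:n],
    ])

def train_classifier(X, y):
    """Fit an RBF-kernel SVM on labeled gesture feature vectors, where y
    holds labels such as "double-tap", "hold-fist", or "wave-left"."""
    return SVC(kernel="rbf").fit(X, y)
```

Mapping each predicted gesture to a keyboard action (input a character, delete a character, line break, and so on) would complete the loop the paper describes.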

Open Access Article
A CNN Based Automated Activity and Food Recognition Using Wearable Sensor for Preventive Healthcare
Electronics 2019, 8(12), 1425; https://doi.org/10.3390/electronics8121425 - 29 Nov 2019
Abstract
Recent developments in preventive healthcare have received considerable attention because of their role in managing chronic diseases such as diabetes, stroke, obesity, and cancer. Various automated systems are used for activity and food recognition in preventive healthcare, but they lack sophisticated segmentation techniques and rely on multiple sensors that are inconvenient to wear in real-life settings. To monitor activity and food together, our work presents a novel wearable system that combines the motion sensors in a smartwatch with a piezoelectric sensor embedded in a necklace. The motion sensor generates distinct patterns for eight physical activities, including eating. The piezoelectric sensor generates different signal patterns for six food types, as ingestion differs from food to food in hardness, crunchiness, and tackiness. For effective representation of the signal patterns of the activities and foods, we employ dynamic segmentation: a novel algorithm called event similarity search (ESS) chooses a segment of dynamic length that represents signal patterns of differing complexity equally well. Amplitude-based features and spectrogram-generated images from the activity and food segments are fed to convolutional neural network (CNN)-based activity and food recognition networks, respectively. Extensive experimentation showed that the proposed system outperforms state-of-the-art methods, recognizing eight activity types and six food categories with accuracies of 94.3% and 91.9% using a support vector machine (SVM) and a CNN, respectively.
(This article belongs to the Special Issue Human Computer Interaction and Its Future)
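
The spectrogram-image route described in this abstract can be sketched in a few lines: convert a sensor segment into a log-power spectrogram image and pass it to a small CNN classifier. The segment handling, sampling rate, and network layout below are illustrative assumptions; the paper's event similarity search (ESS) dynamic segmentation and exact architecture are not reproduced here.

```python
# Minimal sketch: spectrogram image of a sensor segment fed to a small CNN.
# Sampling rate, window size, and layer sizes are assumptions.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram

def segment_to_image(x, fs=100, nperseg=64):
    """Log-power spectrogram of one segment, shaped (1, freq, time)."""
    _, _, sxx = spectrogram(x, fs=fs, nperseg=nperseg)
    img = np.log(sxx + 1e-12).astype(np.float32)
    return torch.from_numpy(img).unsqueeze(0)       # add channel dimension

class FoodActivityCNN(nn.Module):
    """Two convolutional blocks followed by a linear classifier head."""
    def __init__(self, n_classes=6):                # e.g., six food types
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),           # fixed-size output
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Usage: batch = segment_to_image(signal).unsqueeze(0)   # (1, 1, F, T)
# logits = FoodActivityCNN()(batch)
```

In the paper's setup, one such network would score the six food types from the piezoelectric necklace signal and another the eight activities from the smartwatch motion signal.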