Human-Robot Systems: Modeling, Control and Prediction

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (31 August 2020) | Viewed by 6577

Special Issue Editor


Prof. Dr. Rainer Palm
Guest Editor
AASS MRO Lab, School of Science and Technology, Örebro University, Örebro, Sweden
Interests: automatic control; fuzzy control; robotics; mobile robots

Special Issue Information

Dear Colleagues,

This Special Issue covers intelligent methods for designing interaction with bidirectional communication, based on collaboration and symbiosis between human and robot. One aspect is the recognition of human intentions by the robot. Another is the modeling of the human–robot system in a way that reflects the nonlinear nature of the system to be controlled. A further topic is the control problem in human–robot interaction, the intention to compete or cooperate in shared workspaces, and the corresponding flow of information. Intelligent approaches such as fuzzy methods, neural networks, machine learning, and deep learning, combined with classical approaches, help in all of these areas, including collision avoidance and cooperation between humans and robots. Prediction and learning of human–robot behavior based on intelligent methods enable efficient task planning and execution. I invite you to submit your research on these topics in the form of original research articles.

Prof. Dr. Rainer Palm
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Human–robot interaction
  • Intention recognition
  • Prediction of human behavior
  • Fuzzy modeling and control
  • Multiple human–robot systems
  • Cooperative intelligence
  • Human-like learning
  • Motion planning
  • Navigation
  • Mapping
  • Localization

Published Papers (1 paper)


Research

18 pages, 9499 KiB  
Article
GazeGuide: An Eye-Gaze-Guided Active Immersive UAV Camera
by Pavan Kumar B. N., Adithya Balasubramanyam, Ashok Kumar Patil, Chethana B. and Young Ho Chai
Appl. Sci. 2020, 10(5), 1668; https://doi.org/10.3390/app10051668 - 1 Mar 2020
Cited by 19 | Viewed by 6235
Abstract
Over the years, gaze input modality has been an easy and demanding human–computer interaction (HCI) method for various applications. The research of gaze-based interactive applications has advanced considerably, as HCIs are no longer constrained to traditional input devices. In this paper, we propose a novel immersive eye-gaze-guided camera (called GazeGuide) that can seamlessly control the movements of a camera mounted on an unmanned aerial vehicle (UAV) from the eye-gaze of a remote user. The video stream captured by the camera is fed into a head-mounted display (HMD) with a binocular eye tracker. The user’s eye-gaze is the sole input modality to maneuver the camera. A user study was conducted considering the static and moving targets of interest in a three-dimensional (3D) space to evaluate the proposed framework. GazeGuide was compared with a state-of-the-art input modality remote controller. The qualitative and quantitative results showed that the proposed GazeGuide performed significantly better than the remote controller.
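
For readers unfamiliar with gaze-driven camera control, the short Python sketch below illustrates one common way such a pipeline can be wired: normalized gaze coordinates from an eye tracker are mapped to pan/tilt rate commands for a UAV-mounted camera, with a dead zone around the screen center so that fixating near the target holds the view steady. This is a minimal illustration under assumed interfaces — the tracker.gaze_point() and gimbal.set_rates() calls are hypothetical stand-ins, not the GazeGuide authors' implementation or any specific vendor API.

    # Illustrative sketch only: maps normalized gaze coordinates to gimbal
    # pan/tilt rate commands. The tracker and gimbal objects are hypothetical
    # stand-ins, not the GazeGuide authors' API.
    import time

    DEAD_ZONE = 0.1   # ignore gaze within 10% of screen center (fixation hold)
    MAX_RATE = 30.0   # maximum gimbal rate in degrees per second (assumed)

    def gaze_to_rate(coord: float) -> float:
        """Map a gaze coordinate in [-1, 1] to a rate command in deg/s."""
        if abs(coord) < DEAD_ZONE:
            return 0.0                   # gaze near the center: hold the camera
        # Rescale the range outside the dead zone to [0, 1], then apply the limit.
        sign = 1.0 if coord > 0 else -1.0
        magnitude = (abs(coord) - DEAD_ZONE) / (1.0 - DEAD_ZONE)
        return sign * magnitude * MAX_RATE

    def control_loop(tracker, gimbal, hz: float = 30.0) -> None:
        """Continuously steer the camera toward the user's gaze point."""
        period = 1.0 / hz
        while True:
            x, y = tracker.gaze_point()  # normalized gaze, each in [-1, 1]
            # Negate y because screen coordinates typically grow downward.
            gimbal.set_rates(pan=gaze_to_rate(x), tilt=gaze_to_rate(-y))
            time.sleep(period)

A rate-based mapping with a dead zone is a common design choice for this kind of interface, since it keeps the camera still during fixation and avoids jitter from small involuntary eye movements; the paper itself should be consulted for the authors' actual control scheme.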
(This article belongs to the Special Issue Human-Robot Systems: Modeling, Control and Prediction)
