Special Issue "3D Human–Computer Interaction"

A special issue of Multimodal Technologies and Interaction (ISSN 2414-4088).

Deadline for manuscript submissions: 30 December 2020.

Special Issue Editors

Dr. Christoph W. Borst
Guest Editor
University of Louisiana at Lafayette, Lafayette, United States
Interests: 3D interaction; Virtual Reality; graphics; haptics; scientific visualization
Dr. Arun K. Kulshreshth
Guest Editor
University of Louisiana at Lafayette, Lafayette, United States
Interests: Human–Computer Interaction (HCI); 3D User Interfaces; 3D interfaces for video games; Virtual Reality

Special Issue Information

Dear colleagues,

This Special Issue explores methods, technologies, and studies of 3D interaction in the broad area of Human–Computer Interaction (HCI). HCI studies the interfaces between people and computers. 3D user interfaces (3DUI) can involve input devices that track user movements in 3D, techniques for interacting with virtual or augmented reality, or other interfaces characterized by a 3D arrangement of inputs or environments. Like HCI, 3DUI research lies at the intersection of computer science, the behavioral sciences, design, media studies, and several other fields. This Special Issue invites contributions on the technological, creative, perceptual, cognitive, social, and health aspects of 3DUIs.

We encourage authors to submit original research articles, novel case studies, insightful reviews, theoretical and critical perspectives, and well-argued viewpoint articles on 3D Human–Computer Interaction, including but not limited to the following:

  • 3D interaction techniques and metaphors
  • 3D input and sensing technologies
  • 3D feedback for any of the senses (visual, auditory, haptic, olfactory, gustatory, vestibular)
  • Empirical studies of 3DUIs
  • Novel software architectures for 3DUI
  • Collaborative interfaces for VR, AR, or other 3D computer environments
  • Evaluation methods for 3DUIs
  • Human perception of 3D interaction
  • Novel applications of 3DUIs: games, entertainment, CAD, education, etc.
  • Mobile 3DUIs
  • Hybrid 3DUIs
  • Desktop 3DUIs

Of particular interest are articles that critically explore 3D Human–Computer Interaction methods in contexts such as virtual reality, augmented reality, and mobile/wearable devices.

Dr. Christoph W. Borst
Dr. Arun K. Kulshreshth
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Multimodal Technologies and Interaction is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • 3D User Interfaces (3DUI)
  • Spatial user interaction
  • Interaction
  • Interfaces
  • Human–Computer Interaction
  • HCI
  • CHI
  • Virtual Reality (VR)
  • Augmented Reality (AR)
  • Motion tracking
  • Motion controllers
  • Haptics
  • 3D Input
  • Collaborative VR
  • User studies

Published Papers (2 papers)

Research

Open Access Article
Spot-Presentation of Stereophonic Earcons to Assist Navigation for the Visually Impaired
Multimodal Technol. Interact. 2020, 4(3), 42; https://doi.org/10.3390/mti4030042 - 20 Jul 2020
Abstract
This study seeks to demonstrate that a navigation system using stereophonic sound technology is effective in supporting visually impaired people in public spaces. In the proposed method, stereophonic sound is produced by a pair of parametric speakers for a person who arrives at a specific position, detected by an RGB-D sensor. The sound is a stereophonic earcon representing the target facility, from which the recipient can intuitively understand the facility's direction. The sound is audible only to the person being supported and therefore does not add noise to the surroundings. The system was installed in a shopping mall, and an experiment was conducted in which both the proposed system and guidance by a tactile map were used to lead participants to a designated facility. The results confirm that the proposed method reduces the execution time and outperforms the tactile map approach in terms of the average time required to grasp the direction. In the actual environment where the system is intended to be used, the correct answer rate was over 80%. These results suggest that the proposed method can replace the conventional tactile map as a guidance system.
(This article belongs to the Special Issue 3D Human–Computer Interaction)
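
To make the interaction concrete, the following minimal Python sketch shows the kind of spot-trigger and direction-panning logic such a system could build on. It is not the authors' implementation: the positions, trigger radius, and function names are hypothetical, and ordinary constant-power stereo panning stands in for the parametric-speaker rendering.

    import math

    # Hypothetical layout (not from the paper): (x, z) metres in the
    # RGB-D sensor's ground plane.
    TRIGGER_POS = (0.0, 2.0)   # spot where the earcon is presented
    TRIGGER_RADIUS = 0.5       # person must stand within 0.5 m of the spot
    FACILITY_POS = (4.0, 6.0)  # target facility the earcon points toward

    def in_trigger_zone(person_xz):
        """True if the tracked person is standing on the presentation spot."""
        dx = person_xz[0] - TRIGGER_POS[0]
        dz = person_xz[1] - TRIGGER_POS[1]
        return math.hypot(dx, dz) <= TRIGGER_RADIUS

    def stereo_gains(person_xz):
        """Left/right channel gains that pan the earcon toward the facility.

        The bearing from the person to the facility (relative to straight
        ahead, the +z axis) is clamped to [-90, 90] degrees and mapped
        onto a constant-power pan.
        """
        bearing = math.atan2(FACILITY_POS[0] - person_xz[0],
                             FACILITY_POS[1] - person_xz[1])
        pan = max(-1.0, min(1.0, bearing / (math.pi / 2)))  # -1 left .. +1 right
        angle = (pan + 1.0) * math.pi / 4                   # 0 .. pi/2
        return math.cos(angle), math.sin(angle)             # (left, right)

    # Example: a person detected at (0.1, 2.2) by the depth sensor
    person = (0.1, 2.2)
    if in_trigger_zone(person):
        left, right = stereo_gains(person)
        print(f"play earcon with gains L={left:.2f}, R={right:.2f}")

A real deployment would replace the hard-coded person position with the RGB-D sensor's tracking output and drive the pair of parametric speakers rather than ordinary stereo channels.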

Open Access Article
Augmenting Printed School Atlases with Thematic 3D Maps
Multimodal Technol. Interact. 2020, 4(2), 23; https://doi.org/10.3390/mti4020023 - 27 May 2020
Abstract
Digitalization in schools requires a rethinking of teaching materials and methods in all subjects. This upheaval also concerns traditional print media, such as the school atlases used in geography classes. In this work, we examine the cartographic and technological feasibility of extending a printed school atlas with digital content through augmented reality (AR). While previous research focused mostly on topographic three-dimensional (3D) maps, our prototype application for Android tablets complements map sheets of the Swiss World Atlas with thematically related data. We follow a natural marker approach using the AR engine Vuforia and the game engine Unity. We compare two workflows for importing geo-data into the game engine such that they are correctly aligned with the map images. The imported data are then transformed into partly animated 3D visualizations, such as a dot distribution map, curved lines, pie chart billboards, stacked cuboids, extruded bars, and polygons. Additionally, we implemented legends, elements for temporal and thematic navigation, a screen capture function, and a touch-based feature query for the user interface. We evaluated our prototype in a usability experiment, which showed that secondary school students are as effective, interested, and sustainable with printed as with augmented maps when solving geographic tasks.
(This article belongs to the Special Issue 3D Human–Computer Interaction)
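
To illustrate the alignment step the abstract describes (placing imported geo-data correctly on the map image), here is a minimal Python sketch of one way a longitude/latitude pair could be mapped into an image target's local coordinate frame. The bounding-box values, target size, and function name are illustrative assumptions, not taken from the paper, and a simple rectangular map sheet is assumed.

    # Hypothetical georeference of one atlas page: the sheet is treated as a
    # rectangle whose edges have known longitudes/latitudes, so a point's
    # position on the page follows from linear interpolation.
    MAP_WEST, MAP_EAST = 5.8, 10.6      # longitude range of the map sheet
    MAP_SOUTH, MAP_NORTH = 45.7, 47.9   # latitude range of the map sheet
    TARGET_W, TARGET_H = 0.24, 0.16     # physical size of the image target (m)

    def lonlat_to_target(lon, lat):
        """Map lon/lat to (x, y) metres in the image target's local frame,
        with the origin at the target's centre."""
        u = (lon - MAP_WEST) / (MAP_EAST - MAP_WEST)     # 0..1 left to right
        v = (lat - MAP_SOUTH) / (MAP_NORTH - MAP_SOUTH)  # 0..1 bottom to top
        return (u - 0.5) * TARGET_W, (v - 0.5) * TARGET_H

    # Example: anchor an extruded bar roughly over Bern (7.45 E, 46.95 N)
    x, y = lonlat_to_target(7.45, 46.95)
    print(f"spawn 3D symbol at local offset ({x:.4f}, {y:.4f}) m")

A map sheet in a non-rectangular projection would need a proper projection transform instead of this linear interpolation; the principle of expressing data positions in the marker's local coordinate frame stays the same.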