Open Access Article

Beyond Reality—Extending a Presentation Trainer with an Immersive VR Module

1 DIPF Leibniz Institute for Research and Information in Education, Rostocker Straße 6, D-60323 Frankfurt am Main, Germany
2 Goethe University Frankfurt, Theodor-W.-Adorno-Platz, D-60323 Frankfurt am Main, Germany
* Author to whom correspondence should be addressed.
Sensors 2019, 19(16), 3457; https://doi.org/10.3390/s19163457
Received: 18 June 2019 / Revised: 2 August 2019 / Accepted: 4 August 2019 / Published: 7 August 2019
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
The development of multimodal sensor-based applications designed to support learners in improving their skills is expensive, since most of these applications are tailor-made and built from scratch. In this paper, we show how the Presentation Trainer (PT), a multimodal sensor-based application designed to support the development of public speaking skills, can be modularly extended with a Virtual Reality real-time feedback module (VR module), which makes using the PT more immersive and comprehensive. The described study consists of a formative evaluation with two main objectives. The first, technical objective concerns the feasibility of extending the PT with an immersive VR module. The second, user experience objective focuses on the level of satisfaction when interacting with the VR-extended PT. To study these objectives, we conducted user tests with 20 participants. The results show the feasibility of modularly extending existing multimodal sensor-based applications, and in terms of learning and user experience, they indicate a positive attitude of the participants towards using the application (PT + VR module).
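To illustrate the kind of modular extension the abstract describes, the sketch below shows one possible way a sensor-based trainer could broadcast real-time feedback to interchangeable output modules, so that a VR front end can be registered alongside an existing desktop view. This is a conceptual sketch only, not the authors' implementation; all class and method names (e.g. FeedbackModule, PresentationTrainerCore) are hypothetical.

```python
# Hypothetical sketch (not from the paper): a minimal plug-in interface that lets a
# sensor-based trainer send real-time feedback to interchangeable output modules,
# e.g. a desktop mirror view or an immersive VR view. All names are illustrative.
from dataclasses import dataclass
from typing import List, Protocol


@dataclass
class FeedbackEvent:
    """A single real-time feedback item derived from sensor data."""
    timestamp: float  # seconds since the start of the rehearsal
    rule: str         # e.g. "posture", "filler_words", "gesture_rate"
    message: str      # human-readable hint shown to the speaker


class FeedbackModule(Protocol):
    """Anything that can render feedback to the learner (desktop UI, VR headset, ...)."""
    def render(self, event: FeedbackEvent) -> None: ...


class DesktopMirrorModule:
    def render(self, event: FeedbackEvent) -> None:
        print(f"[desktop] {event.rule}: {event.message}")


class VRModule:
    def render(self, event: FeedbackEvent) -> None:
        # In a real system this would update objects in the VR scene,
        # e.g. a virtual audience or a floating hint panel.
        print(f"[vr] {event.rule}: {event.message}")


class PresentationTrainerCore:
    """Core analysis loop; output modules can be added without changing the core."""
    def __init__(self) -> None:
        self.modules: List[FeedbackModule] = []

    def add_module(self, module: FeedbackModule) -> None:
        self.modules.append(module)

    def emit(self, event: FeedbackEvent) -> None:
        for module in self.modules:
            module.render(event)


if __name__ == "__main__":
    trainer = PresentationTrainerCore()
    trainer.add_module(DesktopMirrorModule())
    trainer.add_module(VRModule())  # the "extension" is just another registered module
    trainer.emit(FeedbackEvent(timestamp=12.4, rule="posture", message="Stand up straight."))
```

Under this assumed design, the VR module can be added or removed without touching the core sensing and analysis pipeline, which is the property the feasibility objective above is concerned with.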
Keywords: sensor-based learning support; Multimodal Learning Analytics; public speaking
MDPI and ACS Style

Schneider, J.; Romano, G.; Drachsler, H. Beyond Reality—Extending a Presentation Trainer with an Immersive VR Module. Sensors 2019, 19, 3457.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
