Article

Immersive Virtual Reality Experience of Historical Events Using Haptics and Locomotion Simulation

by Agapi Chrysanthakopoulou *, Konstantinos Kalatzis and Konstantinos Moustakas
Electrical and Computer Engineering Department, University of Patras, 26504 Patras, Greece
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(24), 11613; https://doi.org/10.3390/app112411613
Submission received: 1 November 2021 / Revised: 25 November 2021 / Accepted: 3 December 2021 / Published: 7 December 2021
(This article belongs to the Special Issue Virtual Reality and Its Application in Cultural Heritage II)

Abstract

Virtual reality (VR) and 3D modeling technologies have become increasingly powerful tools in fields such as education, architecture, and cultural heritage. Museums are no longer places that merely house and exhibit collections and artworks; they use such technologies to offer new ways of communicating art and history to their visitors. In this paper, we present the initial results of a proposed workflow for highlighting and interpreting a historic event through an immersive, interactive VR experience that engages multiple senses of the user. Using a treadmill for navigation and haptic gloves for interacting with the environment, combined with detailed 3D models, deepens the sense of immersion. The results of our study show that engaging multiple senses and visual manipulation in an immersive 3D environment can effectively enhance the perception of visual realism and evoke a stronger sense of presence, amplifying the educational and informative experience in a museum.

1. Introduction

The museum as an institution is a relatively recent concept: museums began as exhibitions of private collections, and only in recent years has their role begun to shift toward education. Nowadays, museums need to overcome two main issues, authenticity and new museology, meaning that they need to provide an authentic and enhanced experience. These challenges require museums to use a variety of means to transfer knowledge beyond the simple display of objects. Virtual reality meets this need, enabling the user to acquire knowledge interactively in fully or partially immersive environments. Many museums, such as the Tate Modern, the Museo del Prado, and the Paris Museum, have already launched VR applications [1].
A virtual learning environment (VLE) in educational technology is a platform containing the digital aspects of courses of study, usually within educational institutions [2,3]. Such an environment includes the resources, activities, and interactions of an educational subject and presents the different stages of evaluation. It can also be integrated with other institutional systems. Virtual learning is the process of learning through computers, tablets, other technological systems, and/or the internet, and can be either asynchronous or synchronous. The educational activities can take place online, with teachers and learners in different places, or physically in the educational institution. Virtual learning has some very important characteristics and benefits: there is unlimited remote access to countless resources worldwide; every user can have a personalized learning experience based on their needs, level of competence, and preferred learning process; and it is not bound by time, place, or cost.

1.1. Related Work

Virtual reality for cultural heritage offers a possible solution for visiting inaccessible sites. There are works regarding such immersive environments in which users can navigate the virtual environment, interact with 3D objects, and be informed using gestural interaction [4]. Multimodal interfaces are essential for creating the necessary sense of presence that these applications require [5]. Many workflows use a Head-Mounted Display (HMD) device to visualize the acquired data, including the necessary programming for navigation and interactions [6,7,8]. Communicating knowledge about cultural heritage with the help of a serious game is another well-established concept, and examples can be found in [9,10,11]. Another important challenge is accessing the underwater cultural wealth. Recent research has focused on the use of Augmented Reality (AR) by divers using cellphones and hybrid markers [12], as well as Virtual Reality (VR) applications for users to explore underwater monuments [13].
Locomotion in VR has been a considerable research challenge in recent years. Studies show that the most natural experience occurs when users are able to physically walk in the real world [14]. Locomotion techniques are mainly split into two categories. The first includes techniques that allow the user to move in a way that is not feasible in the real world, using hand-based manipulation either with controllers [15] or without [16,17]. The second category covers techniques that enable the physical movement of the user, such as walking and running, and can be vehicle- or body-centric [18]. Body-centric approaches rely on repositioning through a device that constrains the user to a fixed position, on proxy gestures, or on redirected walking [19]. Repositioning can be achieved with active components, like motorized platforms and floor tiles, or with passive components, like low-friction omnidirectional treadmills or slippery-shoe interfaces that decrease the friction produced by the user's steps. These passive systems offer a simpler and less expensive solution and are often used for rehabilitation purposes [20,21].
Today, VR technologies are used in a wide range of applications, simulating the real world dynamically and responding in real-time to the users. Integration of haptic feedback in VR is widely accepted as a means to augment the VR experience [22]. By adding force/tactile sensation, the users are expected to be able to perform more complex tasks while being physically aware of the objects they are manipulating [23]. Tactile feedback devices provide input to the user’s skin. They try to recreate the feeling of a shape, a texture, sometimes even the thermal properties of a virtual object. Kinesthetic feedback devices apply forces on the user’s skeleton to create the impression of movement and/or resistance [24]. In real life, when touching an object a person gets both these types of feedback. The goal for haptic devices is to allow the user to feel virtual object features such as folds, textures, edges, temperature, and slipperiness. Together with sight and hearing, utilizing the sense of touch enriches the immersion of the user. A haptic device can also enhance a learning experience not only on an elementary [25] but also on a professional level [26].

1.2. Design and Contribution

Therefore, in this paper, we present the initial results of a proposed workflow for highlighting and interpreting a historic event through an immersive and interactive VR experience that engages the user's multiple senses, developed in the framework of the project Personalised interaction with Culture Realities Virtually Enhanced (CuRVE) [27]. More specifically, we present results from tests performed for the reconstructed Antikythera Mechanism, where the user navigates through the shipwreck, visits the ship before it sank, and interacts with the reconstructed Antikythera Mechanism.
Furthermore, we set up an exploratory study with 20 volunteers, to examine the sense of presence of our prototype and the educational value of our application. The results of the user evaluation will be used as a guideline for our future implementations. The current work explores the potential of a fully immersive VR experience as an educational tool and contributes to the integration of new technologies in classical studies. We aim to develop a game-like environment where the users can learn about a historical event and collect their feedback on what would make such an experience more immersive and educational.
The innovation of our workflow mainly lies in the integration of the equipment to offer a high-level immersion along with the detailed virtual environment and the auditory aspects that offer the user a more realistic experience. Our system can be applied to various historical circumstances from a battle in America to the crowning of a king to the sinking of the Titanic. With the integration of additional elements, such as animated characters, weather and lighting conditions, and even force feedback to the user, the experience would be highly immersive and interactive. Starting from this prototype, and guided by user evaluations every step of the way, a visit to the museum will never be the same.

1.3. The Antikythera Mechanism

The Antikythera Mechanism is an ancient Greek hand-powered orrery, considered to be the first analog computer [28]. It was used to predict the astronomical positions and eclipses of various celestial objects decades in advance, and it could also track the four-year cycle of the ancient Olympic games. This artifact was among the wreckage retrieved from a shipwreck off the coast of the Greek island of Antikythera in 1901 [29]. The wreck also contained numerous statues, coins, and other artifacts dating back to the fourth century BC. The mechanism's components were found merged into one lump, and inspection and conservation efforts revealed that the artifact consisted of various gears. The main fragment and other findings of the Antikythera Mechanism are on display at the National Archaeological Museum in Athens.
We chose the specific event because it is a historical acquisition of international interest, and it also provides the opportunity to visit an underwater site of cultural interest. The integration of multisensory interaction and gamification elements enhances the immersion and educational experience.

2. Materials and Methods

2.1. Design of the Virtual Environment

For the development of our system, we used Unreal Engine 4 (UE4) [30]. It includes an editor for creating and managing scenes (levels), materials, characters, animations, and so forth. The engine can be extended with plugins written in C++, so a developer can add functionality or modify the graphical user interface if needed. The virtual reality system we utilized was the HTC Vive. The Vive components deployed in our application were a Head-Mounted Display (HMD) for the visualization of our scene, the Vive base stations for room-scale tracking, and the Vive Trackers for tracking wrist position and rotation.
The virtual environment consists of two parts. The first is a visualization of the underwater site where the mechanism was found, and the second is set on the ship carrying it. Both scenes are represented in a descriptive way and are not historically accurate, because the goal was to present the general idea of the potential of such a system.
The underwater virtual scene is populated with 3D models representing the various artifacts that divers recovered from the wreck over the years, such as bronze and marble statues, ceramic jars, bronze and silver coins, and the mechanism itself [31]. It also contains flora and fauna typical of a sea bottom. For an enhanced underwater effect, we used the UE4 Volumetric Fog, which computes density and lighting at every point in the camera frustum, for any media included and taking into account any number of lights affecting the fog. This scene was designed in a more game-like manner, allowing us to examine whether it would help the users be more engaged in the procedure. The goal is to find a certain object in order to proceed to the next scene. While navigating the scene, the user encounters various findings and can read information about them, as shown in Figure 1.
The scene on the ship was designed on the basis that the user will be able to navigate freely on the vessel and interact with certain elements. All individual elements follow the laws of physics. The vessel itself has a buoyancy component and moves according to the waves. Some of the elements the user can interact with are:
  • Steer the wheel: The user can grab the wheel with both hands and navigate the ship (Figure 2). The ship contains two thrusters, left and right, which apply the necessary turning force depending on the wheel's rotation (Figure 3a);
  • Look through the binoculars: When the user approaches the binoculars, they can look through them. A render texture is added to the viewport, taken from a camera with a higher depth ratio so that everything appears closer;
  • Use the Antikythera Mechanism: The user can grab the reconstructed mechanism and rotate it to see the positions of the various stars and planets. Additional information is given to the user to explain how the mechanism works and what each gear represents (Figure 3b).
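The binocular zoom effect can be approximated by the ratio of the two cameras' half-angle tangents: narrowing the zoom camera's field of view makes objects span more of the rendered image. Since the zoom camera is described above only as having a "higher depth ratio", modeling it as a narrower field of view, with illustrative FOV values, is an assumption of this sketch.

```cpp
#include <cmath>

// Apparent magnification of a zoom camera relative to the main view,
// given as the ratio of the half-angle tangents of the two fields of
// view. The FOV values used in tests are illustrative (UE4's default
// horizontal FOV is 90 degrees), not taken from the application.
double magnification(double normalFovDeg, double zoomFovDeg) {
    const double d2r = std::acos(-1.0) / 180.0; // degrees -> radians
    return std::tan(normalFovDeg * 0.5 * d2r)
         / std::tan(zoomFovDeg * 0.5 * d2r);
}
```

For example, halving the tangent of the half angle (90 degrees down to roughly 53.13 degrees) doubles the apparent size of distant objects in the render texture.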
Both scenes have realistic lighting, and various weather conditions, such as rain, snow, and wind, can be applied to them. In the underwater scene, large-scale ray-traced water caustics will be implemented in real-time using cascaded caustic maps, a hybrid method combining ray tracing and rasterization that allows water-caustic coverage over a wide view distance in real-time [32]. Special effects like smoke, fire, and underwater toroidal bubbles are implemented via the UE4 Cascade Particle System, a fully integrated and modular particle effects editor for real-time editing of various complex effects.

2.2. Locomotion Simulation

From early on [33], it was established that sense of presence is enhanced by using human body movements for locomotion in VR applications. As a result, we wanted to study a body-centric device, which strengthens immersion as it eliminates the real-world limitations and enables the user to wander through unlimited virtual spaces. However, our application already introduced the user to a significant number of devices to utilize, so we wanted a locomotion appliance that would seem the most natural. Furthermore, we wanted the user to be able to move freely while performing other tasks, like interacting with objects or gazing in different directions.
The solution that required the lowest mental effort and provided a satisfying sense of presence was the VR treadmill as stated in [34]. We utilized a commercial solution [35], which uses a flat, low-friction walking surface and a rotating containment ring, which prevents the user from displacing in the physical space. The orientation of the ring defines the walking direction which can be anywhere between 0 and 360 degrees. The device’s ring contains an adjustable belt system so that people with different body types can fit in. Its vertical movement is also flexible and it can adapt to people with different heights.
However, if the VR treadmill is used for a long period of time, it adds certain fatigue to the user. Therefore, we designed our application in a way that allows the users to rest during their experience, while they still remain engaged and immersed. Specifically, in the underwater scene, the user encounters points of interest on their way, where they can stop and read educative information as shown in Figure 1. Likewise, in the second part of the application, the user, while moving freely on the ship, stops to interact with the different elements provided (the steering wheel, the binoculars, and the Antikythera Mechanism). Consequently, we avoid making the users uncomfortable while moving, thus losing their sense of presence.
The integration of the treadmill was made feasible with the device’s Unreal Software Development Kit (SDK). The user’s movement and orientation are detected by the treadmill’s sensors while the user’s location is identified by the HMD sensors. The sensor’s collected data define the following variables:
  • MovementVector: a 3-dimensional vector describing the movement speed value in the three different axes. The MovementVector is also multiplied with a user-defined variable called speedMultiplier so that the movement speed can be adjusted to each individual walking style;
  • OrientationVector: a 4-dimensional vector describing the user’s orientation as a quaternion. The OrientationVector is transferred to the player’s local coordinate system by multiplying it with the corresponding transformation matrix or by using an appropriate function provided by the Unreal library;
  • UserLocation: a 3-dimensional vector defining the user’s position in the three different axes.
The user’s final location is updated in each frame and calculated as:
UserLocation = UserLocation + (MovementVector × OrientationVector × UnrealConstant × DeltaSeconds),
where UnrealConstant is a constant variable used for correct unit conversion from the treadmill’s input value to Unreal’s final output. DeltaSeconds, commonly referred to as Delta Time, describes the elapsed time since the previous frame. This is used for a smooth and steady walking animation, regardless of the system’s hardware or the scene’s complexity.
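A minimal standalone sketch of this per-frame update, using plain C++ containers rather than UE4's FVector/FQuat types: the quaternion rotation stands in for the multiplication by OrientationVector, and the speedMultiplier and unrealConstant defaults are illustrative values, not the ones used in the application.

```cpp
#include <array>

using Vec3 = std::array<double, 3>;
using Quat = std::array<double, 4>; // (w, x, y, z)

// Rotate a vector by a unit quaternion using the standard expansion
// v' = v + 2w(u x v) + 2(u x (u x v)), where u is the vector part.
Vec3 rotate(const Quat& q, const Vec3& v) {
    const double w = q[0];
    const Vec3 u = {q[1], q[2], q[3]};
    auto cross = [](const Vec3& a, const Vec3& b) {
        return Vec3{a[1]*b[2] - a[2]*b[1],
                    a[2]*b[0] - a[0]*b[2],
                    a[0]*b[1] - a[1]*b[0]};
    };
    const Vec3 uv  = cross(u, v);
    const Vec3 uuv = cross(u, uv);
    return {v[0] + 2.0*(w*uv[0] + uuv[0]),
            v[1] + 2.0*(w*uv[1] + uuv[1]),
            v[2] + 2.0*(w*uv[2] + uuv[2])};
}

// Per-frame location update mirroring the equation above. The
// movement vector from the treadmill, expressed in the ring's frame,
// is rotated into world space, scaled, and integrated over the frame.
Vec3 updateUserLocation(Vec3 userLocation,
                        const Vec3& movementVector,  // treadmill speed per axis
                        const Quat& orientation,     // ring orientation
                        double deltaSeconds,
                        double speedMultiplier = 1.0,   // per-user tuning
                        double unrealConstant = 100.0) { // e.g. m -> UE4 cm
    const Vec3 worldMove = rotate(orientation, movementVector);
    for (int i = 0; i < 3; ++i)
        userLocation[i] += worldMove[i] * speedMultiplier
                         * unrealConstant * deltaSeconds;
    return userLocation;
}
```

Because DeltaSeconds scales the step, the walking speed stays consistent whether the engine renders at 30 or 90 frames per second.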

2.3. Hand Tracking and Haptic Feedback

For our application, we utilized the Vive Trackers to support hand tracking. Vive Trackers are enabled with SteamVR and can be programmed through UE4 to update the position and rotation of our virtual hands so they correspond to the user’s hand movements. Specifically, we used them to track the position and rotation of the wrist.
For the haptic feedback, we used a commercially available pair of haptic gloves. These gloves act as wireless controllers for Extended Reality (XR), providing precise hand and finger tracking along with a haptic feedback effect for each finger. They allow interaction with five fingers, with one vibration motor under the last phalanx of each finger. The measurements of finger and hand movements are provided by inertial sensors. These features make for an inexpensive device that is easy to calibrate and guarantees a high refresh rate, at the cost of precision.
The integration of these gloves was done with the UE4 plugin provided by the manufacturer. The position of the fingers is automatically provided by the sensors on the gloves, but for the haptic feedback, data must be sent dynamically through the User Datagram Protocol (UDP). After a connection is established, the proper JSON string is sent, containing information about the desired vibrations. The string includes arguments for the left or right hand, the finger index, the vibration type, and the vibration duration. The whole process is accomplished with two plugins for socket communication. For the visualization and animation of the hand, the default UE4 hand mesh is used; in future implementations, a more realistic hand and/or body mesh with elements of the proper historical period could be used.
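The exact schema the gloves' plugin expects is not specified here, so the field names below ("hand", "finger", "type", "duration_ms") are hypothetical; this sketch only shows the shape of assembling such a command string with the four arguments listed above, before it would be handed to the UDP socket plugin.

```cpp
#include <sstream>
#include <string>

// Build a haptic command as a JSON string. All field names are
// hypothetical -- the real glove SDK defines its own schema. The
// resulting string would then be sent through the UDP connection,
// once per desired vibration.
std::string makeHapticCommand(const std::string& hand, // "left"/"right"
                              int fingerIndex,         // 0 = thumb .. 4 = pinky
                              int vibrationType,       // device-specific effect id
                              int durationMs) {
    std::ostringstream json;
    json << "{\"hand\":\"" << hand << "\","
         << "\"finger\":" << fingerIndex << ","
         << "\"type\":" << vibrationType << ","
         << "\"duration_ms\":" << durationMs << "}";
    return json.str();
}
```

A pulsating wheel effect, for instance, could be produced by re-sending such a command at a fixed interval for as long as the grab is held.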
The gloves support several vibration types, which can be sent as information in the JSON string with the proper id. Consequently, we altered the vibration type to accompany each interaction with the appropriate tactile feedback. Specifically, in the case of the steering wheel (Section 2.1), we used a pulsating vibration on both the fingers and the wrist of the glove. This recurrent pulse resembles the buoyancy of the ship on the water, transferred to the user’s hand through the wheel. The feedback persists as long as the user holds the wheel. In the case of the Antikythera Mechanism, on the other hand, we used a continuous vibration on the fingers of the glove only. This feedback is sent only while the user holds and rotates the mechanism’s handle. The motion for reaching, holding, and turning the wheel or the handle is performed by the user with their natural hand movement. This is achieved with the combined integration of the Vive Trackers for tracking the wrist and the gloves’ detailed tracking of the fingers’ joints and bones. Our whole system can be seen in Figure 4.

2.4. Educational Content

Even though our proposed system is a proof of concept, the facts about the Antikythera Mechanism and its operation are based on research and are accurate. The information about the real shipwreck is given to the user during their underwater exploration, as they come across certain items that provide information in text and image format. Additionally, the part of the mechanism found in the shipwreck is represented by a 3D model derived from the highest-resolution CT scan of the main fragment, incorporating the improvements discussed in [36]. Because the Antikythera Mechanism was made of bronze and was recovered from the sea 2000 years after the sinking, the metal had corroded into a shapeless lump. For this reason, it is difficult to extract all the details with a CT scan, and some parts may seem ‘missing’.
The front face of the mechanism has the twelve zodiacal signs distributed in equal 30-degree sectors, the months and days of the Sothic Egyptian calendar, and the pointers indicating the positions of the celestial objects (Figure 5a). On the back panel, there are five dials: two large displays, the Metonic and the Saros, and three smaller indicators, the Games Dial, the Callippic, and the Exeligmos (Figure 5b). The user can operate the mechanism by turning a small hand crank linked via a crown gear to the largest gear, named b1 (Figure 6). This moves the date pointer on the front dial about 78 days per full rotation. Every gear inside the mechanism is interlocked, so rotating the hand crank turns them all, simultaneously calculating the positions of the Sun and Moon, the moon phase, eclipse and calendar cycles, and the locations of the planets. The movement of each gear is calculated from the formulas shown in Table 1.
All the variables shown in Table 1 correspond to the gears named in Figure 6. These are the gears inside the mechanism, including those of the Freeth and Jones reconstruction [37]. Alongside each gear’s name, Figure 6 gives its number of cogs. Every gear is also represented in the interior of the 3D model of the mechanism, as shown in Figure 7.
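Since Table 1 and the actual tooth counts are not reproduced here, the following is only a generic illustration of how such gear formulas arise: each pair of externally meshing gears scales angular speed by the tooth ratio of driver to driven gear and reverses the turning direction. The tooth counts in the test values are invented, not the mechanism's real ones.

```cpp
#include <utility>
#include <vector>

// Rotation transmitted through a train of externally meshing gears.
// Each meshed pair (driverTeeth, drivenTeeth) scales the number of
// turns by driver/driven and flips the direction of rotation, so the
// sign of the result encodes the final turning direction.
double outputTurns(double inputTurns,
                   const std::vector<std::pair<int, int>>& meshes) {
    double turns = inputTurns;
    for (const auto& [driver, driven] : meshes) {
        turns *= -static_cast<double>(driver) / driven; // minus = reversal
    }
    return turns;
}
```

Chaining such ratios is how a single crank rotation can drive the Metonic, Saros, and planetary pointers at their different rates simultaneously.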

2.5. User Evaluation

When the system reached a fully functional prototype state, it was evaluated by 20 users who voluntarily agreed to participate in evaluation tests. Upon completion of the VR experience, the users filled in a questionnaire of 19 questions based on the Questionnaire of Presence (QoP) [38], the most widely used presence questionnaire for virtual environments. The included questions relate to the four main categories of factors contributing to the sense of presence: Control Factors, Sensory Factors, Distraction Factors, and Realism Factors. Control Factors capture the degree to which the user has control over the simulated environment, which affects immersion. Realism Factors assess the realism of the environment and the consistency of information, and affect the user’s engagement, since users pay more attention to the virtual environment’s stimuli. Sensory and Distraction Factors measure the isolation of the user, the senses involved, and the consistency of the information the user receives.
The questionnaire also includes three questions for evaluating the importance of haptic feedback, locomotion through the treadmill, and interacting with the elements of the scene for a more immersive experience. The last four questions aim to assess what the users think about the importance of VR experiences in general, in other fields, and if the specific experience intrigued them into exploring more about cultural heritage.
At first, the users were given a verbal explanation of the navigation and interaction controls. In the first part of the application, the underwater scene, the users could freely explore the seabed but had to find a specific item, the main fragment of the Antikythera Mechanism, to move to the next scene, on the ship. During the underwater exploration, the users came across various findings that gave more information about the Antikythera shipwreck and its discovery. After they reached the main fragment, they were relocated to the ship carrying the Antikythera Mechanism back in the first century BC. There they could move freely around and interact with various objects, as described in Section 2.1. There was no time limit, and they could terminate the experiment at any point. The questions asked were designed to rate their overall experience as well as more specific areas such as navigation, interactions, viewing capabilities, learnability, and memorability.

3. Results and Discussion

The users’ feedback was very satisfactory. Our sample ranged from users totally inexperienced with VR to very experienced ones (Mean: 3.00, S.D.: 1.45), with a moderate interest in cultural heritage (Mean: 3.70, S.D.: 1.13). The mean value and the standard deviation for all 19 questions are listed in Table 2, Table 3 and Table 4.
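The per-question values in these tables are the usual mean and standard deviation over the 20 responses; whether the sample (n−1) or population estimator was used is not stated, so the sample estimator below is an assumption, and the example scores are invented for illustration.

```cpp
#include <cmath>
#include <vector>

struct Stats { double mean; double sd; };

// Mean and sample standard deviation of Likert-scale responses, the
// form in which each question is summarized in Tables 2-4. Uses the
// n-1 (sample) estimator; the paper does not state which was used.
Stats likertStats(const std::vector<int>& scores) {
    double sum = 0.0;
    for (int s : scores) sum += s;
    const double mean = sum / scores.size();
    double sq = 0.0;
    for (int s : scores) sq += (s - mean) * (s - mean);
    const double sd = std::sqrt(sq / (scores.size() - 1));
    return {mean, sd};
}
```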

3.1. Level of Immersion

The data collected from the users’ answers indicated that the level of immersion was high enough to make them feel quite involved in the virtual environment. We found no correlation between our users’ VR experience and any of the questions listed. As illustrated in Table 2, even though the interactions and movement did not seem entirely natural (Mean: 3.75, 3.65; S.D.: 0.64, 0.81), most users reported a satisfyingly immersive experience (Mean: 4.38, S.D.: 0.66). The visual and auditory aspects played a major role in their sense of presence, especially when users interacted with objects in the scene, where the responsive environment provided feedback to their actions.

3.2. Aspects of Immersion

Regarding the three different aspects of immersion examined in this paper, object interaction appeared to be the users’ first choice. These interactions provided visual feedback, so considering that the most important immersion factor for the users was the visual aspect (Mean: 4.65, S.D.: 0.49; see Table 2), this result seems consistent. Consequently, these numbers indicate the focus of our further development. Making interactions more natural and closer to their real-world counterparts should be a primary objective as well, judging by the users’ responses. As for the locomotion simulation, the users were more neutral and did not consider it the most important aspect of immersion, but they still stated that it has a positive impact on the level of immersion.

3.3. VR for Cultural Heritage

Concerning the educational advantages of our VR application, there was broad agreement that such systems will greatly benefit the learning experience of cultural heritage material. Furthermore, users shared a common confidence that these applications can stimulate interest not only in cultural heritage but in scientific or artistic fields too. Museums should consider such systems an essential aid in demonstrating their exhibits. The visualizations and interactive content these applications provide make historic facts easier to understand and remember.

3.4. Limitations

However, the data analysis also revealed some limitations of our system. Locomotion through the treadmill was reported by many as unnatural, yet compelling (Mean: 3.65, 3.80; S.D.: 0.81, 0.77; see Table 2). Even though the treadmill is designed to resemble the natural walking motion, after a certain amount of usage time, it becomes uncomfortable. This result agrees with previous research [34,39], which states that VR treadmills create a significant sense of presence but have a high fatigue effect. Difficulty in walking around the virtual environment weakens the sense of presence and creates discomfort in the user experience. This may explain why locomotion, as an important aspect of immersion, received the lowest score (Table 3).
In our opinion, all of the aforementioned details could be summarized as a design limitation. As mentioned in Section 2.2, we had taken the possible fatigue effect into consideration during the implementation of our application. Nonetheless, the study’s results show that there is still room for improvement. An interesting way to address this would be to utilize the haptic feedback of the treadmill. Small vibrations on the users’ feet could indicate that the user has reached the boundaries of a predefined path [40], so that they do not wander around in vain. The intensity and duration of the vibrations should be carefully designed so that no further fatigue is added. Overall, this could create a more interactive use of the equipment and assist the overall sense of presence. A further study can evaluate this approach.
Another limitation of our paper concerns the conducted survey. While it provided us with thoughtful results and ideas, it involved a limited number of users. Furthermore, the sense of presence was measured only by subjective self-reported measures, using a limited number of questions from our questionnaire. Additional studies could focus on physiological and psychological measurements for a more comprehensive conclusion. Similar to [41], these measurements can be combined with an evaluation of users’ motion sickness.

4. Conclusions

Our system utilized equipment for three different aspects of immersion: the HTC Vive HMD for visual and auditory representation, the Vive Trackers along with the haptic gloves for hand tracking and haptic feedback, and finally the omnidirectional treadmill for locomotion. The application we designed was a virtual experience for discovering the historical artifact of the Antikythera Mechanism. It consisted of an underwater exploration of the shipwreck where the mechanism was found, and a visit to the ship in the first century BC, when the mechanism was functional. The application was tested by 20 users, who then answered a questionnaire. The main purpose of the survey was to study the level of immersion they experienced and to evaluate the importance of the different aspects of immersion provided. Finally, we examined the users’ perspective on VR in cultural heritage.
This study not only provided insightful ideas about VR user experiences but also indicated the focus of further research. While it gave us a fair impression of the important aspects of immersion, a comparative study testing the impact of our system would be of great importance. In Section 3.4, we already discussed how to better utilize the treadmill in our future implementations. Furthermore, greater use of the Vive Trackers could also improve the desired feeling of immersion. Similar to [42], a visual representation of underwater drag force perception could enhance the sense of presence in our shipwreck exploration scene. Finally, users indicated object interactions as the primary feature of involvement in a virtual environment, so additional development should also focus on them. Overall, our system proved very useful in raising interest in learning about cultural heritage, and such applications could definitely benefit cultural institutions.

Author Contributions

Conceptualization, A.C., K.K.; Methodology, A.C., K.K.; Software, A.C., K.K.; Validation, A.C., K.K.; Formal Analysis, A.C., K.K.; Investigation, A.C., K.K.; Resources, A.C., K.K.; Data Curation, A.C., K.K.; Writing—Original draft preparation, A.C., K.K.; Writing—review and editing, A.C., K.K., K.M.; Visualization, A.C., K.K.; Supervision, K.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research has been co-financed by the European Regional Development Fund of the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call Special Actions in Aquatic Farming, Industrial Materials and Open Innovation in Culture (project code: Τ6ΥΒΠ-00120-CURVE).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lee, H.; Jung, T.H.; tom Dieck, M.C.; Chung, N. Experiencing immersive virtual reality in museums. Inf. Manag. 2020, 57, 103229. [Google Scholar] [CrossRef]
  2. Britain, S.; Liber, O. A Framework for Pedagogical Evaluation of Virtual Learning Environments. In JISC Technology Applications Programme. Report 41; 1999. Available online: https://files.eric.ed.gov/fulltext/ED443394.pdf (accessed on 26 October 2021).
  3. Weller, M. Virtual Learning Environments: Using, Choosing and Developing Your VLE; Routledge: Abingdon, UK, 2007. [Google Scholar]
  4. Drossis, G.; Birliraki, C.; Stephanidis, C. Interaction with immersive cultural heritage environments using virtual reality technologies. In Proceedings of the International Conference on Human-Computer Interaction, Las Vegas, NV, USA, 15–20 July 2018; pp. 177–183. [Google Scholar]
  5. Moustakas, K.; Strintzis, M.G.; Tzovaras, D.; Carbini, S.; Bernier, O.; Viallet, J.E.; Raidt, S.; Mancas, M.; Dimiccoli, M.; Yagci, E.; et al. Masterpiece: Physical interaction and 3D content-based search in VR applications. IEEE MultiMedia 2006, 13, 92–100. [Google Scholar] [CrossRef]
  6. Kersten, T.P.; Tschirschwitz, F.; Deggim, S.; Lindstaedt, M. Virtual reality for cultural heritage monuments—From 3d data recording to immersive visualisation. In Proceedings of the Euro-Mediterranean Conference, Nicosia, Cyprus, 29 October–3 November 2018; pp. 74–83. [Google Scholar]
  7. Paladini, A.; Dhanda, A.; Reina Ortiz, M.; Weigert, A.; Nofal, E.; Min, A.; Gyi, M.; Su, S.; Van Balen, K.; Santana Quintero, M. Impact of virtual reality experience on accessibility of cultural heritage. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 929–936. [Google Scholar] [CrossRef] [Green Version]
  8. Cao, D.; Li, G.; Zhu, W.; Liu, Q.; Bai, S.; Li, X. Virtual reality technology applied in digitalization of cultural heritage. Clust. Comput. 2019, 22, 10063–10074. [Google Scholar]
  9. Tsita, C.; Satratzemi, M. A serious game design and evaluation approach to enhance cultural heritage understanding. In Proceedings of the International Conference on Games and Learning Alliance, Athens, Greece, 27–29 November 2019; pp. 438–446. [Google Scholar]
  10. Volkmar, G.; Wenig, N.; Malaka, R. Memorial Quest-A Location-based Serious Game for Cultural Heritage Preservation. In Proceedings of the 2018 Annual Symposium on Computer-Human Interaction in Play Companion Extended Abstracts, Melbourne, Australia, 28–31 October 2018; pp. 661–668. [Google Scholar]
  11. Luigini, A.; Basso, A. Heritage education for primary age through an immersive serious game. In From Building Information Modelling to Mixed Reality; Springer: Cham, Switzerland, 2021; pp. 157–174. [Google Scholar]
  12. Čejka, J.; Zsíros, A.; Liarokapis, F. A hybrid augmented reality guide for underwater cultural heritage sites. Pers. Ubiquitous Comput. 2020, 24, 1–14. [Google Scholar] [CrossRef]
  13. Bruno, F.; Lagudi, A.; Barbieri, L.; Muzzupappa, M.; Mangeruga, M.; Cozza, M.; Cozza, A.; Ritacco, G.; Peluso, R. Virtual reality technologies for the exploitation of underwater cultural heritage. In Latest Developments in Reality-Based 3D Surveying and Modelling; Remondino, F., Georgopoulos, A., González-Aguilera, D., Agrafiotis, P., Eds.; MDPI: Basel, Switzerland, 2018; pp. 220–236. [Google Scholar]
  14. Ruddle, R.A.; Lessels, S. For efficient navigational search, humans require full physical movement, but not a rich visual scene. Psychol. Sci. 2006, 17, 460–465. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Frommel, J.; Sonntag, S.; Weber, M. Effects of controller-based locomotion on player experience in a virtual reality exploration game. In Proceedings of the 12th International Conference on the Foundations of Digital Games, Hyannis, MA, USA, 14–17 August 2017; pp. 1–6. [Google Scholar]
  16. Schäfer, A.; Reis, G.; Stricker, D. Controlling Teleportation-Based Locomotion in Virtual Reality with Hand Gestures: A Comparative Evaluation of Two-Handed and One-Handed Techniques. Electronics 2021, 10, 715. [Google Scholar] [CrossRef]
  17. Caggianese, G.; Capece, N.; Erra, U.; Gallo, L.; Rinaldi, M. Freehand-steering locomotion techniques for immersive virtual environments: A comparative evaluation. Int. J. Hum.-Comput. Interact. 2020, 36, 1734–1755. [Google Scholar] [CrossRef]
  18. Nilsson, N.C.; Serafin, S.; Steinicke, F.; Nordahl, R. Natural walking in virtual reality: A review. Comput. Entertain. (CIE) 2018, 16, 1–22. [Google Scholar] [CrossRef]
  19. Cherni, H.; Nicolas, S.; Métayer, N. Using virtual reality treadmill as a locomotion technique in a navigation task: Impact on user experience–case of the KatWalk. Int. J. Virtual Real. 2021, 21, 1–14. [Google Scholar] [CrossRef]
  20. Oh, K.; Stanley, C.J.; Damiano, D.L.; Kim, J.; Yoon, J.; Park, H.S. Biomechanical evaluation of virtual reality-based turning on a self-paced linear treadmill. Gait Posture 2018, 65, 157–162. [Google Scholar] [CrossRef] [PubMed]
  21. Peruzzi, A.; Zarbo, I.R.; Cereatti, A.; Della Croce, U.; Mirelman, A. An innovative training program based on virtual reality and treadmill: Effects on gait of persons with multiple sclerosis. Disabil. Rehabil. 2017, 39, 1557–1563. [Google Scholar] [CrossRef]
  22. Rose, T.; Nam, C.S.; Chen, K.B. Immersion of virtual reality for rehabilitation—Review. Appl. Ergon. 2018, 69, 153–161. [Google Scholar] [CrossRef]
  23. Kreimeier, J.; Hammer, S.; Friedmann, D.; Karg, P.; Bühner, C.; Bankel, L.; Götzelmann, T. Evaluation of different types of haptic feedback influencing the task-based presence and performance in virtual reality. In Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes, Greece, 5–7 June 2019; pp. 289–298. [Google Scholar]
  24. Hosseini, M.; Sengül, A.; Pane, Y.; De Schutter, J.; Bruyninckx, H. Exoten-glove: A force-feedback haptic glove based on twisted string actuation system. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 320–327. [Google Scholar]
  25. Moustakas, K.; Nikolakis, G.; Tzovaras, D.; Strintzis, M.G. A geometry education haptic VR application based on a new virtual hand representation. In Proceedings of the IEEE Virtual Reality 2005 (VR 2005), Bonn, Germany, 12–16 March 2005; pp. 249–252. [Google Scholar]
  26. Butt, A.L.; Kardong-Edgren, S.; Ellertson, A. Using game-based virtual reality with haptics for skill acquisition. Clin. Simul. Nurs. 2018, 16, 25–32. [Google Scholar] [CrossRef] [Green Version]
  27. CURVE. Available online: https://curve.gr/en/ (accessed on 26 October 2021).
  28. Efstathiou, K.; Efstathiou, M. Celestial Gearbox. Mech. Eng. 2018, 140, 31–35. [Google Scholar] [CrossRef] [Green Version]
  29. Kaltsas, N.E.; Vlachogianni, E.; Bouyia, P. The Antikythera Shipwreck: The Ship, the Treasures, the Mechanism; National Archaeological Museum, April 2012–April 2013; Kapon Editions: Athens, Greece, 2013. [Google Scholar]
  30. Epic Games. Unreal Engine. 2019. Available online: https://www.unrealengine.com (accessed on 26 October 2021).
  31. Mastrocinque, A. The Antikythera shipwreck and Sinope’s culture during the Mithridatic wars. In Mithridates VI and the Pontic Kingdom; Højte, J.M., Ed.; Aarhus University Press: Aarhus, Denmark, 2009; pp. 313–319. [Google Scholar]
  32. Kougianos, G.; Moustakas, K. Large-scale ray traced water caustics in real-time using cascaded caustic maps. Comput. Graph. 2021, 98, 255–267. [Google Scholar] [CrossRef]
  33. Slater, M.; Usoh, M.; Steed, A. Taking steps: The influence of a walking technique on presence in virtual reality. ACM Trans. Comput.-Hum. Interact. (TOCHI) 1995, 2, 201–219. [Google Scholar] [CrossRef]
  34. Cannavò, A.; Calandra, D.; Pratticò, F.G.; Gatteschi, V.; Lamberti, F. An evaluation testbed for locomotion in virtual reality. IEEE Trans. Vis. Comput. Graph. 2020, 27, 1871–1889. [Google Scholar] [CrossRef] [PubMed]
  35. Cakmak, T.; Hager, H. Cyberith Virtualizer: A Locomotion Device for Virtual Reality. In Proceedings of the ACM SIGGRAPH 2014 Emerging Technologies, Vancouver, BC, Canada, 10–14 August 2014; p. 1. [Google Scholar]
  36. Pakzad, A.; Iacoviello, F.; Ramsey, A.; Speller, R.; Griffiths, J.; Freeth, T.; Gibson, A. Improved X-ray computed tomography reconstruction of the largest fragment of the Antikythera Mechanism, an ancient Greek astronomical calculator. PLoS ONE 2018, 13, e0207430. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  37. Freeth, T.; Jones, A. The Cosmos in the Antikythera Mechanism; ISAW Papers; ISAW: New York, NY, USA, 2012. [Google Scholar]
  38. Witmer, B.G.; Singer, M.J. Measuring presence in virtual environments: A presence questionnaire. Presence 1998, 7, 225–240. [Google Scholar] [CrossRef]
  39. Albert, J.; Sung, K. User-centric classification of virtual reality locomotion. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology, Tokyo, Japan, 28 November–1 December 2018; pp. 1–2. [Google Scholar]
  40. Kreimeier, J.; Götzelmann, T. First steps towards walk-in-place locomotion and haptic feedback in virtual reality for visually impaired. In Proceedings of the Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–6. [Google Scholar]
  41. Han, S.; Kim, J. A study on immersion of hand interaction for mobile platform virtual reality contents. Symmetry 2017, 9, 22. [Google Scholar] [CrossRef] [Green Version]
  42. Kang, H.; Lee, G.; Han, J. Visual manipulation for underwater drag force perception in immersive virtual environments. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 38–46. [Google Scholar]
Figure 1. The underwater scene, populated with seabed elements and objects that represent the findings at the site of the wreck: (a) The user must reach the wreck to find the fragment and go back in time. (b) When the user approaches certain objects, information about the wreck is displayed.
Figure 2. Turning the wheel with the haptic glove. When the user closes their hand around the handle, vibrations signal the grasp, allowing the user to turn the wheel with a natural movement.
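The grab interaction described in Figure 2 can be sketched as a small state machine: a vibration pulse fires on the transition from an open to a closed hand near the handle. The sketch below is an illustrative assumption, not the authors’ implementation; the thresholds, the `GrabState` type, and the `vibrate` callback are all hypothetical names.

```python
from dataclasses import dataclass

@dataclass
class GrabState:
    grabbing: bool = False

# Illustrative thresholds (assumptions, not values from the paper).
FLEXION_THRESHOLD = 0.7   # normalized finger flexion at which a grab begins
GRAB_RADIUS = 0.12        # meters: hand must be this close to the handle

def update_grab(state, flexion, hand_to_handle_dist, vibrate):
    """Fire one haptic pulse on each grab/release transition."""
    near = hand_to_handle_dist < GRAB_RADIUS
    closed = flexion > FLEXION_THRESHOLD
    if closed and near and not state.grabbing:
        state.grabbing = True
        vibrate(0.8)          # strong pulse confirms the grasp
    elif state.grabbing and not closed:
        state.grabbing = False
        vibrate(0.2)          # weak pulse on release
    return state.grabbing
```

Called once per frame with the glove’s flexion reading, this yields exactly one pulse per grasp rather than a continuous buzz, which keeps the vibration cue tied to the grabbing movement itself.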
Figure 3. Some of the elements the user can interact with: (a) The steering wheel, which the user can grab to navigate the ship. (b) The reconstructed Antikythera Mechanism, whose handle the user can rotate to learn about its mechanics.
Figure 4. The complete system, utilizing the HTC Vive HMD, Cyberith’s Virtualizer treadmill, haptic gloves and Vive Trackers.
Figure 5. The two panels of the mechanism. (a) Front Panel. (b) Back Panel.
Figure 6. Reproduction of the gear train and displays of the Antikythera Mechanism.
Figure 7. The gears inside the mechanism.
Table 1. The Antikythera Mechanism: known gears and formulas (cw: clockwise, ccw: counterclockwise).

Gear Name | Function of the Gear/Pointer | Mechanism Formula | Gear Direction
x | Year gear | 1 (by definition) | cw
b | the Moon’s orbit | Time(b) = Time(x) × (c1/b2) × (d1/c2) × (e2/d2) × (k1/e5) × (e6/k2) × (b3/e1) | cw
r | lunar phase display | Time(r) = 1/((1/Time(b2 [mean sun] or sun3 [true sun])) − (1/Time(b))) |
n* | Metonic pointer | Time(n) = Time(x) × (l1/b2) × (m1/l2) × (n1/m2) | ccw
o* | Games dial pointer | Time(o) = Time(n) × (o1/n2) | cw
q* | Callippic pointer | Time(q) = Time(n) × (p1/n3) × (q1/p2) | ccw
e* | lunar orbit precession | Time(e) = Time(x) × (l1/b2) × (m1/l2) × (e3/m3) | ccw
g* | Saros cycle | Time(g) = Time(e) × (f1/e4) × (g1/f2) | ccw
i* | Exeligmos pointer | Time(i) = Time(g) × (h1/g2) × (i1/h2) | ccw

The following are proposed gearing from the 2012 Freeth and Jones reconstruction:

sun3* | true sun pointer | Time(sun3) = Time(x) × (sun3/sun1) × (sun2/sun3) | cw
mer2* | Mercury pointer | Time(mer2) = Time(x) × (mer2/mer1) | cw
ven2* | Venus pointer | Time(ven2) = Time(x) × (ven1/sun1) | cw
mars4* | Mars pointer | Time(mars4) = Time(x) × (mars2/mars1) × (mars4/mars3) | cw
jup4* | Jupiter pointer | Time(jup4) = Time(x) × (jup2/jup1) × (jup4/jup3) | cw
sat4* | Saturn pointer | Time(sat4) = Time(x) × (sat2/sat1) × (sat4/sat3) | cw
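The period formulas in Table 1 compose simply as products of tooth-count ratios. The sketch below is illustrative and not part of the authors’ application; it evaluates two of the formulas in Python, using the Metonic-train tooth counts reported in the reconstruction literature (b2 = 64, l1 = 38, l2 = 53, m1 = 96, m2 = 15, n1 = 53) as an assumed example.

```python
from fractions import Fraction

def pointer_period(base_period, meshes):
    """Period of a pointer driven from the year gear through a gear train.

    Each mesh is (driven_teeth, driving_teeth); as in Table 1, every mesh
    multiplies the period by driven/driving.
    """
    period = Fraction(base_period)
    for driven, driving in meshes:
        period *= Fraction(driven, driving)
    return period

def relative_period(t_a, t_b):
    """Relative period used for the lunar phase display (gear r):
    Time(r) = 1/((1/Time(mean sun)) - (1/Time(moon)))."""
    return 1.0 / (1.0 / t_a - 1.0 / t_b)

# Metonic pointer: Time(n) = Time(x) * (l1/b2) * (m1/l2) * (n1/m2).
metonic = pointer_period(1, [(38, 64), (96, 53), (53, 15)])
print(metonic)  # 19/5 years per turn: five pointer turns span the 19-year cycle

# Lunar phase: mean sun period 365.25 days, sidereal month 27.321 days.
# The sign encodes the relative rotation sense; the magnitude is the
# ~29.53-day synodic month.
print(round(abs(relative_period(365.25, 27.321)), 2))  # 29.53
```

Using exact `Fraction` arithmetic makes the cycle relations of the trains visible directly (e.g., 19/5 for the Metonic pointer), which floating point would obscure.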
Table 2. Level of Immersion.

Evaluate the level of control and immersion you perceived during your experience. | Mean Value | Standard Deviation
How aware were you of events occurring in the real world around you? | 2.70 | 1.08
How much were you able to control events in the virtual world? | 3.90 | 0.72
How responsive was the environment to actions that you initiated (or performed)? | 3.85 | 0.88
How natural did your interactions with the environment seem? | 3.75 | 0.64
How much did the visual aspects of the environment contribute to your overall immersion? | 4.65 | 0.49
How much did the auditory aspects of the environment contribute to your overall immersion? | 4.15 | 1.04
How natural was the mechanism which controlled movement through the environment? | 3.65 | 0.81
How compelling was your sense of moving around inside the virtual environment? | 3.80 | 0.77
How much did your experiences in the virtual environment seem consistent with your real-world experiences? | 3.45 | 1.10
How involved were you in the virtual environment experience? | 4.30 | 0.66
Table 3. Aspects for Immersion.

Evaluate the importance of the following for a more immersive VR experience. | Mean Value | Standard Deviation
Haptic Feedback | 3.90 | 0.85
Locomotion through the treadmill | 3.75 | 1.02
Interacting with objects in the scenes | 4.40 | 0.75
Table 4. Virtual Reality for Cultural Heritage.

Virtual Reality for Cultural Heritage | Mean Value | Standard Deviation
I think that my interest in courses and educational content would be higher if interactive content and VR systems were used. | 4.45 | 0.83
It is easier to remember a historic fact if it is visualized. | 4.60 | 0.75
I believe that VR systems could be utilized in other fields (science, art, etc.). | 4.70 | 0.66
After this experience I am more intrigued about learning about cultural heritage. | 4.00 | 0.86
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Chrysanthakopoulou, A.; Kalatzis, K.; Moustakas, K. Immersive Virtual Reality Experience of Historical Events Using Haptics and Locomotion Simulation. Appl. Sci. 2021, 11, 11613. https://doi.org/10.3390/app112411613
