Applied Sciences
  • Article
  • Open Access

28 April 2022

Interactive Exploration of Virtual Heritage by Means of Natural Gestures

Department of Mathematics and Computer Science, Ovidius University of Constanta, 900527 Constanta, Romania
Department of Design Engineering and Robotics, Technical University of Cluj-Napoca, 400020 Cluj-Napoca, Romania
Authors to whom correspondence should be addressed.
This article belongs to the Special Issue Virtual Reality and Its Application in Cultural Heritage II

Abstract

This paper is situated at the intersection of using Virtual Reality as a tool for cultural heritage preservation and using gesture-based interaction technology to achieve touchless, distant interaction of users with reconstructed artifacts. Various studies emphasize the positive effect that the use of Virtual Reality in a museum context has on the cultural experience. We build our approach on this idea by modeling and reconstructing museum exhibits, both small artifacts and large architectural edifices. We propose and design navigation and interaction scenarios, while taking into account present-day limitations regarding social interaction imposed during the COVID-19 pandemic. By placing the user at the center of the experience and enabling him/her to adjust the perspective on the visualized artifacts and to interact with them freely through natural hand gestures, with no touch, we allow the user to become immersed in the virtual environment. Finally, we assess the usability and utility of the Virtual Reality system in a questionnaire-based study with 137 participants over a period of 6 months, whose results we discuss in the paper.

1. Introduction

Honoring the legacy of our ancestors, inherited in the form of physical artifacts, is a duty both to their memory and to our successors. Cultural heritage encompasses this legacy in terms of artifacts, such as sculptures, jewelry, tools, mosaics, paintings, and manuscripts, as well as museums, monumental buildings, and archaeological remains. While visiting traditional museums has been a favorite cultural activity for ages, we have been witnessing a shift towards incorporating technological advancements in museum settings []. The use of digital technologies in cultural heritage-related matters has significantly enhanced the overall experience []. Digitization in the field of cultural heritage works towards ensuring the longevity of cultural monuments and collections, which are so vulnerable in times of war or natural disasters [].
The COVID-19 pandemic brought important changes to all aspects of life. People faced situations that were previously inconceivable. Special measures, such as social distancing, made us rethink and redesign processes that previously involved close encounters between persons. The cultural life of most people had to take a step back, and many cultural events resorted to being fully online [,]. Social interactions were avoided at all costs for extended periods of time, and people are only slowly becoming accustomed to being around others again. In this context, we tackle the issue of digitally enhanced museums, endowed with Virtual Reality (VR) applications and various media that allow a rich interaction of users with the exhibited artifacts.
Virtual Reality is a mature field of research, with many applications in education and in cultural heritage []. As a technology, it has recently been in the worldwide spotlight, with Mark Zuckerberg’s launch of the Meta platform (Mark Zuckerberg, “Facebook’s VR future: New sensors on Quest Pro, fitness and a metaverse for work”, https://www.cnet.com/tech/gaming/features/mark-zuckerberg-on-facebook-vr-future-new-sensors-on-quest-pro-fitness-and-a-metaverse-for-work/ (accessed on 24 February 2022)). Special hardware, such as the Oculus Quest 2 VR headset (“Zuckerberg wants Facebook to become online ‘metaverse’”, https://www.bbc.com/news/technology-57942909 (accessed on 24 February 2022)), has been designed to help users achieve full immersion in the virtual environment. A major disadvantage of such devices becomes apparent when considering their use in a public setting, such as a museum: given the worries surrounding the pandemic, the use of special Head-Mounted Displays (HMD) would require special disinfection policies, and users may be reluctant to use them altogether, despite the care taken to ensure proper disinfection []. In this context, our approach focuses on touchless interaction with the virtual scene, by means of the Leap Motion device (Leap Motion Controller, https://www.ultraleap.com/product/leap-motion-controller/ (accessed on 22 March 2022)) and common hand movements.
Museum visitors’ interaction with modern digital technologies, such as gesture-based Virtual or Augmented Reality (AR), is often regarded with skepticism [,,,,,]. Such experiences are still unfamiliar to most regular users. When taking into account the various age groups and technological backgrounds of museum visitors, the willingness of users to adopt such technological solutions, and thus their acceptance, is questionable.
Our work lies in the field of enhancing cultural heritage-related activities in museums by means of VR software solutions, while adjusting the experience to current limitations regarding social interactions. We wish to offer users the possibility to explore and experience the history contained in the artifacts displayed in the museum, beyond their present physical appearance, offering an unforgettable experience mediated by modern interactive technologies. We aim to deliver a sense of touch for artifacts that are usually encased in glass boxes or that may be too fragile or altogether too big to be exposed in a museum setting. Our solution makes use of natural hand gesture-based interaction, in order to reduce physical contact between persons to a minimum, thus following pandemic recommendations of social distancing and limited contact.
In this study, we present our experience with developing a VR system that enables natural interaction with museum artifacts through hand gestures. We explore several research questions (RQ) in order to assess the degree of usability and usefulness of the presented approach:
  • Research Question 1 (RQ1): How is natural interaction by means of hand gestures perceived by the users?
  • Research Question 2 (RQ2): Do visitors consider that the gesture-based VR system contributes to the attractiveness of the museum?
  • Research Question 3 (RQ3): Are visitors satisfied with using the VR system implemented in the museum?
  • Research Question 4 (RQ4): Does the interaction with the system spark user interest towards VR/AR applications?
To investigate the opinions of users with respect to the research questions, we conducted a user study with 137 participants among the museum visitors, during a period of 6 months.
The paper is organized as follows. First, we provide an overview of some recent works in the field of VR applications and cultural heritage. The next section is devoted to a detailed description of the design and functionality of the implemented VR systems, with special emphasis on the navigation metaphor. We then assess the usability and usefulness of the developed systems through a study involving a quantitative research methodology. We evaluate the perceived ease of use and usefulness of the developed VR systems, and debunk the skepticism that surrounds the acceptance of new technologies among certain age groups.

3. Materials and Methods

Our study delves into the details of the design, implementation, and actual setup of a VR system that uses gesture-based natural interaction (NI) with virtual artifacts in a museum setting. In the following, we document each phase of our work. The users of the system may interact with virtual replicas of museum artifacts, as well as navigate through 3D reconstructions of historical buildings and their surroundings. The solution is deployed in the Callatis Museum in Mangalia, Romania, and was evaluated in a user-based study over a period of 6 months, involving 137 participants from among the roughly 250 visitors guided by a total of 5 staff members working in shifts.

3.1. The Development Insights

The steps undertaken during the development of the VR gesture-based system are outlined in Figure 1. The system drew its roots from the public expectations for such a system, collected in fruitful discussions with museum staff (Figure 1a) and regular museum visitors (Figure 1b). Following the requirement analysis for the system, we started to prepare the virtual content. During this phase, we scanned the movable cultural assets, refined and textured the resulting 3D models, and obtained 3D replicas of architectural artifacts of the ancient temple (Figure 1c).
Figure 1. From domain metadata (a) and public expectations (b), through virtual replicas of cultural heritage (c), to the interactive VR application (d).
The next development phase was concerned with the design and development of the VR application. We followed a software engineering approach, first designing the application, then implementing the visualization module, the animation module, and the interaction module (Figure 1d). The modules were integrated, and the system went through validation tests.
In the final phase of our project, the system was deployed in situ at the Callatis Museum of History and Archeology in Mangalia, Romania, where we first installed the hardware and then made the system operational, calibrating and testing it in real museum conditions. The museum staff were trained to operate it properly, so that they could further explain the functionality of the system to museum visitors. For the testing phase, we wrote user manuals documenting the proper usage of the system and provided videos depicting several scenarios for the interaction of users with the 3D replicas by means of hand gestures, as well as for the navigation inside the reconstructed temple. The user manuals are readily available to both museum staff and visitors, and the videos run in a loop on displays on the walls of the exhibition.

3.1.1. Virtual Content Preparation

The 3D digitization process made use of a handheld structured-light scanner (Creaform Go!Scan 50) (Creaform, https://www.creaform3d.com/ (accessed on 28 March 2022)) to acquire both the geometry and the texture of the artifacts and architectural elements from the Callatis Museum. The 3D scanning was performed directly at the location where the cultural heritage assets are displayed within the museum exhibition, which is one of the advantages of a handheld structured-light scanner. Figure 2 presents the 3D scanning of both artifacts and architectural elements. In total, 22 artifacts and 6 architectural elements were digitized. For the large architectural elements positioned with their back to the wall, only the front side was 3D scanned, and the back was filled in order to obtain closed bodies as 3D virtual models.
Figure 2. Snapshots captured during the 3D scanning process using structured light in (a) indoor and (b) outdoor environments.
The constructive elements of the temple were all modelled using 3ds Max 2019 (Autodesk 3DS Max, https://www.autodesk.com/campaigns/3ds-max (accessed on 22 March 2022)), on the basis of the archaeological data provided by our partner, the Callatis Museum (Table 1). The scanned replicas were also processed in 3ds Max, in order to be converted to the FBX 2019 binary format (FBX format, https://www.autodesk.com/products/fbx/overview (accessed on 22 March 2022)), including media information (textures, shadows, etc.). In the exhibition, we put together a hypothetical setup validated by the museum staff.
Table 1. Excerpt from virtual replicas of cultural heritage artifacts list.

3.1.2. Interactive VR-Based Visualization Solution

All the artifact 3D replicas (except for some of the basic architectural elements, such as the architrave, the Doric frieze, and a fragment of the ceiling geison) were further augmented with semantic/historical content before being used in the interactive visualization solution. To this end, the existing supplementary information, focused on the place of discovery, the description, and the dimensions, was stored in JSON format (ECMA-404—the JSON data interchange, https://www.json.org/json-en.html (accessed on 22 March 2022)), so that it can easily be used by other applications (an illustrative excerpt is given below).
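In the published article, this excerpt is rendered as an image. The record below is an illustrative sketch of such a JSON entry; the field names and values are hypothetical and do not reproduce the actual schema used by the ISVS applications:

```json
{
  "id": "artifact-07",
  "name": "Funerary stele",
  "placeOfDiscovery": "Callatis necropolis, Mangalia",
  "description": "Marble funerary stele bearing an inscribed epitaph.",
  "dimensions": { "heightCm": 92, "widthCm": 41, "depthCm": 12 },
  "model": "artifact-07.fbx"
}
```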
The software application itself was implemented using the Godot game engine version 3.4.2 (Godot Engine, https://godotengine.org/ (accessed on 22 March 2022)) and the Godot Leap Motion driver 1.1 from the Godot Asset Library (Godot Leapmotion Asset, https://godotengine.org/asset-library/asset/215 (accessed on 31 March 2022)), together with the Leap Motion driver version 4.0.0+52173 (Leap Motion Controller, https://www.ultraleap.com/product/leap-motion-controller/ (accessed on 22 March 2022)), on Windows 10 (Windows 10, https://www.microsoft.com/ro-ro/software-download/windows10 (accessed on 22 March 2022)). In order to optimize its deployment in the partner museum, we chose to pack the 3D models separately (one pack for each artifact, and one pack for the temple) and to keep the textual descriptions of all the virtual artifacts open to further updates, outside of their corresponding packs (as text files, easy to read and edit by the museum personnel).
The VR software system is referred to in the following as the In Situ Visualization System (ISVS). It is composed of two components, ISVS-A and ISVS-B, which are similar in implementation but differ in their purpose within the interactive exhibition. A detailed presentation of both components is given in Section 3.2.

3.1.3. Application Deployment in Real Environment

In order to install the systems and put them into operation in situ, at the museum, the team had to adapt them from laboratory conditions to the real working conditions. Thus, it was necessary to analyze the following aspects:
  • Effective arrangement of equipment in the exhibition, with an impact on the technical elements of connection: network, audio, video signals;
  • Design and location of LeapMotion clamping and support systems;
  • Configuring the computer systems and adapting their layout so that the communication signals between the Leap Motion sensor and the application are not disturbed;
  • Designing the necessary elements for the interaction of the visitor with the In Situ Visualization Systems ISVS-A and ISVS-B within the exhibition.
After establishing all of the above, the VR system implementation in the exhibition was carried out in stages. A pilot version of the ISVS-A equipment was installed first, with all the elements in their complete configuration. For this, we also designed and built a support for the Leap Motion device, adapted to the showcases of the exhibition, so as not to clash with the museum environment. After installing the ISVS-A component, we installed and configured the ISVS-B system. For this, it was necessary to design and build a support device for the ergonomic placement of the Leap Motion sensor.
The custom supports for the Leap Motion sensor for the two proposed systems, ISVS-A and ISVS-B, were manufactured using Fused Deposition Modeling (FDM) on a 3D printer. Our main purpose was to enclose the Leap Motion sensor and position the custom support at an optimal height and orientation with respect to the existing glass display cases and pillar supports of the museum exhibition. The 3D-printed parts paired with the aluminum profile facilitate a good cable management solution. The main components of each custom support system are illustrated in Figure 3, along with the positioning and orientation of the Leap Motion sensor.
Figure 3. Custom support design for the interaction sensors: (a) ISVS-A system and (b) ISVS-B system.
These configurations have been validated, starting with the in situ implementation and testing phase, through the direct interaction of museum visitors. On this occasion, we received immediate feedback with respect to the usability of the system, mostly positive, along with suggestions for improvements. This confirmed that the proposed solution is suitable for the visiting public and is an important step forward in highlighting the museum exhibits in a 3D format.
The system was installed in full configuration after completing these initial steps and validating them in situ.
The museum staff have limited experience in operating computing technology; hence, the daily operation of the equipment, hardware and software, posed a challenge. We addressed this issue by automating the computers’ start-up and shutdown procedures, as follows:
  • The computers’ start-up time was set in the firmware, using the PC’s BIOS or UEFI;
  • Start-up commands were implemented to launch the applications automatically;
  • At the end of the working hours, the computers shut down automatically at a preset time, using the task scheduler mechanisms to set up regular shutdowns (an illustrative command is given after this list).
Through these mechanisms, we compensate for the low level of knowledge of the museum staff regarding the operation of computer systems and installed applications.
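As an illustration, on Windows 10 such a daily automatic shutdown can be registered with the built-in task scheduler from an administrative command prompt. The command below is a minimal sketch assuming a hypothetical 18:00 closing time and task name; the actual schedule and task names configured in the museum are not specified here:

```bat
rem Schedule a daily shutdown at 18:00, with a 60-second warning for any active user.
schtasks /Create /TN "ISVS-DailyShutdown" /SC DAILY /ST 18:00 /TR "shutdown /s /t 60"

rem Applications can be launched automatically at start-up by placing a shortcut
rem to them in the Startup folder (open it with: explorer shell:startup).
```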

3.2. Technical Aspects

Our interactive visualization solution dedicated to cultural heritage setups is suitable for two museum exhibit configurations. The first is adapted to the visualization of a large set of virtual artifacts, organized in groups of 5 artifacts per visualization session (denoted ISVS-A). The second is dedicated to the visualization of an entire edifice, e.g., a temple (denoted ISVS-B). While the software architectures of the two modules are identical, at the hardware level, due to the complexity of the 3D virtual environments, the two systems require slightly different setups, as shown in Table 2 and Table 3.
Table 2. Hardware configuration of ISVS-A system.
Table 3. Hardware configuration of ISVS-B system.
Both the ISVS-A (Figure 4) and ISVS-B (Figure 5) interactive visualization systems consist of a public viewing screen (Figure 4a and Figure 5a), a Leap Motion interaction sensor (Figure 4b and Figure 5b), which detects the position, orientation, and posture of the palm of the user’s right hand, and a processing unit, in the form of a laptop (Figure 4c) or a desktop PC (Figure 5c), placed in the immediate vicinity of the sensor and the viewing screen.
Figure 4. ISVS-A’s hardware architecture (A)—front view—highlighting visualization display (indicated with (a)) and LeapMotion interaction sensor (indicated with (b)); (B) back view—highlighting the process unit (indicated with (c)).
Figure 5. ISVS-B’s hardware architecture—highlighting visualization display (indicated with (a)), LeapMotion interaction sensor (indicated with (b)), and the process unit (indicated with (c)).
A software component, likewise called ISVS-A and ISVS-B, respectively, takes over the information detected by the Leap Motion interaction sensor (Figure 4b and Figure 5b), analyzes it on the computing unit (Figure 4c and Figure 5c), and displays a coherent visual response to the user’s gestures on the viewing screen (Figure 4a and Figure 5a).

3.3. Follow My Hand—Touchless Interaction Metaphor

We faced multiple challenges in choosing the interaction metaphor. On the one hand, COVID-19 imposed a distance-based solution, that is, touchless interaction and a physical distance of at least 1.5 m between persons within a small group of tourists and their guide. A second important constraint arises from the use of a single hand, the right one, and from enriching its gestures with different semantics, such as “turn left”, “turn right”, “look up”, “look down”, “rotate to the right”, “rotate to the left”, “move forward”, and “move backwards”, which apply to different virtual elements inside the 3D virtual environments, e.g., virtual artifacts or cameras.
In the following, we give some insights into how user gestures expressed in the real world are translated into avatar navigation inside the 3D virtual environment, into the visualization of virtual artifacts, and into the interaction with a selected artifact.

3.3.1. Interaction Metaphor

Let us start with the simplest hand gesture in the case of the ISVS-A system, used to visualize a set of virtual artifacts: presenting the open right hand in the sensor area and waving it horizontally (Figure 6a). In response, the entire scene rotates in the direction indicated by the user’s hand (Figure 7a(a1,a2)). If the user moves the open hand to the left, the entire scene rotates to the left; otherwise (i.e., if the user moves the open hand to the right), the entire scene rotates to the right. If the user retracts the open hand from the sensor area, the application continues to present the artifacts, rotating them continuously, until another user intervenes or it stops according to the museum schedule. The next day, at opening time, the application automatically starts presenting another set of artifacts, arbitrarily selected from the museum collection (see a short demonstration of the ISVS-A system: https://youtu.be/x0CqLyYd8TQ (accessed on 20 April 2022)).
Figure 6. ISVS-A interaction gestures (a)—interaction with entire 3D scene; (b)—interaction with the central artifact.
Figure 7. ISVS-A interaction gestures (a)—interaction with entire 3D scene (a1) rotating the entire scene to the left, (a2) selecting an artifact; (b)—interaction with the central artifact (b1) grabbing the selected artifact, (b2) rotating the selected artifact.
However, once the user closes the hand (Figure 6b), the application changes its state from “artifacts presentation” to “artifact interaction” mode (Figure 8). The rotation of the entire scene stops, and the central (so-called selected) artifact starts rotating, according to the waving/rotating direction of the closed hand (Figure 7b(b1,b2)). This time, if the user moves the closed hand to the left, the selected artifact rotates smoothly to the left; otherwise (i.e., if the user moves the closed hand to the right), the selected artifact rotates to the right. While the user keeps the hand closed, an explanatory text slides across the upper part of the visualization display, from right to left. This behavior is described as a finite state machine in Figure 8; a simplified sketch of this logic is given below the figure caption.
Figure 8. ISVS-A’s state machine describing the internal states of ISVS-A according to the user’s right-hand location and gesture.
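The following Python code is a simplified, hypothetical sketch of the state machine described above, written only to illustrate the interaction logic; the real system runs on the Godot engine with the Leap Motion driver, whose APIs are not reproduced here.

```python
from dataclasses import dataclass
from enum import Enum, auto

class State(Enum):
    PRESENTATION = auto()  # open hand (or no hand): the whole scene rotates
    INTERACTION = auto()   # closed hand: only the central (selected) artifact rotates

@dataclass
class HandSample:
    present: bool             # is the right hand in the sensor area?
    closed: bool              # is the hand closed (fist)?
    horizontal_motion: float  # <0 = moving left, >0 = moving right, 0 = still

class Rotatable:
    """Minimal stand-in for a rotatable scene node."""
    def __init__(self, name: str):
        self.name, self.angle = name, 0.0
    def rotate_y(self, delta: float):
        self.angle += delta

def step(hand: HandSample, scene: Rotatable, selected: Rotatable, dt: float) -> State:
    """Advance the ISVS-A interaction logic by one frame (sketch)."""
    if hand.present and hand.closed:
        # "Artifact interaction": the scene stops and the central artifact follows the hand.
        selected.rotate_y(hand.horizontal_motion * dt)
        return State.INTERACTION
    # "Artifacts presentation": an open hand steers the scene; with no hand it keeps rotating.
    direction = hand.horizontal_motion if hand.present else 1.0
    scene.rotate_y(direction * dt)
    return State.PRESENTATION

# Example: an open right hand moving left rotates the whole scene to the left.
scene, central = Rotatable("scene"), Rotatable("central artifact")
state = step(HandSample(present=True, closed=False, horizontal_motion=-0.5), scene, central, dt=1 / 60)
```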

3.3.2. Navigation Metaphor

For ISVS-B, we implemented a navigation metaphor, tackling the problem from various perspectives. The ISVS-B system provides two completely different visualization modes: one corresponding to a fully autonomous “360° virtual tour” of the edifice (the default mode of ISVS-B) and another corresponding to a “first person” perspective, completely controlled by the museum visitor (a controlled mode that requires the user’s hand to be present in the sensor area) (see a short demonstration of the ISVS-B system: https://youtu.be/hT2ChW7w4x8 (accessed on 20 April 2022)).
In the default mode, the ISVS-B system presents a 360° virtual tour of a hypothetical temple located in the city of Mangalia, Romania (the former Greek colony of Callatis), on the Black Sea coast (formerly known by its Latin name, Pontus Euxinus). The default camera is continuously focused on the temple and follows a closed path that circles the temple several times at different altitudes. Depending on the camera’s altitude, different building details of the temple are revealed to the user. This way, the user obtains not only a general view of the edifice, but also a detailed interior one, which shows each constructive element in its place and order (Figure 9).
Figure 9. The default view of the ISVS-B application: 360° virtual tour with a view of the constructive elements of the temple. No user interaction is required.
Once the user decides to take over the camera, all he/she has to do is place the open right hand in the sensor area for at least 5 s; the system then knows that a user is interested in interacting with it, so it switches the mode and the camera to a “first person” one (Figure 10). The first-person camera starts being controlled by the user only after the perspective has changed to “first person” and the user closes the right hand. If the user withdraws the right hand from the sensor area for more than 5 s, the system regains control of the camera and returns to the “360° virtual tour” mode.
Figure 10. User taking control in the ISVS-B application: default 360° virtual tour perspective (a); the user introduces the open right hand into the sensor area for less than 5 s (b); after 5 s, the camera is changed to a first-person one (c); and the user finally takes control of the camera by closing the right hand (d).
Once the user takes control of the first-person camera, he/she may move forward/backwards, turn the camera left/right/upwards/downwards. All these actions are schematically illustrated in Figure 11.
Figure 11. User’s movements in ISVS-B’s virtual environment using the closed right hand: moving forward/backward along the Oz axis (a); turning the camera left/right in the xOz plane (b); orienting the camera up/down in the vertical yOz plane (c).
The ISVS-B state machine, describing the internal states of ISVS-B according to the location and gesture of the user’s closed right hand, is shown in Figure 12. While the system is in the “first person” state (Figure 12), the user may experience free navigation by moving the closed right hand along the Oz axis, look around to the left/right by turning the right hand left/right, or look up/down by orienting the hand upwards/downwards. If one of these right-hand movements stops, the first-person camera reacts accordingly, by passing into the state that fits the posture and position of the right hand (a simplified sketch of this camera control is given below the figure caption).
Figure 12. ISVS-B’s state machine describing the internal states of ISVS-B according to the user’s right-hand location and gesture: the internal state of the VR system changes, triggered by the actions of the user, expressed in terms of natural hand gestures.
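As for ISVS-A, the mode switching and the closed-hand camera control can be summarized in a short sketch. The Python code below is a hypothetical simplification for illustration only: the 5 s threshold and the hand movements come from the description above, while the class and field names are our own and do not reflect the actual Godot implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

HAND_THRESHOLD_S = 5.0  # seconds of hand presence/absence needed to switch modes

class Mode(Enum):
    VIRTUAL_TOUR = auto()  # default autonomous 360-degree tour around the temple
    FIRST_PERSON = auto()  # camera controlled by the user's closed right hand

@dataclass
class Hand:
    present: bool
    closed: bool
    dz: float = 0.0     # movement along the Oz axis (forward/backward)
    yaw: float = 0.0    # hand turned left/right in the xOz plane
    pitch: float = 0.0  # hand oriented up/down in the yOz plane

@dataclass
class Camera:
    z: float = 0.0
    yaw: float = 0.0
    pitch: float = 0.0

@dataclass
class Navigator:
    mode: Mode = Mode.VIRTUAL_TOUR
    camera: Camera = field(default_factory=Camera)
    timer: float = 0.0

    def update(self, hand: Hand, dt: float) -> None:
        if self.mode is Mode.VIRTUAL_TOUR:
            # An open hand kept in the sensor area for 5 s hands control to the visitor.
            self.timer = self.timer + dt if hand.present else 0.0
            if self.timer >= HAND_THRESHOLD_S:
                self.mode, self.timer = Mode.FIRST_PERSON, 0.0
        else:
            # A hand absent for more than 5 s returns the system to the 360-degree tour.
            self.timer = self.timer + dt if not hand.present else 0.0
            if self.timer > HAND_THRESHOLD_S:
                self.mode, self.timer = Mode.VIRTUAL_TOUR, 0.0
            elif hand.present and hand.closed:
                # The closed right hand drives the first-person camera.
                self.camera.z += hand.dz * dt
                self.camera.yaw += hand.yaw * dt
                self.camera.pitch += hand.pitch * dt

# Example: after 5 s of open-hand presence, a closed hand moves the camera forward.
nav = Navigator()
nav.update(Hand(present=True, closed=False), dt=5.0)
nav.update(Hand(present=True, closed=True, dz=1.0), dt=1 / 60)
```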
Last but not least, the free navigation mode has a special internal state that is entered when a constructive element of the temple appears in the camera’s field of view, such as a column, the head (capital) of a column, the architrave, the tympanum, the temple frieze, and so on. Regardless of whether the architectural element is placed in the edifice reconstruction or not, the ISVS-B system highlights it with a bounding box and displays useful information about it to the user (Figure 13).
Figure 13. A constructive element detected in the camera’s field of view is highlighted by ISVS-B with a bounding box, and related information about it is displayed to the user.

3.4. User Study Design

After the deployment of the systems in the Callatis Museum of Mangalia, they were made readily available to museum visitors. Our goal was to investigate the visitors’ impressions after interacting with digital cultural heritage assets, both artifacts and an architectural edifice, by means of hand gestures.
Museum visitors were provided with descriptive user manuals to help them become familiar with the conceptual framework of the system. The members of the staff performed short demonstrations in order to showcase the basic functionality of the systems. The participants were also presented with possible interactions with the system by means of a video playing in a loop on large screens inside the museum.
The intention was to have users perform tasks freely, without any help, and then have them assess the usability of the applications. We imposed neither a time limit for the interaction nor a rigid set of tasks to be performed. At this point, we were interested in assessing the perceived usefulness of the system and the willingness of casual/regular museum visitors to use gesture-based VR technology.
Users were presented with a questionnaire at the end of their visit. The survey was created by the software development team and refined after being analyzed by museum employees at the Callatis Museum in Mangalia.
The items in the questionnaire were organized into three sections: the first collects demographic information as variables on a nominal scale, while the other two are dedicated to measuring the usability and, respectively, the degree of utility of the application and the satisfaction of the users; these two sections are composed of items measured on a five-point Likert scale, ranging from Strongly disagree (1) to Strongly agree (5) with the statement in the item. During the study, the survey was provided to users on paper, and it was also made available to museum visitors online, using the Google Forms platform (Google Forms platform, https://www.google.com/forms/about/ (accessed on 10 February 2022)), which provides a user-friendly interface and easy access to the collected survey data. The survey is provided as Supplementary Materials for this paper.

4. Results

The Callatis Museum of Mangalia hosted the interactive museum exhibition built around our VR-based software system, designed to integrate natural gesture interaction and allow visitors to interact with the digitized cultural heritage assets. All questionnaires were filled in on-site by visitors who were willing to express their impressions, during the study period of August 2021–January 2022. Only the answers that contained responses to all the required questions were considered for this study.

4.1. Demographic Profile

We collected information from 137 participants over the course of 6 months. The user group is heterogeneous from the age perspective (see Figure 14 and Figure 15). The age group of children under 11 years old represented less than 10% of the participants, as did the group of people over 65 years old, whereas almost 58% of the participants were 19–50 years old. Most of the respondents were university graduates (40%), and 20% of the participants were at most high-school graduates.
Figure 14. Demographic information: educational profile of users.
Figure 15. Demographic information: age profile of users.
The level of previous experience with touchless, natural gesture interaction is depicted in Figure 16, with 1 representing “Never” and 5 representing “Daily interaction”. Previous experience with VR/AR technologies is good for half of the respondents (with 1 representing “Zero experience” and 5 representing “Experienced user”).
Figure 16. Demographic information: the level of previous experience with gesture based and/or VR/AR technologies.
Most respondents visited the museum as part of visits organized by local schools and high-schools or other public institutions in the county, and most of them (118 respondents) had previously visited the museum. Participants were asked to express their impressions with respect to the improvements brought to the museum by the implementation of the interactive exhibition, with responses collected on the same five-point scale, ranging from 1—“Weak” to 5—“Excellent”. All the visitors were fairly impressed with the interactive exhibition, with 72% rating its influence on the overall museum experience as Excellent. In addition, respondents were asked to express their Word-of-Mouth intentions regarding spreading information about the museum and its interactive exhibition to friends and acquaintances, and the results are promising, as the majority provided affirmative answers.

4.2. Usability Evaluation

From the usability perspective, we wanted to assess the impressions of the participants regarding the quality and quantity of the information (Figure 17). Most visitors agreed that the system displayed consistent information, which was updated quickly in response to user actions (more than 65% strongly agreed on each of these items). An important issue was assessing user responses related to the natural gesture-based interaction, which is a central aspect of the exhibition. To this end, we found that even though almost half of the participants (45%) considered that using the application requires physical effort, the majority found the application easy to use (77.37%), and 81% of them strongly agreed that they enjoyed using the application overall (Figure 18).
Figure 17. Graphical representation for respondents’ answers to items regarding the quality and quantity of information.
Figure 18. Graphical representation for respondents’ answers to items regarding the effort used in the interaction.
A separate item inquired about the level of agreement with the statement “Interaction with the application requires a high intellectual effort”. As can be noticed in Figure 17, the majority of participants considered that the application did not require a high intellectual effort.

4.3. Utility and Satisfaction Evaluation

The last section of the questionnaire contained items that measured the degree of utility of the application. The application proved useful for creating a mental 3D image of the artifacts (Figure 19). The overall impressions of the visitors regarding the system installed in the museum were appreciative, with 88% considering that it made the museum more attractive. Respondents also expressed an intention to use VR/AR technologies in the future (76%).
Figure 19. Graphical representation for the summary of users’ answers to questions related to their satisfaction level.
The majority of visitors agreed that the application helps promote cultural heritage (82%) and helps people develop historical knowledge (85%) (Figure 19). Given that many visitors came from educational institutions in the county, the large share of strong agreement (88%) that the application is useful for educational purposes is particularly relevant (Figure 20).
Figure 20. Graphical representation for the assessment of the utility of the application, as perceived by the participants of the study.
Personal satisfaction with the application reached high levels in the user sample, regardless of the educational background of the user (Figure 21).
Figure 21. Summary of answers to item “I find the application easy to use”, grouped by Education level. The answers ranged from 1—“Strongly disagree” to 5—“Strongly agree” with the statement.
The VR exhibition sparked user interest in VR/AR applications, regardless of age or educational background. The application presents educational advantages, aiding the process of understanding history in a very practical, interactive manner. This study shows that museums can benefit from such software systems in order to increase the attractiveness of their exhibitions.

5. Discussion and Limitations

Museum artifacts and cultural objects are usually subject to restrictions with respect to their handling, partly due to preservation reasons, their uniqueness, and also to their frailty or inaccessibility. The systems presented in this paper use Virtual Reality replicas of museum artifacts and allow users to interact with the 3D reconstructed objects using natural hand gestures.
We were concerned about the potential skepticism with which users might approach the application. As the analysis above shows, the vast majority of users enjoyed using the application and found it easy to use. This result answers the first research question (RQ1) addressed in the study. With respect to RQ2, from analyzing the responses to the questionnaire presented in the previous section, we conclude that the implementation of our systems in the museum enhances the attractiveness of the museum (Figure 19).
The median score for the suitability of the system as a tool for the development of historical knowledge is 5, and the mode is also 5, expressing high satisfaction with respect to the historical education of the audience. The median agreement score for the item “The application allowed me to create a mental image of the visualized objects” was also 5, with a mode of 5, signifying overwhelmingly strong agreement on the part of the users. The answers are consistent. The same median and mode values of 5 support the strong agreement with the item “The application is useful for promoting cultural heritage”. We conclude from the user answers that they are satisfied with the experience of using the system, in response to RQ3. Overall, almost 90% of the visitors strongly agreed to a rising interest towards using VR/AR systems in the future, thus answering RQ4.
To conclude, the research questions that we investigated were answered in a positive manner by the museum visitors who agreed to fill in questionnaires during their visit to the museum.
The majority of the museum visitors who filled in questionnaires were older than 15 years (90%), with 80% of the visitors being adults and approximately 10% of the users being over 50 years old. All the users in the “over 65 years” and “11–15 years” categories provided positive answers with respect to their personal satisfaction with using the app (Figure 22 and Figure 23). The small sample of users in these age ranges is, however, a limitation of our study. In a future study, it would be interesting to assess the acceptance of our system in a larger sample of young users (<15 years old) and, likewise, in a larger sample of older users (>65 years old).
Figure 22. Summary of answers for the item “I find the application easy to use”, structured by age group. The answers ranged from 1—“Strongly disagree” to 5—“Strongly agree” with the statement.
Figure 23. Summary of answers for the item “I enjoyed using the app”, structured by age group. The answers ranged from 1—“Strongly disagree” to 5—“Strongly agree” with the statement.
From the point of view of the physical effort required to use the system, the opinions of the users were divided. The mean score of the responses to the item “I consider that the interaction with the system does not require too much physical effort” was 3.05, with a standard deviation of 1.91 (44.5% of users strongly agreed, and 44.5% of users strongly disagreed with the statement). Hence, even if the intellectual effort of using the system was deemed low, the physical effort may raise a concern.
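For reference, summary statistics of this kind can be computed directly from the per-respondent Likert scores. The snippet below is a generic Python sketch using made-up scores, not the data collected in this study:

```python
from statistics import mean, median, mode, stdev

# Hypothetical five-point Likert responses (1 = Strongly disagree ... 5 = Strongly agree);
# these values are illustrative and are not the responses collected in the study.
responses = [5, 1, 5, 1, 3, 5, 1, 5, 2, 4]

print(f"mean={mean(responses):.2f}, median={median(responses)}, "
      f"mode={mode(responses)}, sd={stdev(responses):.2f}")
```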
The user answers to the open items revealed that some users considered that they should have devoted more time to the museum visit or that the location is too cold; one user was not pleased with the placement of the system in too small a space, and another user considered the exposed artifacts to have “low brightness”. On the positive side, we received numerous appreciative comments, such as “I liked showcases, souvenirs, I learned new things about museum pieces”, while another user listed as positive aspects “Presentation mode, friendly staff, objects presented clearly and in detail”.

6. Conclusions and Future Work

Our study lies along the path of using new technologies for cultural heritage preservation, conservation, and dissemination. Our contribution consists of the comprehensible presentation of a full-scale approach to a complex software system that enables museum visitors to interact naturally with museum artifacts using basic hand gestures and to immerse themselves in a 3D virtual world that evokes the atmosphere of an ancient Roman temple. The virtualization of the archaeological artifacts and remains in Callatis is described in the paper; it was performed without any negative intervention upon them, ensuring their preservation in the museum while making them readily available for public manipulation, in a virtual sense. Our methodology can easily be extended to other museums, for any kind of exhibit for which 3D scanned models can be obtained.
Users interact with the artifacts in a natural fashion and with ease, enjoying the process.
We report our findings from a questionnaire-based study of the usability of the system, in which we were also concerned with the degree of utility of the system, as perceived by the users who interacted with it, as well as with their satisfaction with the system. Our analysis revealed that users see the system as a useful tool for learning history and for popularizing archaeological vestiges.
A direction for our research in the near future involves focusing on extending our solution with two-handed interaction metaphors for a single user. One possible application will enable the visitor to experience a personal cultural immersion by modeling virtual artifacts that extend the presented collection to the public. Later on, this metaphor will be further extended to encompass several users that interact in a collaborative task in the virtual environment, e.g., building a sword.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/app12094452/s1, Video S1: In Situ Visualization System (ISVS-A) Temple visualization title (mp4), Video S2: In Situ Visualization System (ISVS-B) 3D Scanned Artifacts Visualization (mp4), File S1: User survey (pdf).

Author Contributions

Conceptualization, D.-M.P., D.I. and R.C.; data curation, D.I., R.C., C.G.D.N. and E.B.; formal analysis, D.I. and E.B.; funding acquisition, D.-M.P., D.I., C.G.D.N. and E.B.; investigation, D.-M.P., R.C. and C.G.D.N.; methodology, D.I. and R.C.; project administration, D.-M.P.; resources, D.-M.P. and C.G.D.N.; software, D.-M.P.; supervision, D.-M.P.; validation, D.I., C.G.D.N. and E.B.; visualization, D.-M.P. and D.I.; writing—original draft, D.-M.P., R.C. and E.B.; writing—review and editing, D.-M.P., D.I., R.C., C.G.D.N. and E.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by the project “A future for the past-classification and capitalization of Callatian epigraphic and sculptural heritage” (Unique project code CALL01-9), project funded by the EEA Grants 2014-2021 under the RO-CULTURE Program, contract no. RO-CULTURA-A1-4/2020/31.01.2020, within the framework of “Supporting innovative exhibitions with restored movable cultural assets” call. This work was also supported by the Romanian Ministry of Research and Innovation grant, CCCDI-UEFISCDI, project number TE 132 15/09/2020-PN-III-P1-1.1-TE-2019-2203 (Atop), within PNCDI III.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank the Callatis Museum of Mangalia, Romania, for providing historical and scientific counseling, and for their support in completing the questionnaires.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Bekele, M.K.; Pierdicca, R.; Frontoni, E.; Malinverni, E.S.; Gain, J. A survey of augmented, virtual, and mixed reality for cultural heritage. J. Comput. Cult. Herit. JOCCH 2018, 11, 1–36. [Google Scholar] [CrossRef]
  2. Barsanti, S.G.; Caruso, G.; Micoli, L.L.; Covarrubias, M.; Guidi, G. 3D visualization of cultural heritage artefacts with virtual reality devices. In Proceedings of the 25th International CIPA Symposium, Taipei, Taiwan, 31 August–4 September 2015; Copernicus Gesellschaft GmbH: Göttingen, Germany, 2015; Volume 40, pp. 165–172. [Google Scholar]
  3. Shrestha, S.; Chakraborty, J.; Mohamed, M.A. A comparative pilot study of historical artifacts in a CAVE automatic virtual reality environment versus paper-based artifacts. In Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct, Florence, Italy, 6–9 September 2016; pp. 968–977. [Google Scholar]
  4. Demertzis, N.; Ron, E. COVID-19 as cultural trauma. Am. J. Cult. Sociol. 2020, 8, 428–450. [Google Scholar] [CrossRef] [PubMed]
  5. Škorupová, M. Alternative activities during a pandemic period in the sphere of cultural institutions. EUREKA Soc. Humanit. 2020, 3, 20–26. [Google Scholar] [CrossRef]
  6. Chong, H.; Lim, C.; Ahmed, M.; Tan, K.; Mokhtar, M. Virtual reality usability and accessibility for cultural heritage practices: Challenges mapping and recommendations. Electronics 2021, 10, 1430. [Google Scholar] [CrossRef]
  7. Moore, N.; Dempsey, K.; Hockey, P.; Jain, S.; Poronnik, P.; Shaban, R.Z.; Ahmadpour, N. Innovation During a Pandemic: Developing a Guideline for Infection Prevention and Control to Support Education Through Virtual Reality. Front. Digit. Health 2021, 3, 74. [Google Scholar] [CrossRef]
  8. Komarac, T.; Đurđana, O.D. Discovering the determinants of museum visitors’ immersion into experience: The impact of interactivity, expectations, and skepticism. Curr. Issues Tour. 2021, 3, 1–19. [Google Scholar] [CrossRef]
  9. Zara, J. Virtual reality and cultural heritage on the web. In Proceedings of the 7th International Conference on Computer Graphics and Artificial Intelligence, Limoges, France, 12–13 May 2004; Volume 330. [Google Scholar]
  10. Disztinger, P.; Stephan, S.; Aleksander, G. Technology acceptance of virtual reality for travel planning. In Information and Communication Technologies in Tourism 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 255–268. [Google Scholar]
  11. Pujol-Tost, L. “3d Cod”: A New Methodology for the design of Virtual Reality-Mediated experiences in digital Archeology. Front. Digit. Humanit. 2017, 4, 16. [Google Scholar] [CrossRef]
  12. Ciurea, C.; Filip, F.G. Virtual exhibitions in cultural institutions: Useful applications of informatics in a knowledge-based society. Stud. Inform. Control 2019, 28, 55–64. [Google Scholar] [CrossRef]
  13. Peinado-Santana, S.; Patricia, H.-L.; Jorge, B.-L.; Beatriz, C.-A.; José, A.M.-C. Public Works Heritage 3D Model Digitisation, Optimisation and Dissemination with Free and Open-Source Software and Platforms and Low-Cost Tools. Sustainability 2021, 13, 13020. [Google Scholar] [CrossRef]
  14. Kargas, A.; Loumos, G.; Varoutas, D. Using different ways of 3D reconstruction of historical cities for gaming purposes: The case study of Nafplio. Heritage 2019, 2, 1799–1811. [Google Scholar] [CrossRef]
  15. Ashrafi, B.; Neugebauer, C.; Kloos, M. A Conceptual Framework for Heritage Impact Assessment: A Review and Perspective. Sustainability 2022, 14, 27. [Google Scholar] [CrossRef]
  16. Chung, N.; Inessa, T.; Seung, J.L. Eco-innovative museums and visitors’ perceptions of corporate social responsibility. Sustainability 2019, 11, 5744. [Google Scholar] [CrossRef]
  17. Pop, I.L.; Borza, A.; Buiga, A.; Ighian, D.; Toader, R. Achieving cultural sustainability in museums: A step toward sustainable development. Sustainability 2019, 11, 970. [Google Scholar] [CrossRef]
  18. Bachmann, D.; Weichert, F.; Rinkenauer, G. Evaluation of the Leap Motion Controller as a New Contact-Free Pointing Device. Sensors 2014, 15, 214–233. [Google Scholar] [CrossRef]
  19. Zhang, Z. Microsoft kinect sensor and its effect. IEEE Multimed. 2012, 19, 4–10. [Google Scholar] [CrossRef]
  20. Franczuk, J.; Boguszewska, K.; Parrinello, S.; Dell’Amico, A.; Galasso, F.; Gleń, P. Direct use of point clouds in real-time interaction with the cultural heritage in pandemic and post-pandemic tourism on the case of Kłodzko Fortress. Digit. Appl. Archaeol. Cult. Herit. 2022, 24, e00217. [Google Scholar] [CrossRef]
  21. Vosinakis, S.; Koutsabasis, P.; Makris, D.; Sagia, E. A Kinesthetic Approach to Digital Heritage Using Leap Motion: The Cycladic Sculpture Application. In Proceedings of the 2016 8th International Conference on Games and Virtual Worlds for Serious Applications (VS-GAMES), Barcelona, Spain, 7–9 September 2016; pp. 1–8. [Google Scholar]
  22. Kyriakou, P.; Hermon, S. Can I touch this? Using Natural Interaction in a Museum Augmented Reality System. Digit. Appl. Archaeol. Cult. Herit. 2019, 12, e00088. [Google Scholar] [CrossRef]
  23. Manghisi, V.M.; Uva, A.E.; Fiorentino, M.; Gattullo, M.; Boccaccio, A.; Monno, G. Enhancing user engagement through the user centric design of a mid-air gesture-based interface for the navigation of virtual-tours in cultural heritage expositions. J. Cult. Herit. 2018, 32, 186–197. [Google Scholar] [CrossRef]
  24. Skublewska-Paszkowska, M.; Powroznik, P.; Smolka, J.; Milosz, M.; Lukasik, E.; Mukhamedova, D.; Milosz, E. Methodology of 3D Scanning of Intangible Cultural Heritage—The Example of Lazgi Dance. Appl. Sci. 2021, 11, 11568. [Google Scholar] [CrossRef]
  25. Okanovic, V.; Ivkovic-Kihic, I.; Boskovic, D.; Mijatovic, B.; Prazina, I.; Skaljo, E.; Rizvic, S. Interaction in eXtended Reality Applications for Cultural Heritage. Appl. Sci. 2022, 12, 1241. [Google Scholar] [CrossRef]
  26. Drossis, G.; Birliraki, C.; Stephanidis, C. Interaction with immersive cultural heritage environments using virtual reality technologies. In Communications in Computer and Information Science, Proceedings of the 20th International Conference, HCI International 2018, Las Vegas, NV, USA, 15–20 July 2018; Springer: Cham, Switzerland, 2018; pp. 177–183. [Google Scholar]
  27. Galdieri, R.; Marcello, C. Natural interaction in virtual reality for cultural heritage. In Communications in Computer and Information Science, Proceedings of the International Conference on VR Technologies in Cultural Heritage, Brasov, Romania, 29–30 May 2018; Springer: Cham, Switzerland, 2018; pp. 122–131. [Google Scholar]
  28. Trunfio, M.; Lucia, M.D.; Campana, S.; Magnelli, A. Innovating the cultural heritage museum service model through virtual reality and augmented reality: The effects on the overall visitor experience and satisfaction. J. Herit. Tour. 2022, 17, 1–19. [Google Scholar] [CrossRef]
  29. Li, Y.; Eugene, C.; Shengdan, C.; Simon, S. Multiuser interaction with hybrid VR and AR for cultural heritage objects. In Proceedings of the 2018 3rd Digital Heritage International Congress (Digital HERITAGE) Held Jointly with 2018 24th International Conference on Virtual Systems & Multimedia (VSMM 2018), San Francisco, CA, USA, 26–30 October 2018; pp. 1–8. [Google Scholar]
  30. Losada, N.; Jorge, F.; Teixeira, M.S.; Melo, M.; Bessa, M. Could virtual reality substitute the ‘real’ experience? Evidence from a UNESCO world heritage site in Northern Portugal. In International Conference on Tourism, Technology and Systems; Springer: Singapore, 2020; pp. 153–161. [Google Scholar]
  31. Dias, P.; Luis, A.; Sérgio, E.; Beatriz, S.S. Mobile devices for interaction in immersive virtual environments. In Proceedings of the 2018 International Conference on Advanced Visual Interfaces, Grosseto, Italy, 29 May–1 June 2018; pp. 1–9. [Google Scholar]
  32. Marques, B.; Raphael, C.; Paulo, D.; Beatriz, S.S. Pervasive augmented reality for indoor uninterrupted experiences: A user study. In Proceedings of the 2019 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2019 ACM International Symposium on Wearable Computers, London, UK, 9–13 September 2019; pp. 141–144. [Google Scholar]
  33. Madeira, T.; Marques, B.; Neves, P.; Dias, P.; Santos, B.S. Comparing Desktop vs. Mobile Interaction for the Creation of Pervasive Augmented Reality Experiences. J. Imaging 2022, 8, 79. [Google Scholar] [CrossRef] [PubMed]
  34. Monteiro, P.; Guilherme, G.; Hugo, C.; Miguel, M.; Maximino, B. Hands-free interaction in immersive virtual reality: A systematic review. IEEE Trans. Vis. Comput. Graph. 2021, 27, 2702–2713. [Google Scholar] [CrossRef] [PubMed]
  35. Marto, A.; Alexandrino, G.; Miguel, M.; Maximino, B. A survey of multisensory VR and AR applications for cultural heritage. Comput. Graph. 2022, 102, 426–440. [Google Scholar] [CrossRef]
  36. Neamțu, C.; Vitalie, B.; Zsolt, B. Promoting and Capitalizing on the Vestiges from Sarmizegetusa Regia by Modern Multimedia Methods. PLURAL Hist. Cult. Soc. 2020, 1, 150–173. [Google Scholar]
  37. Carrozzino, M.G.-D.; Voinea, M.; Duguleană, R.; Boboc, G.; Bergamasco, M. Comparing innovative xr systems in cultural heritage. A case study. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing & Spatial Information Sciences, Milan, Italy, 8–10 May 2019. [Google Scholar]
  38. Calin, M.; Damian, G.; Popescu, T.; Manea, R.; Erghelegiu, B.; Salagean, T. 3D modeling for digital preservation of Romanian heritage monuments. Agric. Agric. Sci. Procedia 2015, 6, 421–428. [Google Scholar] [CrossRef]
  39. Popovici, D.M.; Bogdan, C.M.; Polceanu, M.; Querrec, R. Applying of an Ontology based Modeling Approach to Cultural Heritage Systems. Adv. Electr. Comput. Eng. 2011, 11, 105–110. [Google Scholar] [CrossRef]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
