Article

Visiting Heritage Sites in AR and VR

by Zacharias Pervolarakis 1, Emmanouil Zidianakis 1, Antonis Katzourakis 1, Theodoros Evdaimon 1, Nikolaos Partarakis 1,*, Xenophon Zabulis 1 and Constantine Stephanidis 1,2

1 Institute of Computer Science, Foundation for Research and Technology Hellas (ICS-FORTH), N. Plastira 100, Vassilika Vouton, 70013 Heraklion, Greece
2 Computer Science Department, University of Crete, Voutes Campus, 70013 Heraklion, Greece
* Author to whom correspondence should be addressed.
Heritage 2023, 6(3), 2489-2502; https://doi.org/10.3390/heritage6030131
Submission received: 6 February 2023 / Revised: 20 February 2023 / Accepted: 21 February 2023 / Published: 24 February 2023

Abstract

Advances in digitization technologies have made possible the digitization of entire archaeological sites through a combination of technologies, including aerial photogrammetry, terrestrial photogrammetry, and terrestrial laser scanning. At the same time, the evolution of computer algorithms for data processing and the increased processing power made possible the combination of data from multiple scans to create a synthetic representation of large-scale sites. Finally, post-processing techniques and the evolution of computer and mobile GPUs and game engines have made possible the exploitation of digitization outcomes to further scientific study and historical preservation. This route was opened by the gaming industry. In terms of research, the exploitation of these new assets in conjunction with new visual rendering technologies, such as virtual and augmented reality, can create new dimensions for education and leisure. In this paper, we explore the usage of large-scale digitization of a heritage site to create a unique virtual visiting experience that can be accessed offline in VR and AR and on-site when visiting the archaeological site.

1. Introduction

Making large-scale heritage sites accessible through digital technology is still a major challenge today. In this work, the challenge was providing multimodal access to the digitization of the Palace of Knossos and its peripheral sites in collaboration with the Ephorate of Antiquities of Heraklion. Knossos is best known for its monumental palace, the so-called Minos Palace [1,2,3,4], excavated by Arthur Evans [4]. The archaeological site and its peripheral locations form a complex structure with multiple indoor and outdoor spaces, complemented by modern additions made as part of the restoration process. The digitization of the site was a great challenge in itself and resulted in a complete set of registered and fully exploitable 3D digital assets. In this work, we build on this digitization to offer novel means of exploration.
In this work, we use the term multimodal access as an umbrella term for the following 3D rendering and interaction scenarios: (a) standard interaction with a computer and mouse using GPU-based rendering, (b) a VR headset and VR controllers for interactive navigation in the heritage site, and (c) mobile augmented reality that maps the position and orientation of the mobile phone to the virtual spaces and supports POI activation through touch interaction. The result of this work is an innovative, interactive digital tour guide [5] currently published on mobile app stores.

2. Related Work

The term virtual exhibition (VE) is used in the domain of digital cultural heritage (DCH) to describe a variety of technical solutions, interactions, and immersion styles. In the 2000s, the majority of VEs were web-based [6,7], and from the early 2010s, basic guidelines for creating interesting and compelling VEs were contributed [8,9,10]. In parallel, digital technology explored ways of enhancing the museum experience through on-site and mixed reality (MR) VEs [11,12,13], authoring environments for web-based virtual museums [14,15], and authoring web-based virtual environments to provide a synthetic representation of cultural heritage (CH) subjects including intangible dimensions [16,17].

2.1. Virtual Reality and Cultural Heritage

Using virtual reality technologies in the CH context is not new, as several approaches have been proposed over the past two decades. Starting from CAVE-based virtual reality, researchers have proposed approaches that include both immersive presentation through VR and haptic-based manipulation of heritage objects (e.g., [18,19]). The profound benefits of interacting with CH in VR gave birth to several new approaches that merged 3D reconstruction technologies with VR. By employing 3D reconstruction, realistic digital replicas of CH objects were implemented and integrated into VR experiences (e.g., [20,21]). In earlier approaches to CH presentation, digitization was not possible due to the immaturity of the technology and the restrictions of rendering hardware, so scenes from archaeological sites were modeled from scratch in 3D (e.g., [22,23]). This, of course, resulted in lower-quality 3D models but enabled researchers to complement the reality of the heritage site (the structural remains) with digitally manufactured structures and, thus, provide a digital restoration of the monument (e.g., [24,25]). These works went even further by simulating the weather and daily life in ancient CH sites through the graphics-based rendering of nature and autonomous virtual humans. The evolution of VR devices, with the emergence of commercial VR headsets and controllers, greatly simplified the implementation of VR-based experiences (e.g., [26,27]). At the same time, 360° photography and 360° video made possible another form of virtual reality through inexpensive VR headsets that could be mounted on smartphones. Such approaches were further augmented by including information points and interactive spots within 360° videos that could be activated using more advanced interaction technology, such as an Oculus headset and controllers (e.g., [28,29,30,31]).
Furthermore, studies focused on the resource-demanding task of streaming 360 videos in such headsets (e.g., [32,33]). From a sustainability perspective, VR is proposed as an alternative means of access to endangered CH sites that, due to visiting pressure, would benefit through the redirection of visits to digital media (e.g., [34]).

2.2. Augmented Reality and Cultural Heritage

AR has been the subject of continuous research throughout the years, and the algorithms used have kept evolving, thus contributing to its potential. AR research provides clues that it can enhance learning as a consequence of multiple key features that are otherwise missing from common educational means [35]. In the work of Irwansyah et al. [36], it was shown that learning enhanced with AR can increase the overall experience of students in school-related subjects, such as chemistry, or even strengthen the learning experience of young children in cultural heritage sites [37]. A fascinating work by M. Claudia et al. [38] explored the value of AR for cultural heritage sites using the stakeholder approach. By conducting an exploratory study on museum stakeholders, personnel, and focus groups, they reported that there are numerous perceived value dimensions of AR within the cultural heritage tourism context for stakeholders, including economic, experiential, social, epistemic, historical and cultural, and educational value.
Mobile AR research started by integrating feature extraction algorithms into mobile phones, using camera input for image acquisition (e.g., [39]). Other approaches used more advanced mobile devices (then called PDAs) to augment digital scenes with richer information, including virtual humans (e.g., [40]). More recent approaches employed the increased processing power of modern mobile phones to provide various forms of AR, such as augmenting the mobile device's camera images with information (e.g., [41,42]), including the blending of 3D digitizations with the camera input (e.g., [43]). Other approaches blend the virtual and the physical by replacing the physical remains of a heritage site with a digitally enhanced version of the site at the time of its creation ([44]). Last but not least, physical objects have been used to support the visualization of and interaction with archaeological artifacts in AR (e.g., [45,46]).
The wide availability of mobile devices in the context of CH has opened a new world of opportunities and expanded its usage in other contexts, such as in the domain of teaching tangible and intangible CH (e.g., [47,48,49]).

2.3. Mixing Augmented and Virtual Reality

In the last few years, mobile devices have become able to support larger and computationally heavier AR scenes, enabling a new trend called “AR Portals”. The concept of an AR Portal application is that, by using AR features currently supported on mobile devices, such as plane detection, the user can spawn a portal (or door) to another world and, by walking through the portal, is transported into that world. Once transported, the user can roam and freely explore the world by moving and rotating the mobile device. An application called “The Historical Figures AR” allows its users to walk through a portal and visit multiple sites of historical importance, including Albert Einstein's lecture hall, Marie Curie's laboratory, and others [50]. These scenes are not historically accurate and are freely stylized for visual aesthetics, but the application demonstrates the potential of AR Portals.
Further approaches are proposed by augmenting the physical location with digital information and supporting alternative forms of interaction through the manipulation of physical objects as interactive devices exploited mainly in the context of physical museum installations rather than archaeological sites (e.g., [51]).

2.4. Progress Achieved by this Work

Overall, during the past two decades, an extensive amount of work has targeted technologies that augment the way we perceive, visit, interact with, and learn about cultural heritage. This has inevitably resulted in a plethora of technologies on both the hardware and software sides [52,53]. In this work, we focus on creating widely accessible, high-quality CH content by simplifying the implementation cycle while simultaneously targeting mobile AR, virtual reality, and desktop-based presentation and interaction. Since great effort is currently focused on making digitizations suitable for these technologies (e.g., [54,55]), we perform a high-quality optimization of the digitization outcomes only once, ensuring that the optimized result is suitable for all targeted platforms. To do so, we emphasize texture quality and reduce mesh complexity, thus delivering visually convincing results on all platforms. Then, we choose a cross-platform game engine for the implementation and develop the application core in a cross-platform manner, minimizing platform-specific implementations [56]. As a result, only the device- and interaction-specific functionalities need to be adjusted to achieve cross-platform access. With this strategy, this work targets Android-based AR-capable devices [57], iOS devices [58], Oculus Quest headsets [28], and standard desktop computers with mainstream GPUs.

3. Design and Method

This work builds on the digitization of the heritage site, which is extremely important for providing a robust cross-platform model in the form of a reusable 3D asset for immersive application development. Starting from that, we then present the selection of the game engine most suitable for the needs of this work, and we conclude with asset handling in the game engine and mobile device optimizations.

3.1. Digitization of the Heritage Site

For digitization, a combination of technologies was used, including aerial photogrammetry [59] and terrestrial laser scanning [60,61]. For aerial photogrammetry, several flights had to be conducted to extract the 3D outdoor models. The flights were conducted at the lowest possible altitude to obtain the highest possible photo resolution; the lowest flight altitude was 10 m, and the highest was 30 m. For the terrestrial scans and scans of indoor spaces, a Faro Focus M70 laser scanner [62] was used. The scanner was placed at many different points, based on a placement plan, to achieve full coverage. The data were complemented with panoramic photos and photos documenting several locations of the archaeological site.

3.2. Digitization Post-Processing

Since digitization techniques such as photogrammetry and laser scanning were used, it was impossible to import the raw reconstruction results directly into the application: 3D reconstruction techniques tend to output models with highly unoptimized mesh topology and millions of vertices. As a result, various optimizations were performed to produce a usable model suited to the performance standards of AR and VR. Initially, all laser scans and the aerial photogrammetry meshes had to be unified into one mesh suitable for AR and VR applications. The unification of all the scans enabled the creation of high-resolution textures (16k, ~268.4 megapixels) with approximately isotropic texture distribution following an image-stacking approach. For previewing purposes in the 3D viewer, a 4k-downscaled variation of the textures was used, but for the final synthesis in the compositor function, the original 16k textures were used. The detailed digitization process and the proposed methodology are presented in [63]. The results of the aforementioned process are available through a Zenodo dataset [64]. Table 1 presents the details of the optimizations in terms of mesh size, comparing the initial registered mesh scans with the final optimized meshes used in AR and VR.
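As a rough sanity check of the texture figures above, a square "16k" texture of 16384 × 16384 texels works out to about 268.4 megapixels, and the 4k preview variant carries 1/16 of that texel count. A minimal sketch (the exact tiling and compression used in the pipeline may differ):

```python
def megapixels(side: int) -> float:
    """Pixel count of a square side x side texture, in megapixels (10^6 pixels)."""
    return side * side / 1e6

mp_16k = megapixels(16384)  # a "16k" texture is 16384 x 16384 texels, ~268.4 MP
mp_4k = megapixels(4096)    # the downscaled preview variant used in the 3D viewer

ratio = mp_16k / mp_4k      # 4x smaller per side -> 16x fewer texels overall
```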

3.3. Selection of the Appropriate Game Engine

Since the goal of this work was to provide interaction through three alternative modalities, i.e., personal computers, Android- and iOS-based mobile devices, and Oculus devices, the game engine was selected based on its cross-platform capabilities. The engine chosen was Unity [65], a multipurpose engine that allows for quick iterations and high-fidelity 3D rendering on multiple devices. In particular, the Universal Render Pipeline (URP) was used, which includes a node-based shader network and eliminates the need to write scripts directly for the graphics processing unit (GPU). The basic implementation in the case of AR rendering was complemented by integrating ARCore [66], a library that offers a variety of features for AR; in this work, we used its camera-tracking facility.

3.4. Using Digitizations in Unity Cross-Platform Apps

Apart from the post-processing described above, which was performed to implement a single registered model, further post-processing within Unity was required. Besides the mesh and the texture provided by the reconstruction and optimization methods, each model required a set of box colliders so that the user can walk and explore freely while being blocked by walls. Additionally, it was decided that each archaeological site would have information bubbles so that, when the user clicks on them, information about the related point appears on the screen. Therefore, each site required a “package” that contained (a) the mesh, (b) the texture, (c) the set of box colliders, and (d) the positions and indices of the information points. In Unity, such a package is called a prefab. Prefabs, though, do not contain the actual data; they only contain references to files in the project and how they are placed within the prefab. To export this information into an actual file that contains the data, the prefab needs to be built into an AssetBundle, another Unity file format. Unity provides a 3D editor in which the transform properties and placement of objects inside the scene can be edited. In this way, a prefab was created for each site following the diagram shown in Figure 1.
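The per-site "package" described above can be sketched as a plain data structure. This is an illustrative Python sketch, not the actual prefab/AssetBundle schema; all field names and file paths are hypothetical:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class BoxCollider:
    center: tuple  # (x, y, z) in the site's local space
    size: tuple    # extents along each axis

@dataclass
class InfoPoint:
    index: int       # which information text this point shows
    position: tuple  # where the information bubble is placed

@dataclass
class SitePackage:
    """One site's 'package': mesh, texture, colliders for walk
    navigation, and the clickable information points."""
    mesh_path: str
    texture_path: str
    colliders: list
    info_points: list

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)

# Hypothetical content for one site; paths are illustrative only.
throne_room = SitePackage(
    mesh_path="meshes/throne_room.fbx",
    texture_path="textures/throne_room_16k.png",
    colliders=[BoxCollider(center=(0, 1, 0), size=(4, 2, 0.2))],
    info_points=[InfoPoint(index=0, position=(1.5, 1.2, 0.5))],
)
```

In Unity terms, the dataclass plays the role of the prefab (references plus placement), and serializing it corresponds to baking the prefab into an AssetBundle.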

4. Implementation

To support the multimodal presentation of the CH site, we followed a strategy that allowed us to keep the main core of the virtual tour, implemented in Unity 3D, the same, adapting only the world set-up and the interaction metaphors to each specific variation of the tour. As a result, the virtual 3D world remained the same, and only the way the user navigated and the modality used for interaction were altered. In this paper, this is presented starting from the application core, followed by the AR implementation, which is the most complex one. Then, we present the required adaptations for VR. Last but not least, the desktop implementation is presented as the simplest alternative, where the mouse point-and-click metaphor is used for user interaction and navigation.

4.1. Application Core

The application core consists of four Unity3D [65] scenes in total. Unity allows multiple scenes, each organized individually and serving its own purpose. Of these, “Introduction” (Figure 2a), “Map 2D” (Figure 2b), “Settings”, and “Information” (Figure 2c) contain a canvas element that is used for UI rendering and navigation.
Introduction: In this scene, the purpose and functionality of the application are introduced in a common multistep fashion. On first launch, the app also sets its language to the device's default system language. Only English and Greek are supported at the time of writing.
Map 2D: In this scene, the user selects a point of interest to learn more about: they can pan and zoom a multilayered map of Knossos and click on pins representing the points of interest to load their information pages.
Information: After clicking on a pin on either the 2D or the AR map, the user is transferred to the information page of the point of interest. On this page, the user can read brief information about the point or click a button to be redirected to the official webpage of Knossos to learn more. Twelve points in total also offer a “Virtual Tour” option. By clicking this button, the user is redirected to the “Scene Loader” scene, where the corresponding asset bundle begins downloading. All the information and metadata required for this page are stored in a JSON-formatted file in the Resources folder of the Unity project.
Settings: On the settings page, the user can change the language of the application and be redirected to the following websites: (a) the Ephorate of Antiquities of Heraklion, (b) the terms and conditions of using the application, and (c) the official page of FORTH, which designed and developed the application.

4.2. AR Application

To virtually augment the real world, a mobile device with AR capabilities first needs to register itself in 3D space and constantly update its position. This is required so that if, for example, a virtual ball is augmented onto the floor of a room, the ball gives the impression that it stays in the same physical position while the user moves and rotates the device; it is as if the ball were really there, unaffected by the user's movement. The most common way to achieve this in mobile applications is to use the camera input to extract and compare key features of each image. By tracking an individual key feature across two frames, the software can infer how the device moved relative to that feature in 3D space. Doing so for multiple features in each image yields accurate camera tracking. Even this bare minimum of AR functionality drains considerable processing power from the device; hence, the application needs to be properly optimized to achieve an acceptable frames-per-second (FPS) count.
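The tracking idea described above, comparing where the same key features land in two consecutive camera frames, can be illustrated with a toy 2D estimate. This is only a conceptual sketch; real AR tracking (e.g., in ARCore) works in 3D, fuses inertial sensor data, and is far more involved:

```python
def mean_displacement(features_a, features_b):
    """Average 2D shift of matched feature points between two frames.

    features_a, features_b: lists of (x, y) positions of the SAME
    features in frame A and frame B, in matching order.
    """
    if len(features_a) != len(features_b) or not features_a:
        raise ValueError("need equally sized, non-empty matched feature lists")
    n = len(features_a)
    dx = sum(b[0] - a[0] for a, b in zip(features_a, features_b)) / n
    dy = sum(b[1] - a[1] for a, b in zip(features_a, features_b)) / n
    return dx, dy

# If every tracked feature shifted 5 px to the right between frames,
# the camera most likely panned to the left.
frame_a = [(10, 20), (30, 40), (50, 60)]
frame_b = [(15, 20), (35, 40), (55, 60)]
shift = mean_displacement(frame_a, frame_b)
```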
Navigation is accomplished using either “Map 2D” or “Map AR”; the latter is an additional component added specifically for the AR implementation. From the “Map AR” page, the user can scan the physical map of Knossos installed at the heritage site to augment it with 3D-labeled pins for each point of interest (Figure 3a). Then, when a pin is selected, the “Scene Loader” component loads the scene in AR mode. To do so, “Scene Loader” uses the metadata passed from the “Information” page to load and store the specified asset bundle. It begins with a loading screen presenting the selected site and, when everything is ready, spawns the user at the designated starting position. If it is the first time the user enters the “Scene Loader” scene, two short navigation tips are presented in the form of small popup windows, as shown in Figure 3b and Figure 4a. At this point, the user is in AR and can freely move their device; the user's movements have a one-to-one correspondence with the virtual environment. In addition, a small joystick on the bottom right of the screen allows the user to walk longer distances than their physical space would allow. Finally, the user can click on information stops, indicated with a circle and the letter “i”, to read more information about the specific location. If the user loads the same site again at a later point, it does not need to be downloaded, since it is already stored in the device's local storage. The overall application flow is presented in Figure 4b.
Regarding storage on mobile devices, storing everything on the device is not a scalable solution. Each asset bundle for an archaeological site is approximately 40 to 80 MB. This is acceptable for desktop-based applications but poses severe requirements on the mobile targets: including all assets would make the mobile application require between 1 and 1.5 GB of storage. Therefore, the asset bundles are loaded remotely through a web server upon starting the virtual tour of a site. Additionally, when an asset bundle is downloaded, it is saved to the device's local storage; as a result, when the same site is loaded again, it does not need to be downloaded a second time. A problem worth mentioning with this approach is that separate asset bundles need to be built for each platform that the application targets. For example, our application supports Android and iOS; therefore, two asset bundles need to exist for every site, one built for Android and a different one built for iOS.
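The download-once caching behaviour, keyed per site and per platform, can be sketched as follows. Function names, cache layout, and the injected downloader are placeholders; the actual app performs this through Unity's AssetBundle APIs:

```python
import os

def fetch_bundle(site_id: str, platform: str, cache_dir: str, download) -> str:
    """Return a local path to the site's asset bundle, downloading it
    only on the first request.

    Bundles are platform-specific (e.g. "android" vs "ios"), so the
    cache file name includes the platform. `download` is a callable
    returning the bundle bytes from the server (injected so the
    network layer can be swapped out or faked in tests).
    """
    os.makedirs(cache_dir, exist_ok=True)
    local = os.path.join(cache_dir, f"{site_id}_{platform}.bundle")
    if not os.path.exists(local):      # first visit: fetch and cache
        data = download(site_id, platform)
        with open(local, "wb") as f:
            f.write(data)
    return local                       # later visits: served from the cache
```

Because the platform is part of the cache key, an Android bundle for a site never shadows the iOS bundle for the same site.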
The app was submitted to and accepted on both Google Play [67] and the App Store [68]. Examples of the digitizations and the AR app can also be accessed in the form of rendered videos through Zenodo (https://doi.org/10.5281/zenodo.7431760 (accessed on 15 January 2023)).

4.3. VR App

For the VR app, the application core remains the same. In addition, Unity's own VR library, the XR SDK, was the main library used [69]. Using the XR SDK greatly accelerated the development cycle of the VR app, and broad portability between different HMDs was easily achieved through Unity's multi-platform support. While developing the application, the Oculus (Meta) Quest 2, currently the industry-standard and most affordable stand-alone VR headset, was the main testing unit.
VR development is notoriously demanding in terms of performance optimization, since every frame has to be rendered twice: one frame for each of the user's eyes, to simulate spatial depth [70]. Rendering everything twice carries a substantial performance cost, which means compromising on graphical fidelity and optimizing as much as possible. Low frame rates or frame drops in VR applications may cause the user to feel nauseous and ultimately create negative feelings both towards the tour of the site and the experience of VR as a whole [71]. Fortunately, this optimization had already been performed when preparing the digitization for AR rendering; therefore, no further post-processing was required, and the VR app was able to achieve a sufficient frame rate while rendering the virtual scenes.
Of course, not all scenes could be loaded at once while sustaining the same frame rate. Our solution was to divide and conquer: we reused the split AR scenes, each representing a “game level” in the VR application. To traverse between scenes, the user enters portals or uses Map AR and Map 2D.
Regarding navigation in the VR space, it is an industry standard that at least two methods of movement should be provided to the user in a VR environment: (a) joystick walk and (b) point and teleport [72,73,74]. Using the Oculus controllers, virtual hands are visualized in the scene (Figure 5). The user can move as if walking by pushing the left-hand joystick in the desired direction. By clicking the right-hand joystick, the user can rotate their viewing direction by 30°. In addition, by pressing the right-hand trigger, the user can target a position on the floor and teleport there instantaneously (Figure 6). The reason for supporting both methods of movement is that some people tend to feel nauseous when using only the walking method. Finally, using the same controllers, the user interacts with the 2D interfaces of the application core.
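The three movement actions above reduce to simple transforms on the player's pose. The sketch below uses our own illustrative function names (not the XR SDK API) to show the underlying math: a fixed 30° snap turn, joystick walking relative to the viewing direction, and instant teleport to a targeted floor point:

```python
import math

def snap_turn(yaw_degrees: float, direction: int, step: float = 30.0) -> float:
    """Rotate the view by a fixed step (right: +1, left: -1), wrapped to [0, 360)."""
    return (yaw_degrees + direction * step) % 360.0

def joystick_step(pos, yaw_degrees, stick_x, stick_y, speed=1.0):
    """Move in the horizontal plane relative to where the user is facing.

    pos is (x, y, z); stick_x/stick_y are the joystick deflection.
    The stick input is rotated from head-local into world coordinates.
    """
    yaw = math.radians(yaw_degrees)
    dx = speed * (stick_x * math.cos(yaw) + stick_y * math.sin(yaw))
    dz = speed * (-stick_x * math.sin(yaw) + stick_y * math.cos(yaw))
    return (pos[0] + dx, pos[1], pos[2] + dz)

def teleport(pos, target_on_floor):
    """Point-and-teleport: jump instantly to the targeted (x, z) floor position,
    keeping the current eye height."""
    return (target_on_floor[0], pos[1], target_on_floor[1])
```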

4.4. Desktop App

For the desktop app, the implementation was very straightforward, since the entire model could be loaded through asset bundles from local storage, reducing loading times and network bandwidth. For interaction, an approach used in computer-based action games is followed: the mouse pointer is hooked to the camera in the 3D world, and its movement is directly mapped to the movement of the “virtual eyes” in the world. At the same time, the arrow keys or the WASD keys are used to move in the virtual space. Finally, user selection is performed using the space key.
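The mouse-look mapping described above amounts to accumulating mouse deltas into camera yaw and pitch, with pitch clamped so the camera cannot flip over. A minimal sketch, with an assumed sensitivity value and clamp limit:

```python
def mouse_look(yaw, pitch, dx, dy, sensitivity=0.1, pitch_limit=89.0):
    """Map a mouse movement (dx, dy in pixels) onto camera angles in degrees.

    Horizontal motion changes yaw (wrapped to [0, 360)); vertical motion
    changes pitch, clamped to avoid flipping the camera upside down.
    """
    yaw = (yaw + dx * sensitivity) % 360.0
    pitch = pitch - dy * sensitivity                    # mouse up -> look up
    pitch = max(-pitch_limit, min(pitch_limit, pitch))  # clamp near vertical
    return yaw, pitch
```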

5. Conclusions, Synopsis, and Lessons Learned

In this paper, we presented our efforts toward making the digitization of the Palace of Knossos available in multiple forms and through multiple interaction and presentation modalities. Taking the rendering capabilities of mobile devices as a baseline for our 3D model, we performed the appropriate optimizations, simplifying the mesh structure while maintaining high texture quality, to balance model size and required resources. Using this optimization, we were able to reduce the required processing power while delivering an excellent visual result. Furthermore, we split the digitization into asset bundles to keep the app small, at the cost of downloading assets in real time when required; caching was used to retain already downloaded assets. On the mobile device, we followed the approach of AR portals accessed within the heritage site to give the dimension of travel, while AR capabilities were used to overlay the digital space on the physical location where the app was accessed. In the VR variation, the mobile device was replaced with an Oculus Quest, and navigation followed the industry standard using the default controllers provided by the device. Finally, in desktop mode, everything was simplified by allowing the entire model to be loaded at once, and the standard first-person gaming approach was used for interaction.
Overall, in this work, we delivered a new approach toward reusing digital assets and technologies across platforms. Traditionally, each of the aforementioned implementations would require a different code base, different variations of the digital assets, and even a different development team, since the technologies were domain dependent. With the advance of modern game engines, the evolution of mobile devices, and a well-defined development methodology, we can simplify things by reusing digital assets and maintaining a single code base. Device-specific application libraries were used only for optimizing the interaction per device, and the adaptation code amounted to less than ten percent of the codebase. Furthermore, the digital model was the same across platforms, the only variation being its split into asset bundles to support mobile devices. It was also crucial that, using Unity, we were able to target both Android- and iOS-based devices, which cover the majority of the mobile market.
As a synopsis of our experience, we can safely conclude that advances in AR and VR technologies and the evolution of game engines have made possible the use of high-resolution outputs from 3D reconstruction technologies for the implementation of cross-platform experiences. Through these advancements, this paper provided an alternative way of digitally visiting a well-known heritage site that receives more than 800,000 visitors per year. Apart from the technological success of this endeavor, we are contributing to the sustainability of the heritage site, which is already under extreme pressure due to summer-time over-tourism.

Author Contributions

Conceptualization, N.P. and E.Z.; methodology, E.Z., N.P. and X.Z.; software, Z.P., E.Z., T.E. and A.K.; validation, E.Z., X.Z., N.P. and C.S.; formal analysis, T.E., A.K., E.Z. and N.P.; investigation, E.Z., N.P., X.Z. and C.S.; resources, T.E.; data curation, T.E. and A.K.; writing—original draft preparation, N.P., Z.P., E.Z., X.Z., A.K. and T.E.; writing—review and editing, N.P., Z.P., E.Z., X.Z., A.K. and T.E.; visualization, T.E., A.K., Z.P. and E.Z.; supervision, E.Z., X.Z., N.P. and C.S.; project administration, E.Z.; funding acquisition, N.P. and C.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been conducted in the context of the “Knossos Digital Tour Guide” of the Ephorate of Antiquities of Heraklion of the Hellenic Ministry of Culture and Sports funded by the NSRF 2014–2020—RIS 3 Crete program.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are available in the Zenodo open-access repositories, and the datasets are cited in the main body of the document. Access to the data is available upon request and approval by the Ephorate of Antiquities of Heraklion and the Greek Ministry of Culture.

Acknowledgments

In this work, we have collaborated with the Ephorate of Antiquities of Heraklion of the Hellenic Ministry of Culture and Sports in the context of the implementation of the project “Knossos Digital Tour Guide” that was funded by the NSRF 2014–2020—RIS 3 Crete program. The authors would like to thank the Ephorate of Antiquities of Heraklion for their valuable collaboration and support.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. The contents of a prefab.
Figure 2. Screenshots of the available menu pages of the application. (a) Introduction, (b) Map2D, (c) Information panel.
Figure 3. Knossos AR app: (a) screenshots of the AR virtual tour; (b) the “Throne Room” (upper right) and the “Temple Tomb” (lower right).
Figure 4. (a) Screenshots of the AR virtual tour and (b) a diagram of the application’s available scenes and transitions.
Figure 5. Mapping VR controllers to virtual hands in the 3D space.
Figure 6. Teleporting between scenes in VR.
Table 1. Details of mesh-level optimization per site.

| Point of Interest | Number of Scans | Total Faces (Merged Model) | Total Faces (Simplified Model) | Reduction |
|---|---|---|---|---|
| Queen’s room | 14 | ~4,550,000 | ~326,500 | 92.82% |
| The Hall of the Double Axes and the Queen’s Megaron | 18 | ~5,850,000 | ~305,000 | 94.79% |
| Throne room | 52 | ~16,900,000 | ~323,200 | 98.09% |
| North Lustral Basin | 11 | ~3,575,000 | ~392,000 | 89.03% |
| Caravan Serai | 13 | ~4,225,000 and ~5,000,000 from aerial scanning | ~424,000 | 95.40% |
| Little Palace at Knossos | 25 | ~8,125,000 and ~5,000,000 from aerial scanning | ~1,472,500 | 88.78% |
| South House | 30 | ~9,750,000 | ~326,000 | 96.66% |
| South Propylaeum | 12 | ~3,900,000 | ~328,260 | 91.58% |
| North Entrance, North Pillar Hall | 13 | ~4,225,000 | ~645,400 | 84.72% |
| Royal Villa at Knossos | 24 | ~7,800,000 and ~965,000 from aerial scanning | ~565,000 | 93.55% |
| Temple Tomb | 22 | ~7,150,000 and ~5,000,000 from aerial scanning | ~326,650 | 97.31% |
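As a sanity check, the Reduction column of Table 1 follows directly from the reported face counts: it is the percentage of faces removed when the merged scan model is simplified. A minimal Python sketch (using the approximate counts from the table, and assuming that where aerial scans exist their faces are added to the merged total, which the reported percentages bear out):

```python
def reduction(merged_faces: int, simplified_faces: int) -> float:
    """Percentage of faces removed by mesh simplification, rounded to 2 decimals."""
    return round(100.0 * (1.0 - simplified_faces / merged_faces), 2)

# Terrestrial-only site: the Queen's room (14 scans).
print(reduction(4_550_000, 326_500))                # 92.82

# Site with aerial scanning: the Caravan Serai
# (~4,225,000 terrestrial + ~5,000,000 aerial faces merged).
print(reduction(4_225_000 + 5_000_000, 424_000))    # 95.4
```

Both values match the Reduction column of Table 1.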
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Pervolarakis, Z.; Zidianakis, E.; Katzourakis, A.; Evdaimon, T.; Partarakis, N.; Zabulis, X.; Stephanidis, C. Visiting Heritage Sites in AR and VR. Heritage 2023, 6, 2489-2502. https://doi.org/10.3390/heritage6030131
