Article

Designing a Virtual Arboretum as an Immersive, Multimodal, Interactive, Data Visualization Virtual Field Trip

Games and Interactive Media, Nicholson School of Communication and Media, University of Central Florida, 4000 Central Florida Blvd, Orlando, FL 32816-8005, USA
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2021, 5(4), 18; https://doi.org/10.3390/mti5040018
Submission received: 3 February 2021 / Revised: 30 March 2021 / Accepted: 6 April 2021 / Published: 9 April 2021

Abstract

This paper describes a virtual field trip application as a new type of immersive, multimodal, interactive, data visualization of a virtual arboretum. Deployed in a game engine, it is a large, open-world simulation, representing 100 hectares and ideal for use when free choice in navigation and high fidelity are required. Although the computer graphics are photorealistic, it is distinct from other applications that use game art or 2D 360-degree video because it reflects high information fidelity, the result of domain expert review and the integration of geographic information system (GIS) data with drone images. Combined in-game as a data visualization, it is ideal for generating past or future worlds, in addition to representations of the present. Fusing information from many data sources—terrain data, waterbody data, plant inventory, population density data, accurate plant models, bioacoustics, and drone images—its design process and methods could be repeated and used in a wide range of augmented reality (AR) and virtual reality (VR) applications and devices. Results on presence, embodiment, emotions, engagement, and learning are summarized from prior pilot studies for context on use, and are relevant to schools, museums, arboretums, and botanical gardens interested in developing immersive informal learning applications.

1. Introduction

This paper presents a case study describing the construction methods used to design and develop immersive, multimodal, interactive models of virtual nature. Using 3D data visualizations of scientific data related to plants, ecologies, terrain, and waterbodies to create large geospatial, immersive, photorealistic models at a true scale, these spatial virtual landscapes can accurately reflect real-world landscapes. As such, these virtual nature models may be displayed in augmented reality (AR) and virtual reality (VR) devices as immersive game level environments. However, unlike traditional game level environments, these visualize reality to be experienced perceptually, emotionally, and cognitively, much like an AR or VR Holodeck: walk through the virtual forest, look up to see the canopy, bend down to inspect the small details in a flower, explore wherever you want, go off trail, select a plant to open a virtual field guide, learn about the natural world. The main importance and interest in this work lie in the unique design and novel processes combined to fuse information from multiple sources to create a high information fidelity environment as a photorealistic immersive experience of nature. The human experience is perceptual, experiential, embodied, and cognitive, allowing the non-expert to “see” and experience a reality that only a domain expert could imagine, one based on deep knowledge. The non-expert’s Gestalt moment of insight is to see the world through the eyes of an expert and then develop a new intuitive understanding of knowledge about the natural world. This paper focuses on the larger of the two virtual nature applications, The Virtual UCF Arboretum (Figure 1) (see Supplementary Materials), documenting its design, development, and construction process in enough detail that researchers and practitioners may replicate the work, and it summarizes previously published results for a related project, The AR Perpetual Garden (Figure 2) (see Supplementary Materials), to provide insight into its use as an immersive informal learning application. Importantly, an argument is presented in support of factual accuracy in both content and presentation for applications used for education and learning: not only should the facts, data, and educational story be accurate, but the visualization, presentation, and transmission must also be correct. Also developed is a definition of information fidelity and how it interacts with the curriculum, photorealism, and immersion to extend and define the critical design elements of concern for immersive informal learning applications, especially when modeling the natural world. The results might be helpful to developers and practitioners alike who wish to replicate the design and develop their own virtual nature immersive learning applications with AR and VR technologies.

2. Background and Overview

2.1. Description of the Virtual Nature Applications

Two virtual nature applications have been developed using the same processes. The Virtual UCF Arboretum and The AR Perpetual Garden Apps were developed in support of the long-term research objectives to better understand the impacts of immersive informal learning applications on human cognitive, emotional, behavioral, and attitudinal outcomes, especially in virtual environments that support the free exploration of photorealistic physical space and free inquiry in an invisible web of knowledge [1]. Both applications were designed and developed in the same way, but for different output devices (AR or VR), based on the project requirements of the stakeholders. Launched in 2018, both are available to the public as either a free desktop game version on Itch.io or on the Apple iTunes and Google Android Play App stores (https://the-harrington-lab.itch.io/ accessed on 15 March 2021).
The design patterns used to construct the larger application, The Virtual UCF Arboretum, are detailed in this paper. To provide context for the learning effects of such applications developed as virtual nature simulations, summary results from a previously published report in a 2019 study of The AR Perpetual Garden Apps are presented. Preliminary pilot data from the ongoing arboretum study appear to confirm the results. Both applications support research investigations of immersive informal learning applications in the context of virtual nature as content. Interactivity supports the user/learner’s goals and allows them to satisfy their personal curiosity in the moment, making the combined design features of these applications ideal for simulations of real field trips or useful to prime, transfer, and reinforce an experiential learning activity [2]. Immersive virtual field trips, especially as educational applications for geoscience, forestry, ecology, and biology education, could prove to be valuable to natural history museums and classrooms by bringing the natural world into these traditional learning environments. With a research focus on ecology, forestry, and biology education, this work used the virtual nature model to accurately reflect the visual details and complexity of the educational content, the real plants, and the context of the real environment to communicate the salient attributes of the botany of plants and the context of the ecology needed for the application’s learning outcomes to be achieved. The aim of this paper is to report on the design process used to construct these virtual nature models to achieve realism in the application.

2.2. Virtual Nature: The Virtual UCF Arboretum Construction Process

The Virtual UCF Arboretum is a simulation of a field trip. It is an investigation of the design, development, and evaluation of immersive, multimodal, interactive, and informal learning applications. Created to work as a simulation of an educational field trip, it is unique in that it visualizes the geographical information system (GIS) data of native plants, flowers, birds, insects, amphibians, and fish over a large, 100-hectare (247-acre) open world of central Florida terrain in the Unreal Game Engine. It is a combination of three main system components: a Virtual Nature Model annotated with facts and concepts using a Virtual Plant Field Guide and a Virtual Plant Atlas. The application was completed in October 2018, expanded to VR HTC Vive headsets in November 2018, and connected to a treadmill in 2019. The Virtual UCF Arboretum project (2016–present) is a collaborative and interdisciplinary research project between digital media, learning sciences, biology, ecology, and urban landscape planning resource departments at a major university. The research objectives are as follows: (1) to use it as an AR and VR technical test bed to study college-level biology field-work; (2) to study the informal learning impacts of design features from an evolutionary biology perspective—e.g., why do humans explore the unknown and why are they motivated to learn independently; and (3) to study the health and wellbeing impacts of nature on the mind and body—e.g., why does nature feel good?

2.3. Virtual Nature: The AR Perpetual Garden App Learning Impacts

The AR Perpetual Garden Apps (2015–present) work on both Apple and Android AR-enabled devices. Much like the botanical prints or natural history museum dioramas and exhibits of the past, the quality and trustworthiness of the artifact depend on the visual expressions of factual information required for effective and accurate scientific communication embedded in the media [3]. This AR app was developed in a large interdisciplinary collaboration for use inside and outside the Carnegie Museum of Natural History to extend the learning impact of the real museum dioramas and gardens as immersive virtual exhibits for the public. This AR implementation was unique in May of 2018, because it was one of the first to use the vision tracking features of the ARCore and ARKit software development kits (SDKs) to create data visualized as immersive, multimodal, and interactive AR virtual environments (3 × 3 m square) to annotate real-world environments [4]. This design went beyond the past uses of AR as simple text, images, or websites as virtual tool-tip annotations on the real world to present a complete and accurate AR geospatial environment (AR Holodecks) to show alternative realities in the context of current reality. The immersive multimodal data environments of the two scenarios reflect scientific data sets and expert knowledge of plants, birds, and insects to present two contrasting scenarios—Woodland in Balance and Woodland out of Balance—for experiential learning. It too is a combination of three main system components—a Virtual Nature Model annotated with facts and concepts using a Virtual Plant Field Guide containing a Virtual Plant Atlas.
Summary data from a 2019 pilot study conducted in a museum of natural history with volunteer participants from the general public (n = 56) are presented to highlight the effective design factors. Subjects were observed as “running through” a field of flowers and “bending down to closely inspect a bloom” in a virtual springtime woodland landscape projected over an empty museum gallery hall floor (Figure 2), indicating high levels of presence, embodiment, and engagement [5]. The data visualization of the immersive virtual nature environment was significantly correlated with the interactive audio, “Story”. Both of the design features, “Story” and “Facts”, were significantly correlated to actual learning gains (40%) on test scores [6]. Such research findings on design factors and combined patterns inform future immersive informal learning application design, especially when factual content interacts with the photorealistic, multimodal, and interactive capabilities of game engines and immersive hardware devices.

3. Motivation and Contribution

The original motivation was driven by the direct observation of how highly motivated and engaged students were in learning activities on real field trips, combined with the emergence of off-the-shelf game engines used with immersive hardware to create simulations of large open worlds, suggesting that a similar virtual learning experience could be constructed. Parents often lack the knowledge to offer factually correct answers to questions on field trips or visits to museums, leaving their children with misrepresentations of reality (e.g., answering “oh, that is a lizard”, when in reality it is a salamander). Part of the problem with a socially constructed reality is that the “authority” is often not a domain expert and is often wrong. In the 2019 study, as mothers rested on the couch, their children were observed to explore independently, challenging the extensive literature on informal learning in museums driven by social collaborative interaction [7]. Part of the challenge was to design an application that could work for a child exploring on their own. When there is collaboration and socially constructed learning, this design satisfies curiosity for both the parent and the child by opening a semantically linked authoritative factual resource. This educational story is specific to each virtual object and context and important for learning correct representations of the world, because it is authored, vetted, and approved for use by the domain experts critical for scientific communication.
The Virtual UCF Arboretum allows the user to take a leisurely virtual hike for as long as they wish and in any direction desired, even off-trail in this wild Central Florida environment. Much like a real hike, they experience nature in an egocentric view as a realistic visual and audio landscape based on biological research field data. It is not your typical fantasy game level, nor an immersive 2D 360 video that restricts navigation to the trail, but a new type of artificial knowledge artifact representing the plants and animals found in the real environment. It reflects the correct species inventories, population densities, and botanically correct 3D plant models as sensory patterns of data, perceived in context as ecological information. Students react to the “sharpness” of a Saw Palmetto (Serenoa repens) and are hesitant to enter the thicket. A research scientist from New York in the winter experiences the “warmth” of the Florida sun and smiles. No one enters the pond, cautious of a potential alligator encounter. An American bald eagle (Haliaeetus leucocephalus) cries; a student looks up and wonders why it does not sound as expected, like a hawk. Bioacoustics are accurate to the insects and birds of each location and to the season, based on research data of species counts, inventory, and population densities, offering an enhanced multimodal sensory experience en route. The vision-impaired may independently explore and experience the sounds of nature in a spatial environment in ways that are not possible in the real world. It is possible to virtually walk off trail through the plants and hear the crunching sound of plants and earth, as well as birds, frogs, and insects as they move through the virtual world. When demonstrated to the public at a science center, those with physical disabilities and in wheelchairs found the work to be especially exciting, engaging, and enjoyable.
The high information fidelity of these virtual nature applications is unique because they are not built as traditional game level environmental art using fantasy content, nor are they “immersive” 2D 360 video VR, because they allow the full exploration of a large open world driven by the user of the system and not pre-determined by the designer. As a three-dimensional, real-time interactive data visualization of GIS data, this artifact becomes a virtual model of the plants, and when those plants are combined correctly on a terrain, a highly realistic and accurate virtual landscape is produced. It is also like walking through the glass of, and into, a diorama or exhibit found in natural history museums. Such applications may be used to reinforce, not to replace, real field trips or real field-work [8].
The virtual nature environment is fully interactive, consistent with Slater’s theories on presence [9], supporting design affordances unique to VR immersive applications that produce place and plausibility illusion because the content of these virtual nature applications accurately represents the data that produce the ambient environmental signals with high information fidelity. The user experiences are thus directly related to the perception of the transmitted signals, in total, or to the summation of the quality of the factual content based on data and expressed fully with minimal information loss as photorealistic and ambient audio digital signals, bounded by the immersive capacities of the hardware output devices.
This is very different from Bowman’s definition of “graphical fidelity” and “immersive applications” [10]. The design factors of the output devices alone—an iPad, HoloLens, or HTC Vive—will transmit the information fidelity of the content constrained by the resolution and field of view of the device, whether it is a cartoon, a high-definition film, or a stylized or photorealistic fantasy game. The information fidelity is higher in the film than the cartoon and is low in any fantasy game, independent of style, photorealism, or the “immersive” level of the device. Photorealism, or graphical fidelity, is a byproduct of the high information fidelity and the software rendering capabilities and is enhanced by the immersive devices’ hardware properties for the transmission of signals to the human to perceive. The application’s objective is to support the human in the free-choice exploration of that immersive environment for engaged and intrinsic learning, very different from Dede’s use of the word “immersive” [11] in the education literature that places the learner into a structured scenario, framing perception and choices, and reducing free choice.
High information fidelity, from an information science perspective, is defined in terms of the accuracy, completeness, and assurance of all of the signals transmitted from the environment and perceived by the human through their sense organs. It is what makes this photorealistic, multimodal, immersive, and interactive virtual arboretum unique, different, and powerful. Past research with virtual nature found significant interaction effects between the independent variables of visual fidelity and navigational freedom on knowledge gained, with the highest actual learning gains in the combined conditions (37%) [1]. Free choice in navigation and wayfinding, not bounded to a route created at the point of recording or restricted to a game designer’s pre-determined path, appears to also enhance presence and embodiment, as shown in other mixed-reality applications [12,13]. Presence appears to be enhanced by the information fidelity, which, when combined with freedom in exploration, produces realistic perceptions and interactions, especially when displayed in AR and VR devices that transmit signals at high resolutions, wide fields of view, and true scale.
At individual points of personal interest, whatever an individual perceives as salient—a bright red flower, a plant with delicious fruit, a butterfly to follow—the user may select the object for more information with a simple click below the activated cross-hair indicator to satisfy a spontaneous emotional reaction to the beauty of the virtual natural world, satisfying an innate human need to know and understand the world. That salient event combines the user’s response to the signals in the virtual environment with emotions, perceptions, memory, and an instinctual decision to take action (e.g., clicking on the object), which is supported in the design with an interactive virtual field guide (Figure 3) and educational story used to satisfy curiosity in the moment. The VR technical design component is similar to a heads-up display (HUD) user interface (UI) common in many games or other 3D UI VR applications. Many AR applications use the UI annotation as an overlay on top of real-world objects as a type of virtual tool-tip. Such design factors represent interaction functionality that is semantically context-sensitive. This definition of interaction is more closely related to Bowman’s work on immersive 3D UIs than Slater’s. Such semantic interactions enhance the presence experienced by increasing engagement through active learning and enhanced perception driven by knowledge gained about the world. To learn to see the world is to learn the meaning of what is seen.
The PC version can be used on a desktop, in an immersive VR headset environment, or in a configuration connected to a treadmill. As the virtual nature model was developed using the Unreal Engine, it was easily extended to work with a VR headset (e.g., HTC Vive) and hand controllers to study emotional reactions needed to support learning outcomes (Figure 4a). VR headset navigation is more constrained than the desktop or treadmill versions and is supported with teleportation to avoid cybersickness. The treadmill version is expected to enhance presence with the added benefits of kinesthetic sensory feedback loops to increase embodiment, engagement, and emotional reactions and to facilitate future spatial cognition research broadly and learning research specifically (Figure 4b). An immersive AR version is being developed for the arboretum with optimized assets for a future study.

4. Related Work

4.1. Game Engines, Cultural Heritage, Field Trips, and Visualizations of Prior Work

Game engines have been used for non-entertainment purposes in the past (e.g., Serious Games movement, Edutainment, Games for Change). Game engines have been used to construct interactive interpretive narratives for immersive historical experiences—“Walden, a game”—as a new form of digital literature [14]. Others have explored application design using mobile AR technologies for learning in the context of museums and cultural institutions aimed at communicating culturally or historically significant knowledge to the public and/or for research [15]. Many of these projects used game engines as a general-purpose real-time, interactive, multimodal, graphics-rendering application and combined them with a wide range of data capture and visualization display methods to create virtual models that are representative of real artifacts, collections, and environments [16]. The Smithsonian Natural History Museum recently created “The Skin and Bones App”, an augmented reality application with exceptionally high information fidelity and high context-sensitivity [17], as a means to visualize the invisible and engage the public with 3D AR annotations extended with multimedia content to support informal learning activities in the context of the real exhibits [18]. The Carnegie Museum of Natural History extended the use of immersive AR with The AR Perpetual Garden Apps by transforming an entire empty gallery hall into a virtual woodland environment [4]; this design went beyond traditional AR annotations of real artifacts to produce an immersive environment as a complete and believable alternate reality. The immersive illusion as a visualization of the domain experts’ knowledge and imagination produced a complete virtual environment that transformed the real space into a believable environment of nature, even when displayed on iPads, supporting the claim that presence may be achieved even when using a small handheld device.
National parks, botanical gardens, and arboretums offer real and virtual field trips as informal learning experiences to engage with and teach the public about these living cultural treasures, often in combination with 2D web resources as non-immersive augmented reality applications [19]. Formal educational research studies on environmental science, biology, and chemistry have also integrated digital media, simulations, and game technology for study [11]. Although many 2D virtual field trips exist, very few immersive virtual field trips have been created, despite VR having been recognized as an ideal tool for environmental education [20]. Recently, research for education has turned to understanding how simulations of virtual field-work impact learning experiences and behavior in the classroom for students and teachers [21]. These real-world learning environments are highly complex digital artifacts, making the internal validity in the combined research design and digital media artifact difficult to isolate. Most prior studies have focused on the social interactions as important factors in learning outcomes, but largely ignored the design factors in the applications themselves as potential triggers for exploration, inquiry, and learning. This research is important for the child who is alone when using the application. Thus, there is a large gap in what is known about how the combined hardware technology, software design, media content, and application usability interact. How does the design, alone, impact human perception, emotion, cognition, behavior, and attitudes about the meaning presented and transmitted to the individual learner in situ?
In order for these virtual nature applications to be effective virtual field trips, the visualization must represent data, facts, and educational concepts accurately. Foresters, ecologists, and biologists have such data, but it is presented as non-immersive pseudo-colored 2D maps [22]. These are great in terms of accuracy, but not for triggering the emotional reactions—such as awe, wonder, and curiosity—recognized as important for engagement and especially for motivating children [2]. The contribution of this work was to visualize the plant GIS data as 3D data visualizations of nature rendered in photorealistic detail, which was made possible by the game engine, then to display it in immersive hardware devices to achieve both actual learning and emotional engagement as combined outcomes of the design while offering free exploration.

4.2. Information Fidelity (Facts) Important for Knowledge Acquisition (Learning)

Although the difference in “immersion” between one device and another among today’s commercial VR headsets and AR devices has been studied, little attention has been given, outside of traditional abstract data visualizations or scientific visualizations, to the huge difference in the quality of the content: the multimodal signals of 3D digital information transmitted by the applications, their information fidelity, and their relationship to ground-truth data as a new type of knowledge visualization. The accuracy, completeness, and assurance of data are of high importance in the use of geographical information systems. Even when displayed in immersive hardware, they are still abstract representations of that data (e.g., dots presented as a stereoscopic, interactive 3D map). Such abstract visualizations have shown advantages for user exploration, hypothesis generation, details on demand, and learning, often producing significant insight [23] (pp. 552–569). Such advantages in the human–computer interface design produced by traditional data visualization systems could be transferred to immersive learning applications to support the learner’s objectives through open, free-choice exploration and inquiry.
The content transmitted through the graphical rendering software and the immersive properties in the hardware devices offers a range of information fidelity from low to high levels, introducing complexity when the intent of the design is to support learning. Such a range in content quality, informational and graphical fidelity, and immersive display devices strains the definition of “immersive learning environments” but offers clues regarding the design factors of importance when both learning and emotional engagement are desired. Content factual accuracy, or alignment with an educational curriculum, is not important for “immersive entertainment applications” where the content may be artistic fantasy (e.g., Disney Imagineering), but it is critical when learning objectives are required directly from the application design.
An example of low information fidelity transmitted by a high-immersion hardware installation offered to the public for informal learning was the “Connected Worlds” exhibit at the New York Hall of Science. It showed high levels of presence, embodiment, and engagement while supporting social and collaborative interactions, but no actual learning outcomes were demonstrated. The fantasy 2D cartoon content experienced in an immersive mixed-reality environment appeared to result in high levels of engagement and embodiment, but only self-reported perceived learning was captured in surveys; factual knowledge gained was not assessed [24]. Although it was fun, and the children interviewed thought they had learned, the actual learning gains were not measured. In contrast to this cartoon-immersive application was “Inside Tumucumaque”, a large and beautiful natural history-inspired artistic, immersive experience that also appeared to emotionally engage and offer embodiment in first-person perspectives of animals in a Brazilian rainforest [25]. Once again, however, actual learning gains were not measured.
In other examples of research, some have shown actual learning outcomes when the content is factual. These applications used immersive AR devices to display 2D photorealistic content and annotate the real-world environments in context with teachers and guides. The causal factors for the learning results could be attributed to the socially mediated experiences and not the design of the application itself [11,19]. Experienced naturalists, teachers, or guides can produce high levels of engagement and excitement with a good story, independent of the digital media artifact used in that context. Interestingly, when other applications offer a high factual content, transmitted as photorealistic 2D-360 video and displayed in immersive VR headsets, the learning results are mixed [26], but these designs restricted free exploration, blocking the pursuit of the salient, blunting curiosity, and thus possibly limiting the learning effects.
A few application designs have shown positive effects on actual learning outcomes or measurements important for learning outcomes [5,6,12,13,15,18,27]. The design factors in common to these immersive informal learning applications consist of high information fidelity, represented as a combination of high factual content accuracy; high photorealism; immersive display devices; multimodal virtual environments that are open to full body exploration and embodiment; high context-sensitivity of virtual-real information annotations; interaction to support inquiry and multimedia to support multimodal learning activity; high user engagement with emotional responses; and a high user-experience evaluation of the tool, shown to match the learner-users’ goals, intentions, tasks, and desired activities.

5. Materials and Methods

5.1. Data and Software Applications Used to Develop the Game Level Environment

Many non-entertainment applications have been designed and developed with game engines. The reasons for this are many, including that they are free and offer ease of use in the construction of large open-worlds and environments, and that they offer an optimized output pipeline process to produce AR and VR applications. The current versions of the Unity and Unreal game engines offer advanced photorealistic computer graphics rendering and real-time, interactive, multimodal, and user interface features. Both game engines support application programming interfaces (APIs) to connect external data sources to control in-game dynamics, such as lighting and wind effects, to enhance the game environments as simulations. They both support a wide array of user interface tools to generate overlays and visualizations for attention direction and control and user interaction. Recently, both game engines have started to offer streamlined development pipeline processes to make immersive AR and VR easier than in the past and to support a wider array of manufacturers.
These tools support automatic terrain construction using height maps, making them ideal for visualizing terrain and GIS data. There are many ways to import real terrain data to automatically generate a realistic environment as a game level. Game engines also support location-based sound and level-of-detail (LOD) optimization for multi-sensory effects in high fidelity over large open worlds. The Unreal game engine was selected over Unity because of the superior photorealistic graphics quality required for The Virtual UCF Arboretum research portfolio, whereas Unity was selected for the AR apps due to the tighter and easier development pipeline with AR devices.
Terrain map data may be used to generate height maps to automatically generate virtual terrains for a game level environment. Environments without vegetation are simple to construct, such as a VR Mars, VR Moon, or VR Earth deserts or mountains. To create a lush and accurate forest, data for the plants are required. ESRI is the de facto standard database used in forestry and ecology. Such data are typically displayed as abstract dots on a 2D GIS map. Although accurate, such abstract data visualizations are not emotionally engaging, immersive, or experiential—all outcomes shown to be desirable for learning outcomes, such as those that simulate virtual field trips.
For this project to achieve the emotional and learning design objectives, the ESRI data had to be converted into virtual plants at a true scale and displayed in patterns representative of real ecological regions in the virtual terrain. The virtual arboretum project sourced real plant inventory and population distribution data for each of the 10 natural communities of interest in collaboration with the UCF Arboretum. Once the data were converted into information the game engine could process, the rendering and LOD tools automatically displayed the virtual plants on top of the virtual terrain. The pattern of all of the plants viewed together, while not representing unique individuals, matched the real plant population inventory and population densities as statistical extrapolations of the real natural environment. Thus, the virtual nature model presents an accurate pattern of foliage to the viewer. In order to increase the information fidelity, and when the line of sight was clear, drone images were used for the precise location of individual plant models, such as the waterlilies placed in exact locations on the pond.
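As a minimal sketch of this conversion step, the following Python snippet shows one way the plot-study percentages could be organized before being mapped to the game engine's foliage settings; the file name and column headings are hypothetical, since the real project stored these data in the arboretum's ESRI database.

```python
import csv
from collections import defaultdict

def load_coverage_table(path):
    """Load per-community percent cover exported from the GIS database.

    Expected (hypothetical) columns: plant, community, coverage_pct, where
    coverage_pct is the percent of a 10 x 10 m plot covered by that species
    in that natural community.
    """
    coverage = defaultdict(dict)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            coverage[row["community"]][row["plant"]] = float(row["coverage_pct"])
    return coverage

if __name__ == "__main__":
    table = load_coverage_table("plant_plot_coverage.csv")
    # e.g., list the mesic flatwoods community from most to least dominant species
    for plant, pct in sorted(table.get("Mesic Flatwoods", {}).items(), key=lambda kv: -kv[1]):
        print(f"{plant}: {pct:.1f}% cover")
```

Keeping the coverage data in a simple, per-community structure like this makes the later steps (foliage density settings and alpha masks, Sections 5.4.9 and 5.4.11) straightforward table lookups.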

5.2. Sourced Content: Facts, Educational Story, Photographs, and 3D Plant Models

The Virtual Field Guide, consisting of 35 unique plant species native to Florida for the arboretum and 35 unique plant species native to Pennsylvania for the Carnegie, served three functions. First, the plant facts, photographs, and educational stories were collaboratively sourced using the WordPress platform. The processes required the biologists to source, review, and vet all the data that supported the digital media team’s access to data on plant facts—such as plant height and spread, in addition to membership in natural communities—required to create accurate 3D plant models. The 3D plant models were shared by the digital media team on SketchFab.com, creating the project’s 3D Plant Atlas used to support the domain expert review, vet, and approval process.
Second, the Virtual Field Guide built in WordPress.com functions as a stand-alone educational website available to anyone with a web browser, offering broader impact and access. Aligned to national educational standards, it was published on PBS LearningMedia in 2019 and offers browsing and searching by common name, plant meta-data, plant Latin name, photographs, educational stories, and gardening facts. The 3D Plant Atlas component allows teachers nationwide to use any web browser-enabled device, or low-cost VR Google Cardboard and AR-enabled smartphones, to bring educational material about plants into their classrooms for free for an immersive botany lesson. Such an experience could be beneficial, especially for urban youth who are underrepresented in STEM fields without access to forests [28] to motivate interest in the science of the natural world and related careers [29].
The third function was to use the entire published website as the Virtual Field Guide in-game for learning in the moment of curiosity. Experienced in-game as a type of heads-up display (HUD), it allows a user to click on a plant to open a fact card about that plant, as well as to keep a journal of all plants found while engaged in the virtual field trip and access a 2D map of the entire arboretum as a navigation and wayfinding aid. The fact that the educational content is not hardcoded or embedded inside the game is an important design innovation. This design represents a web of knowledge connected to objects inside the game level that is programmatically independent and thus easy to maintain and extend.
The digital media team relied on data to construct accurate and realistic plant models and landscapes, representing high information fidelity. SketchFab.com was used to publish the 3D models, first for review in collaboration with the scientists and then online as a standalone atlas of the plant models available to the public to use as AR and VR plant models for educational activities. Although photogrammetry and laser-scanned point clouds (e.g., Megascans) will improve the future efficiency of capture and provide higher resolution, better detail, and higher complexity with higher polygon counts for the 3D models, the reviews with botanists are expected to be continued to ensure the accurate identification and level of detail required to accurately represent the model’s salient features. Each plant has been through an iterative, co-creation, and review process with a botanist or biologist and artists in the digital media team to vet and approve it prior to release. The scientists worked with digital media artists to review, correct errors in the 3D models, and agree on the level of information fidelity required prior to release and use. All of the AR and VR photorealistic 3D plant models are botanically correct.
The ambient bioacoustics were gathered and reviewed in a similar manner and are accurate representations of insects, amphibians, and birds found in the environment. Limited artificial life takes the form of animated avatars. The American bald eagle makes a sound more like a seagull than a hawk, but the hawk cry is used in many nature films because it is “prettier”. The cry in the virtual arboretum sounds as it should. These components, when combined, make a high information fidelity model of the real natural environment possible and open many lines for future research and development.

5.3. The Design and Construction Process

An interdisciplinary co-design process was used to integrate the requirements of scientists and artists to carefully create these artificial knowledge artifacts for use in the communication of materials that not only engage but also teach. Using a modified user-centered design (UCD) approach [30] to integrate the Expert-Learner-User eXperience (ELUX) design process [31], holistic requirements emerged, were captured, and were used for the application design specification (Figure 5).
This iterative co-design process required multiple feedback loops:
  • Scientific Review: iterative source–review–correct–validate process required to authenticate all data, information, facts, and concepts for approval and release by domain experts.
  • Aesthetic Review: iterative creative process to construct 3D models reviewed relative to the scientific facts and the technical constraints of the AR and VR platforms.
  • Aesthetic-Scientific Combined Review: iterative artistic-scientific process of all digital media reviewed for information accuracy relative to the expression of those facts in the models and visualizations with collaborative correction of errors, then a joint decision and approval for release.
  • User and Stakeholder Usability Tests: traditional iterative software application design and development process with attention to user experience, technical constraints, and digital distribution standards.
  • Learner Experience and Actual Learning Outcomes: UCF’s Institutional Review Board (IRB) approved learning impact studies to evaluate actual learning outcomes and evaluations.

5.4. Details of the ELUX Production Process

5.4.1. Data Gathering

The project started with a meeting with the arboretum biologists to gather data and maps for the plants and landscape. There are thousands of plants in the real arboretum, but only the most important ones needed to satisfy the educational objectives were selected for modeling. For each plant, an authoritative description was created, including the common name, Latin name, soil, water, sunlight requirements, other gardening facts, and membership in each natural community. Also created were educational stories that related to the field trip outreach programs, highlighting fun facts about each plant. Photographs, maps, data, plant inventories by species, and plant population densities were also collected. Then, the undergraduate research assistants in the digital media program, guided by the biologists, went into the field to locate, photograph, and measure each plant of interest.
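To illustrate the shape of the record gathered for each plant, the sketch below defines a hypothetical Python data structure; the field names follow the description above but are not the project's actual schema, and the example entry only fills in values stated elsewhere in this paper.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PlantRecord:
    """One authoritative plant entry; field names are illustrative, not the real schema."""
    common_name: str
    latin_name: str
    soil: str = ""
    water: str = ""
    sunlight: str = ""
    gardening_facts: str = ""
    educational_story: str = ""
    natural_communities: List[str] = field(default_factory=list)
    height_m: float = 0.0
    spread_m: float = 0.0

# Example entry using a species from the list in Section 5.4.2; the community
# assignment is taken from the mesic flatwoods data discussed in Section 6.
saw_palmetto = PlantRecord(
    common_name="Saw Palmetto",
    latin_name="Serenoa repens",
    natural_communities=["Mesic Flatwoods"],
)
```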

5.4.2. The Natural Communities and Plant Content

The ten natural communities in the selected area of the real arboretum are Basin Swamp, Baygall, Depression Marsh, Dome Swamp, Mesic Flatwoods, Scrub, Scrubby Flatwoods, Waterbody, Wet Flatwoods, and Xeric Hammock.
The plants in those areas that were selected were: American Beauty Berry (Callicarpa americana), Blazing Star (Liatris tenuifolia), Bog Button (Lachnocaulon anceps), Bracken Fern (Pteridium aquilinum var. pseudocaudatum), Scarlet Calamint (Calamintha coccinea), Chalky Bluestem (Andropogon virginicus var. glaucus), Coastalplain Honeycombhead (Balduina angustifolia), Coontie (Zamia integrifolia), Dahoon Holly (Ilex cassine), Feay’s Palafox (Palafoxia feayi), Fetterbush (Lyonia lucida), Florida Paintbrush (Carphephorus corymbosus), Florida Tickseed (Coreopsis leavenworthii), Forked Bluecurls (Trichostema dichotomum), Fourpetal St. John’s Wort (Hypericum tetrapetalum), Goldenrod (Solidago stricta), Hooded Pitcher Plant (Sarracenia minor), Longleaf Pine (Pinus palustris), Lopsided Indiangrass (Sorghastrum secundum), Pond Cypress (Taxodium ascendens), Prickly Pear Cactus (Opuntia humifusa), Rayless Sunflower (Helianthus radula), Reindeer Moss (Cladonia spp.), Sabal Palm (Sabal palmetto), Sand Live Oak (Quercus geminata), Saw Palmetto (Serenoa repens), Scrub Rosemary (Ceratiola ericoides), Shortleaf Rosegentian (Sabatia brevifolia), Slender Flattop Goldenrod (Euthamia caroliniana), Spanish Moss (Tillandsia usneoides), Tall Jointweed (Polygonella gracilis), Titusville Balm (Dicerandra thinicola), Waterlily (Nymphaea odorata), Wax Myrtle (Myrica cerifera), and Wiregrass (Aristida stricta).

5.4.3. Field Research to Gather Data and Photographs

For each plant selected to be modeled, a portrait photograph of the plant in its location was taken first and used as a reference. Then, a cardboard screen was placed around the plant to surround both sides and the back to isolate it from the visual clutter of the background environment. This screen had a measurement grid inscribed on it to indicate height and width that, when photographed, provided a future visual reference for use in the construction of the 3D polygon structures. Each petal, leaf, and part of the plant was then placed flat on a blue-screen cloth fabric to photograph each separate plant component (e.g., leaf, petal, stamen, pistil, any berries, nuts, or cones, and stems) (Figure 6a). These detailed studies of the organic forms composed the texture map library for each 3D plant model. Undergraduate research students developed advanced technical skills in working on complex organics and new knowledge about plants in this creative process.

5.4.4. Research to Gather Artificial Life Digital Media Assets and Ambient Bioacoustics

The insects, amphibians, birds, and fish include American bald eagles, green treefrogs (Hyla cinerea), largemouth bass, many butterflies (e.g., monarch, gulf fritillary, zebra longwing, giant swallowtail), and damselflies and dragonflies accurate to the location and time. These assets were gathered either directly from the environment or purchased online and verified with the scientific domain experts involved in the project.

5.4.5. The Virtual Field Guide Website

The WordPress engine platform was used to build the Virtual Field Guide website (Figure 7) and to support an interdisciplinary, geographically diverse, virtual team in collaboration. The different levels of project management, editorial, authorship, and readership roles were supported with the platform’s control of users’ roles. Multiple virtual team members in both the scientific and digital media teams were able to create, edit, approve, and publish content. The biologists provided data about each plant—such as the common name, Latin name, educational story, facts on size and spread, gardening factors, and natural community locations—for publication and for meta tag expression. The digital media team published the photographs and videos, in addition to embedding the 3D models once approved to complete the component. The interdisciplinary team worked on the design, production, and publication in a highly controlled iterative process that supported the sourcing, reviewing, approval, and release of information necessary for the accuracy of the facts. Factual accuracy was required for educational and learning objectives in the applications.
Photographs that were gathered in the field and authenticated by the scientists were published for each plant. The correct identification of each plant proved to be invaluable to the artists, who did not know the plant names. The digital media team used this information as a reference guide when building the virtual plants and environment, which was particularly important for setting each plant’s acceptable height ranges in-game to real-world scales. Conveniently, the university’s web support team standards allowed for the easy integration of the Virtual Field Guide website with the university’s arboretum website for maintenance after it was published. The production website is currently published and is an accessible resource for teachers on PBS LearningMedia indexed to national educational standards, and can be used in preparing for and reinforcing real-world field trips to the real arboretum for the university and local communities.

5.4.6. The 3D Plant Atlas Model Library

The photographs were processed with Photoshop, Quixel SUITE 2.0, and Substance Painter. Maya 2018 was used to create the 3D plant and flower models (Figure 8a). The 3D Maya models were converted to FBX file types and imported into the Unreal Engine with the textures. Several new techniques were applied to increase the realism: light reflection and translucency settings enhanced the realistic interaction of the sunlight and the leaves, while bump-map techniques enhanced leaf and bark textures, improving both the detail and complexity of each plant. Once the plant 3D model met the quality standards of the biologists or botanists, the model was released to a publicly viewable 3D Plant Atlas on SketchFab.com (Figure 8b).
The art asset creation process relied on the Virtual Field Guide for data and the photographs collected in the field to build each 3D plant model. This process involved a surprisingly steep learning curve for the undergraduate digital media students, who had never created organics before. Much of the complexity in nature was difficult to see or perceive at first, for many did not know what to look for and relied heavily on the iterative review from the scientists for highly detailed feedback to correct errors and achieve an accurate and realistic virtual plant model. A few of the digital media modelers had to invest time in field research, such as drawing plants to better understand their structures. SketchFab.com provided the framework to share models privately and collaborate on feedback in real time. Each model’s polygon count was bounded to respect the real-time runtime constraints of the frame rate speeds required by the AR and VR platforms. Recent advancements in the Unreal Engine will relax this constraint, but not for AR due to many factors in mobile distribution. The construction of the 3D plant models required this dynamic, collaborative, iterative review, both artistic and scientific, to ensure content accuracy and visual quality and to achieve a high information fidelity in this triangulated process.

5.4.7. Selectable Plants and Interactive Points of Interest

Interaction required each virtual plant to register a user event and then to respond. When a user approached a plant, the cross-hairs turned bright green to invite inquiry. The user event to select the plant depended on the user interface: a mouse click on that plant, a finger tap on the iPad, or a line cast in VR. The plant responded by opening the Virtual Field Guide in-game (Figure 3) so that facts, concepts, and educational stories became visible. In order to achieve the desired user interaction with the content, all plant meshes in the level are of the type “Foliage Instanced Static Mesh”. A line is cast from the player to where they are looking 10 times per second. This line checks for geometry collision. If geometry is hit, a mouse click event is registered to a widget that opens the plant information in the HUD, loaded from the external website based on the name of the mesh itself. In this way, 3D objects in-game expose semantic meta-data for real-time external data interaction in-game. The desktop application version supported direct manipulation between the mouse, pointer, cross-hairs, and object in-game. The VR application supported a 3D UI as a HUD in-game as a widget that rotated to face the player and respond to user input from the hand controllers casting a virtual line from the virtual hand to the target on the HUD menu.
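The in-engine selection itself is an Unreal line trace feeding a HUD widget, as described above; the small Python sketch below only illustrates the idea of resolving a foliage mesh name to a Virtual Field Guide page, with the base URL and the asset-naming convention assumed purely for illustration.

```python
FIELD_GUIDE_BASE_URL = "https://example.org/virtual-field-guide/"  # hypothetical

def field_guide_url(mesh_name: str) -> str:
    """Map a foliage mesh name such as 'SM_SawPalmetto_02' to a field guide page."""
    # Strip a typical asset prefix and trailing instance index (assumed convention).
    parts = [p for p in mesh_name.split("_")
             if p and not p.isdigit() and p.upper() not in {"SM", "FOLIAGE"}]
    return FIELD_GUIDE_BASE_URL + "-".join(parts).lower()

if __name__ == "__main__":
    print(field_guide_url("SM_SawPalmetto_02"))
    # -> https://example.org/virtual-field-guide/sawpalmetto
```

Keying the lookup on the mesh name is what keeps the educational content programmatically independent of the game level, as noted in Section 5.2.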

5.4.8. Terrain Maps and Data File Import

The game level required a virtual terrain scaled to match the real terrain to produce an immersive experience accurate in scale to all real-world measurements and geo-locations. Topographical terrain data were sourced from the U.S. Geological Survey (USGS) database and an older map with greater landmark detail, in addition to a high-resolution aerial photograph, to provide landmark references (Figure 9). These data formats as provided could not be used to automatically generate a terrain in Unreal Engine. Therefore, the development team relied on terrain data from TerrainParty.com, in the form of a PNG download, to automatically generate the game-level terrain (Figure 10b). The minimum sample file size in this service is 8 km², so the 1 km² sample used for the arboretum was extracted from the original PNG. This extracted sample was scaled up to the same resolution as the landscape in Unreal Engine. This made each pixel in the PNG equal to one square meter in the engine. To resolve the issue of sharp angles between tiles on the landscape, the imported PNG was also filtered with a Gaussian blur of 1 pixel to smooth the surface.
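A minimal sketch of this heightmap preparation, assuming Pillow and hypothetical file paths and crop coordinates, could look like the following; a production pipeline would preserve the heightmap's 16-bit precision rather than converting to 8-bit grayscale as done here for simplicity.

```python
from PIL import Image, ImageFilter

def prepare_heightmap(src_png, dst_png, crop_box, target_px=1009):
    """Crop the area of interest, rescale so ~1 px = 1 m, and smooth tile seams."""
    height = Image.open(src_png).convert("L")        # 8-bit grayscale for simplicity
    sample = height.crop(crop_box)                   # e.g., the 1 km x 1 km arboretum area
    # 1009 x 1009 is one of Unreal's recommended landscape resolutions.
    sample = sample.resize((target_px, target_px))   # default bicubic resampling
    # A 1-pixel Gaussian blur removes sharp steps between landscape tiles.
    sample = sample.filter(ImageFilter.GaussianBlur(radius=1))
    sample.save(dst_png)

if __name__ == "__main__":
    # Hypothetical paths and crop window (left, upper, right, lower) in pixels.
    prepare_heightmap("terrain_party_export.png", "arboretum_heightmap.png",
                      crop_box=(0, 0, 512, 512))
```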

5.4.9. The Arboretum Natural Communities Plant Data and Unreal Level Terrain

In order for Unreal Engine to render each virtual plant in the correct location in-game on the virtual terrain, the real-world geo-spatial plant data had to be indexed to the virtual terrain coordinates. The identification and number of plants covering an area had to be sourced and then transformed. The plant data stored in the arboretum GIS ESRI database originated from biologists who gathered data in real field-work on plant inventories (e.g., the list of plant species) and plant population densities (e.g., how many in an area), using a plant plot study data sampling method, a standard data gathering method used in biology and forestry. This method uses a linear transect (e.g., measuring a line over the land of some distance) to guide the positioning of rectangular grids (e.g., plastic piping connected to construct a square or a frame to lay on the ground), which are used to record the location for future data gathering efforts and to guide the biologists regarding the plants to include in the count. This approach was applied to each of the natural communities in the real arboretum to create a type of plant inventory over time and by location. Then, all field data were entered into a long-term GIS database using ESRI to chart changes in biodiversity over time. The arboretum used ESRI to generate a color-coded map to visualize each of the natural communities (e.g., basin swamp, baygall, depression marsh, dome swamp, mesic flatwoods, scrub, scrubby flatwoods, waterbody, wet flatwoods, and xeric hammock) that were then indexed to the plant inventories and plant population densities (Figure 10a). The digital media development team selected a 1000 × 1000 m section of that GIS map to model in Unreal (e.g., GIS map shown in Figure 10a, with the area selected to model indicated by a red rectangle, and the corresponding Unreal terrain level in Figure 10b).
The data on each plant’s population density for each of the 10 natural communities were tabulated. The dimensions and measurements in meters used for the original plant plot studies were converted to centimeters for Unreal. The real plant plot studies were converted to a virtual 10 × 10 m plot, representing the real percentages of coverage to render the 3D plant models in the game level using the same percent coverage area associated with each natural community. Each of the 35 3D plant models was then indexed to one or more of the ten natural communities by population density for an extrapolation over that region. Each natural community is represented as an irregular shape on the map (Figure 10a), but within that surface area, the virtual 10 × 10 m pattern of plants is generated to match the real distribution pattern. The plant population data set for each plant in each natural community was converted to a percentage value for use in Unreal as a type of foliage coverage for each virtual natural community mapped to each zone (Table 1).
For example, in the real environment, the dome swamp area is represented on the map (Figure 10a) by a turquoise circle with green diagonal lines, labeled number 7. The plant population database showed a 20% coverage of that area by the pond cypress (Taxodium ascendens) tree. This value was used to set the density parameter within Unreal Engine’s foliage tool to randomly render the 3D model (Figure 11a) of the virtual pond cypress tree over 20% of the virtual dome swamp area in the Unreal terrain, with the height and spread variance reflective of each plant species (e.g., data from the Virtual Field Guide). Each foliage asset has a density value, which was left at 100 instances per 1000. The result was a pattern of plants in each natural community that represented the real pattern of plants in the real environment (Figure 11b). In this way, the ESRI geospatial plant data were visualized as a virtual landscape of 3D plants in-game over the entire terrain area.
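The arithmetic behind this mapping is simple and is sketched below; the constant names and helper functions are illustrative, not the project's actual tooling.

```python
# Illustrative constants and helpers for the coverage-to-density mapping.

UNREAL_UNITS_PER_METER = 100   # Unreal Engine world units are centimeters
PLOT_SIZE_M = 10               # the virtual plot mirrors the 10 x 10 m field plot

def paint_density_from_coverage(coverage_pct):
    """Percent cover -> foliage paint-density multiplier (2% -> 0.02; see Section 5.4.11)."""
    return coverage_pct / 100.0

def plot_size_in_unreal_units():
    """The 10 x 10 m plot expressed in engine units (centimeters)."""
    side = PLOT_SIZE_M * UNREAL_UNITS_PER_METER
    return side, side

if __name__ == "__main__":
    # Worked example from the text: pond cypress covers 20% of the dome swamp plots.
    print(paint_density_from_coverage(20.0))   # 0.2
    print(plot_size_in_unreal_units())         # (1000, 1000)
```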

5.4.10. Drone Photographs Increased the Information Fidelity

When individual plants could be identified from the aerial drone photographs, their locations were mapped to the terrain to increase the information fidelity. For example, the drone flyover line of sight was clear over the pond where individual waterlilies (Nymphaea odorata) could be identified from the drone photograph (Figure 12a). Then, each individual 3D waterlily plant model (Figure 12b) was accurately placed to match its real location in the virtual waterbody (Figure 12c).
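Assuming the drone photograph's ground footprint is known, placing an individually identified plant reduces to a linear pixel-to-world mapping, as in the hedged sketch below; the coordinates used are made up for illustration.

```python
def pixel_to_world(px, py, img_w, img_h, west, north, east, south):
    """Linearly map an image pixel to terrain meters within the photo's ground footprint."""
    x = west + (px / img_w) * (east - west)
    y = north - (py / img_h) * (north - south)
    return x, y

if __name__ == "__main__":
    # Hypothetical numbers: a pond photo covering a 120 m x 90 m footprint whose
    # north-west corner sits at (300 m, 540 m) in terrain coordinates, with a
    # waterlily identified at pixel (1024, 768) in a 4000 x 3000 image.
    x, y = pixel_to_world(1024, 768, 4000, 3000, west=300, north=540, east=420, south=450)
    print(f"place waterlily at ({x:.1f} m, {y:.1f} m)")   # (330.7 m, 517.0 m)
```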

5.4.11. Alpha Maps, Unreal Settings for Plant Population Data Sets, and Map Locations

Black and white alpha maps were created for each separate zone (e.g., natural community) using the GIS map of the arboretum. These alpha maps were imported as textures into Unreal Engine, and materials were made from them. The approach was neither easy nor direct, requiring multiple steps; procedural tools, now available as third-party game engine plug-ins, make this easier to accomplish. The materials were applied to a cube mesh, which was flattened and scaled on the X and Y axes to match the landscape. The brush paint density value is a multiplier for the foliage painting density (e.g., a 2% real plant density translates to a 0.02 paint density rendered on the virtual terrain), so the plants are accurately represented per square meter. Future research could automate these procedures and construction methods.
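The following sketch illustrates the two ideas in this step: building a black-and-white mask for one natural community from a rasterized version of the GIS map, and converting a real plant density percentage into the foliage paint density multiplier. The tiny grid and community IDs are invented for illustration only.

```python
# Illustrative sketch of the alpha-mask / paint-density step. A rasterized
# version of the color-coded GIS map (here a tiny 2D grid of community IDs)
# is turned into a black-and-white mask per natural community, and a real
# plant density percentage is converted to the foliage paint density value
# (e.g., 2% -> 0.02). The grid values are hypothetical.

community_grid = [
    [7, 7, 5, 5],   # 7 = dome swamp, 5 = mesic flatwoods (illustrative IDs)
    [7, 7, 5, 5],
    [8, 8, 5, 5],   # 8 = waterbody
]

def community_mask(grid, community_id):
    """White (1) inside the community, black (0) everywhere else."""
    return [[1 if cell == community_id else 0 for cell in row] for row in grid]

def paint_density(percent_coverage: float) -> float:
    """Convert a real-world percent coverage to the paint density multiplier."""
    return percent_coverage / 100.0

print(community_mask(community_grid, 7))   # mask for the dome swamp zone
print(paint_density(2.0))                  # 0.02
```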

6. Results: Immersive Data Visualizations of Nature

The in-game visualization is based on the data from the plant inventory and plot density populations, as shown in Table 1. For each natural community listed across the top row and each plant listed in the far-left column, the corresponding cell holds a percent value representing that plant's population density as an area of coverage within a 10 × 10 m plot. For example, the plant data for the mesic flatwoods in Table 1 show the most dominant plant to be the saw palmetto, with an area coverage of 20.0%, followed by the fetterbush at 5.0%. The wax myrtle and chalky bluestem are at 4.0%, the goldenrod is at 3.0%, and the longleaf pine and wiregrass are at 2.5%, followed by many other plants in smaller percentages, each rendered on the virtual terrain according to its percentage of real area covered. The plants that were not recorded in the field plot study inventory, such as the bog button and the waterlily, show a plant inventory percentage of 0.0% and are not rendered. The visualization of these data is shown in Figure 13.
Although most of the image is dominated by the saw palmetto, there are sparse representations of pink-lavender blazing star and bright yellow coastalplain honeycombhead flowers, with many longleaf pines in view due to their height. The same approach was taken to visualize all ten natural communities (Figure 14) presented in Table 1. Viewed together and combined with the table of data, this represents a new method and process for constructing realistic virtual environments that reflect the data in GIS plant databases and the different patterns of naturally occurring vegetation. The higher information fidelity results in a higher-quality graphical presentation of the content, which is required when the content matters for informal learning activities such as virtual field trips and field-work in ecology education. It is also important for many lines of future research on immersive informal learning application design and for understanding human–computer–environment interactions more broadly.

7. Conclusions: Virtual Nature, Information Fidelity, and Immersive Informal Learning

A confluence of data visualization processes is coming together to produce information-rich, immersive, multimodal, and interactive virtual environments of natural landscapes in game engines. This paper introduced a new concept, information fidelity, as the defining design factor needed to extend the impact of photorealism, immersion, embodiment, engagement, and presence, and one that is especially important when designing and constructing a virtual field trip of nature for informal learning applications.
These new virtual environments of virtual nature cross the boundary into data visualization applications because they are constructed from an information fusion of GIS data, different from the traditional game art or 2D 360-degree video used in the past. The design, methods, and processes presented in this paper demonstrate how an immersive data visualization of geo-spatial GIS plant data may be rendered in a game engine with high information fidelity to achieve sensory accuracy, and thus photorealism, more efficiently than prior environmental art creation methods and processes.
Game engines have been used as general-purpose, real-time, interactive computer graphics tools for non-entertainment purposes, especially for cultural heritage and informal learning applications and in modeling, simulation, and training. However, few projects have developed virtual field trips about the natural world that truly exploit the design affordances of new immersive technologies to construct and present a complete virtual environment in VR, or to project one on top of the real environment at almost identical levels of photorealism in AR, so that the virtual forest merges with the real forest. Such environments also act as shared teaching and learning knowledge artifacts, similar to the natural history museum dioramas and exhibits of the past, while using game engines as general-purpose tools. As such, they become interactive and multimodal, supporting personalized informal learning. Most AR applications only overlay an annotation consisting of text, a photograph, a video, or a 3D model on top of a real object, while most VR applications use environments that are relatively small in scale, thus limiting the wide, expansive, open-world free exploration driven by free choice and curiosity.
Presented here is an example of a high information fidelity environment that also offers free exploration of a large landscape for individuals' experiential learning. The ELUX design process increases the accuracy, assurance, and completeness of scientific information by integrating the domain experts' knowledge into the virtual environment as a data visualization and augmentation of the real world. The resulting artifact transmits an array of multisensory signals for the human to perceive, a new type of artificial knowledge artifact. The domain experts' imagination is made real, tangible, and transferable to a non-expert, offering a direct and accurate experience of the reality of nature. Perception is a simultaneous top-down and bottom-up process, and a non-expert without deep domain knowledge cannot "see" the world with the same detail, complexity, or accuracy as the expert. These applications transmit a sensory experience that only an expert could imagine based on deep knowledge, thus allowing the non-expert to see and experience a reality they could not otherwise imagine, a type of knowledge-imagination machine. Harnessing the immersive power of the output devices, they amplify presence, embodiment, and engagement. Shown, too, are the emotional reactions to beautiful, salient, and peaceful places, triggering individual curiosity, exploration, and inquiry, which are important in developing the non-expert's individual mind through contrasts they directly perceive in reality, rather than through socially constructed views that might be full of misinformation.
As such, they represent an opportunity for future research and development in scientific communication, important to all science, technology, engineering, and mathematics (STEM) fields, and in informal learning research broadly, especially when authentic human reactions to expansive, complex unknowns, triggering awe and wonder, are important outcomes. It is also a different way of viewing "immersive" AR and VR application design and development, important when the desired outcomes are driven by learning, education, and scientific communication goals surrounding topics in botany, biology, ecology, forestry, or geology. The application is useful for natural history museums, botanical gardens, or arboretums with educational objectives, and can support field trips or field-work training. Although some will appreciate the technical expertise or "craft" of this work, scientists trained in botany, biology, and ecology can appreciate the knowledge, and artists who study the natural world are awestruck by the beauty. Others will see the opportunity to automate these methods with new algorithms and to create new solutions to model and simulate the natural world. As such, these virtual nature models may support new multimodal knowledge visualizations of past and future realities, and data-driven realities of alternate scenarios of the present. These models could support the creation of a new immersive digital twin of the Earth, useful for integrating information into models already created for human-made systems, and for sustainable urban development as a means to optimize and visualize a more complete system of dynamic models.

Supplementary Materials

More information is available online at http://mariacrharrington.org (accessed on 8 April 2021), and videos are available on YouTube: https://www.youtube.com/channel/UC41UPaOORst2jh7XWwskYXw (accessed on 8 April 2021). The applications are available for download at https://the-harrington-lab.itch.io. The Virtual UCF Arboretum requires a Windows PC and graphics card; the AR Perpetual Garden Apps run on Apple and Android devices (allow the camera to find the floor, then tap the screen to plant the garden).

Author Contributions

Design, conceptualization, methodology, and project management, M.C.R.H.; art, C.J. and Z.B.; software, J.M. and T.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Related and referenced research studies are listed below: The AR app study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of The University of Central Florida (protocol code IRB ID: Study00000434, on 20 June 2019). The VR Treadmill study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of The University of Central Florida (protocol code IRB ID: Study00001980, on 5 August 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the related and referenced studies.

Acknowledgments

The Virtual UCF Arboretum project is under the direction of Maria C. R. Harrington. It is an augmented reality (AR) and virtual reality (VR) collaboration between The Harrington Lab in the Nicholson School of Communication and Media and the UCF Arboretum and Department of Biology and the Landscape and Natural Resources team at the University of Central Florida. We would like to thank the UCF Director of Landscape and Natural Resources and Arboretum, Patrick Bohlen, and his team, Jennifer Elliott, Ray Jarrett, Amanda Lindsay, John Guziejka, and Rafael Pares, for their knowledge, support, and partnership in this collaboration. We would also like to thank the UCF Digital Media students involved in the development effort: Zachary Bledsoe, Chris Jones, James Miller, and Thomas Pring, as well as those who helped on the website plant atlas: Kellie Beck, and other students, including Seun (Oluwaseun) Ademoye, Yasiman Ahsani, Jeremy Blake, Bailey Estelle, Sebastian Hyde, Katherine Ryschkewitsch, Marchand Venter, and Aubri White. The Virtual UCF Arboretum was released in October 2018 as an Unreal level, and in November 2018 for the HTC Vive. The Virtual UCF Arboretum is available for free download: https://the-harrington-lab.itch.io/the-virtual-ucf-arboretum (accessed on 15 March 2021). The AR Perpetual Garden App was developed in part as an international collaboration between The Harrington Lab at the University of Central Florida, The Powdermill Nature Reserve at the Carnegie Museum of Natural History, and the MultiMediaTechnology program of the Salzburg University of Applied Sciences, Austria. Undergraduate and graduate students were involved in the production of the app. The multidisciplinary effort was developed with my partners, John W. Wenzel, Director, Powdermill Nature Reserve, CMNH, and Markus Tatzgern, FH-Professor, Head of Game Development and Mixed Reality, MultiMediaTechnology program of the Salzburg University of Applied Sciences, Austria. The AR apps are available for free download: https://the-harrington-lab.itch.io/the-ar-perpetual-garden-apps (accessed on 15 March 2021). A very special acknowledgement must go to my dear friends and mentors, John Wenzel and John Weishampel, for their insight, creativity, and support.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. The Virtual UCF Arboretum, the waterbody view as an example of an immersive informal learning virtual nature application, October 2018.
Figure 2. Image of an immersive augmented reality virtual woodland springtime bloom as it complements the real Forest Diorama in the Carnegie Museum of Natural History Hall of Botany, from a 2019 pilot study on informal learning effects. The inset image shows a family interacting with the AR application, demonstrating engagement, embodiment, and presence.
Figure 3. Image showing the integration of the interactive virtual plant field guide (e.g., HUD displaying an external and independent website) with the virtual nature game level in Unreal. Users may select a plant to open the virtual field guide for learning in-game.
Figure 4. Images showing the virtual arboretum application implemented in different output devices: (a) Research assistants help children in a science center exhibition to experience the virtual arboretum as an immersive virtual field trip in 2019; (b) research assistant walking on a treadmill and navigating through the projected virtual arboretum in a lab for study, 2019.
Figure 5. Conceptual representation of the Expert-Learner-User eXperience (ELUX) design process as an expanded user-centered design process used to integrate the learning objectives of individual user tasks and goals when building immersive learning applications.
Figure 6. Photographs of student field research efforts: (a) students photographing a pinecone; (b) composite image representing information captured about each plant in the field.
Figure 7. The Virtual Field Guide front webpage.
Figure 8. The 3D Plant Atlas Model Library: (a) an example of a 3D interactive plant model shown—forked bluecurls (Trichostema dichotomum); (b) all of the 3D models are available on SketchFab.com.
Figure 9. Topographical data and maps used for reference: (a) USGS topographical map for central Florida; (b) a high-resolution aerial photograph of the arboretum; (c) an older USGS map with greater terrain detail and information.
Figure 10. Data on plant inventory, population densities, and location by natural community are represented in the GIS map that corresponds to the geo-spatial location of the Unreal terrain-level map. A 1000 × 1000 m area of the arboretum was selected from the map, representing ten natural communities for a cross-section of biodiversity. The trails, ponds, and natural communities of the real arboretum were matched to the virtual one for correct location, scale, and dimensions, and then rendered to the Unreal terrain-level surface: (a) the arboretum GIS map representing all natural communities by a color-coded location; (b) Unreal terrain-level representative of the real arboretum terrain, trails, and waterbodies.
Figure 11. The 3D plant models were used to visualize a 3D geospatial pattern of plants, instead of using abstract dots on a 2D GIS map: (a) 3D model of the pond cypress tree; (b) virtual dome swamp natural community area in-game showing the 3D pond cypress tree dispersed over 20% of the bounded area to reflect the 20% plant plot density data from the ESRI database. To enhance realism, each model is randomly varied by height, spread, and rotation.
Figure 12. When the line of sight was clear enough to identify an individual plant and its location, the geo-spatial information was used to increase the information fidelity of the visualization: (a) drone photograph of the waterbody; (b) waterlily (Nymphaea odorata) 3D plant model; (c) the waterbody in-game, reflecting the exact locations of individual waterlilies at the time of the drone photograph.
Figure 13. Image of the virtual mesic flatwoods, reflective of the plant inventory and plot density distribution data, a 3D geo-spatial visualization of the landscape.
Figure 14. Images visualizing the plant plot densities of all ten natural communities: (a) basin swamp; (b) baygall; (c) depression marsh; (d) dome swamp; (e) mesic flatwoods; (f) scrub; (g) scrubby flatwoods; (h) waterbody; (i) wet flatwoods; and (j) xeric hammock.
Table 1. Plant inventory and plot population density by natural community membership.
Plant Name | Basin Swamp | Baygall | Depression Marsh | Dome Swamp | Mesic Flatwoods | Scrub | Scrubby Flatwoods | Waterbody | Wet Flatwoods | Xeric Hammock
American beautyberry | 0.0% | 0.0% | 0.0% | 0.0% | 1.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0%
Blazing star | 0.0% | 0.0% | 1.0% | 0.0% | 0.5% | 1.0% | 1.5% | 0.0% | 1.5% | 0.0%
Bog Button | 0.0% | 0.0% | 4.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0%
Bracken Fern | 0.0% | 0.0% | 0.0% | 4.0% | 2.0% | 0.5% | 2.5% | 0.0% | 2.5% | 0.0%
Calamintha coccinea | 0.0% | 0.0% | 0.0% | 0.0% | 1.0% | 1.5% | 1.5% | 0.0% | 1.5% | 0.0%
Chalky bluestem | 1.0% | 0.0% | 2.5% | 4.0% | 4.0% | 1.5% | 5.0% | 0.0% | 5.0% | 0.0%
Coastalplain honeycombhead | 0.0% | 0.0% | 0.0% | 0.0% | 0.5% | 0.0% | 0.5% | 0.0% | 0.5% | 0.0%
Coontie | 0.0% | 0.0% | 0.0% | 0.0% | 1.0% | 0.0% | 0.0% | 0.0% | 0.0% | 0.0%
Dahoon Holly | 1.5% | 2.5% | 2.5% | 1.5% | 0.5% | 0.0% | 0.5% | 0.0% | 1.0% | 0.0%
Feay's Palafox | 1.0% | 1.0% | 2.0% | 0.5% | 1.0% | 0.0% | 0.0% | 0.0% | 1.0% | 0.0%
Fetterbush | 7.5% | 5.0% | 0.0% | 4.0% | 5.0% | 4.0% | 7.5% | 0.0% | 7.5% | 5.0%
Florida paintbrush | 0.0% | 0.0% | 0.5% | 0.0% | 1.0% | 1.0% | 1.0% | 0.0% | 1.0% | 0.0%
Florida Tickseed | 1.0% | 0.0% | 1.5% | 1.0% | 2.0% | 1.5% | 2.0% | 0.0% | 1.5% | 1.0%
Forked Bluecurls | 0.0% | 0.0% | 0.0% | 0.0% | 1.5% | 0.5% | 1.5% | 0.0% | 1.5% | 0.0%
Fourpetal St. John's Wort | 1.5% | 1.0% | 4.0% | 2.0% | 2.5% | 0.0% | 0.0% | 0.0% | 2.0% | 0.0%
Goldenrod | 0.5% | 1.0% | 2.0% | 2.5% | 3.0% | 1.0% | 2.5% | 0.0% | 3.5% | 2.5%
Hooded Pitcher Plant | 0.5% | 0.5% | 1.0% | 1.0% | 0.5% | 0.0% | 0.0% | 0.0% | 0.5% | 0.0%
Longleaf Pine | 0.0% | 0.0% | 0.0% | 0.0% | 2.5% | 1.5% | 5.0% | 0.0% | 5.0% | 1.5%
Lopsided Indiangrass | 0.0% | 0.0% | 1.0% | 0.0% | 1.0% | 1.0% | 1.0% | 0.0% | 1.0% | 1.0%
Pond Cypress | 2.5% | 2.5% | 6.0% | 20.0% | 1.0% | 0.0% | 0.0% | 0.0% | 1.0% | 1.0%
Prickly Pear Cactus | 0.0% | 0.0% | 0.0% | 0.0% | 1.0% | 2.0% | 2.0% | 0.0% | 0.5% | 1.0%
Rayless Sunflower | 1.0% | 0.0% | 1.0% | 1.0% | 0.5% | 0.0% | 0.0% | 0.0% | 1.0% | 0.0%
Reindeer Moss | 0.0% | 0.0% | 0.0% | 0.5% | 1.5% | 5.0% | 5.0% | 0.0% | 1.5% | 7.5%
Sabal palm | 2.5% | 2.5% | 2.5% | 0.5% | 1.0% | 2.0% | 4.0% | 0.0% | 5.0% | 1.0%
Sand Live oak | 0.0% | 0.0% | 0.0% | 0.0% | 1.5% | 4.0% | 2.5% | 0.0% | 1.0% | 9.0%
Saw palmetto | 1.0% | 1.0% | 0.0% | 1.0% | 20.0% | 10.0% | 17.5% | 0.0% | 17.5% | 5.0%
Scrub Rosemary | 0.0% | 0.0% | 0.0% | 0.0% | 2.0% | 3.5% | 1.5% | 0.0% | 0.0% | 2.0%
Shortleaf rosegentian | 0.5% | 0.0% | 1.5% | 0.0% | 1.0% | 1.0% | 1.0% | 0.0% | 1.5% | 0.5%
Slender Flattop Goldenrod | 1.0% | 1.0% | 7.5% | 1.5% | 2.0% | 0.0% | 1.0% | 0.0% | 2.5% | 0.0%
Tall Jointweed | 0.0% | 0.0% | 0.0% | 0.0% | 1.5% | 0.5% | 0.5% | 0.0% | 0.5% | 0.0%
Tarflower | 0.0% | 0.0% | 0.0% | 0.0% | 1.5% | 2.5% | 2.5% | 0.0% | 1.5% | 4.0%
Titusville Balm | 0.0% | 0.0% | 0.0% | 0.0% | 0.5% | 0.5% | 0.5% | 0.0% | 0.0% | 0.0%
Waterlily | 0.5% | 0.0% | 0.5% | 0.5% | 0.0% | 0.0% | 0.0% | 2.5% | 0.0% | 0.0%
Wax Myrtle | 2.5% | 2.5% | 2.5% | 2.5% | 4.0% | 0.5% | 1.5% | 0.0% | 2.5% | 0.0%
Wiregrass | 2.5% | 1.0% | 2.5% | 4.0% | 2.5% | 4.0% | 7.5% | 0.0% | 5.0% | 2.5%
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
