Article

Exploratory Learning of Amis Indigenous Culture and Local Environments Using Virtual Reality and Drone Technology

1 Institute of Learning Sciences and Technologies, National Tsing Hua University, Hsinchu 30013, Taiwan
2 Department of Environmental and Culture Resources, National Tsing Hua University, Hsinchu 30013, Taiwan
3 Coretronic Corporation, Hsinchu 300094, Taiwan
* Author to whom correspondence should be addressed.
ISPRS Int. J. Geo-Inf. 2025, 14(11), 441; https://doi.org/10.3390/ijgi14110441
Submission received: 1 October 2025 / Revised: 5 November 2025 / Accepted: 7 November 2025 / Published: 8 November 2025
(This article belongs to the Topic 3D Documentation of Natural and Cultural Heritage)

Abstract

Virtual reality (VR) creates immersive environments that allow users to interact with digital content, fostering a sense of presence and engagement comparable to real-world experiences. VR360 technology, combined with affordable head-mounted displays such as Google Cardboard, enhances accessibility and provides an intuitive learning experience. Drones, or unmanned aerial vehicles (UAVs), are operated through remote control systems and have diverse applications in civilian, commercial, and scientific domains. Taiwan’s Indigenous cultures emphasize environmental conservation, and integrating this knowledge into education supports both biodiversity and cultural preservation. The Amis people, who primarily reside along Taiwan’s eastern coast and central mountain regions, face educational challenges due to geographic isolation and socioeconomic disadvantage. This study integrates VR360 and drone technologies to develop a VR learning system for elementary science education that incorporates Amis culture and local environments. A teaching experiment was conducted to evaluate its impact on learning effectiveness and student responses. Results show that students using the VR system outperformed the control group in cultural and scientific knowledge, experienced reduced cognitive load, and reported greater learning motivation. These findings highlight the potential of VR and drone technologies to improve learning outcomes, promote environmental and cultural awareness, and reduce educational barriers for Indigenous students in remote or socioeconomically disadvantaged communities.

1. Introduction

Taiwan lies between the tropical and subtropical zones and has long been a major cultural crossroads in East Asia. This strategic position has attracted diverse ethnic groups, each developing unique cultural identities over time. Among the earliest inhabitants of Taiwan are Indigenous peoples. However, with the prolonged dominance of external cultures and historical lack of written documentation, many Indigenous traditions and environmental knowledge have gradually eroded or disappeared. In response, there has been growing societal recognition, within Indigenous communities and broader Taiwanese society, of the urgent need to preserve their culture and environments [1]. Cultural background shapes students’ perceptions, motivations, and engagement in learning. Hence, addressing disparities in science education requires careful consideration of cultural contexts, particularly for students in Indigenous communities.
Among Taiwan’s Indigenous groups, the Amis (Amis language: Amis or Pangcah) constitute the largest population, primarily residing along the eastern coast and in the central mountains. Traditional Amis cultural practices, including music, dance, clothing, rituals, and clan structures, are rich in meaning and deeply connected to their living environments. Although Taiwanese education policies have endeavored to preserve Indigenous languages and cultural heritage, disparities in instructional design persist, particularly due to geographic and resource constraints. As technology advances, there is growing emphasis on integrating scientific concepts with Indigenous knowledge systems to bridge regional and cultural disparities in learning outcomes.
The Jingpu Tribe is located on the plains at the mouth of the Xiuguluan River, offering expansive views of Taiwan’s eastern coastline. The river mouth’s rich ecology has enabled the Amis people to preserve their traditional fishing and hunting practices. Known as the Sun Tribe for being the first to greet the sunrise, Jingpu, the southernmost settlement in Hualien County, is a significant prehistoric cultural site in Taiwan. The Jingpu Cultural Site in Fengbin has been preserved for over 1500 years, safeguarding the heritage of the Amis ancestors. For the Amis people, the Sun carries profound cultural significance, reflecting their deep connection with natural environments.
In Taiwan’s elementary science curriculum, astronomy is closely connected to everyday life, covering topics such as the Earth, Sun, stars, Moon, and seasonal changes [2]. Through astronomical observations, students can develop reflective thinking and metacognitive skills by exploring the relative motion of celestial bodies and drawing conclusions through scientific inquiry [3]. The Sun provides vital light and energy that shape daily life. Seasonal changes, driven by Earth’s tilt and orbit, influence the Sun’s path and the duration of daylight. In the Northern Hemisphere, the Sun’s path shifts northward at the summer solstice, follows an east–west course during the equinoxes, and shifts southward at the winter solstice. Traditional observation methods, however, are constrained by weather, time, and equipment, which restricts study opportunities and dampens student motivation.
With the rise of the metaverse, immersive technologies such as virtual reality (VR) and augmented reality (AR) have become increasingly prominent in education. VR creates a fully immersive 3D environment accessed through head-mounted displays, offering users a sense of presence, while AR overlays digital information onto the physical world via devices like smartphones or tablets. In this study, a low-cost VR solution was developed by combining smartphones with Google Cardboard headsets to present 360-degree panoramic images. This approach aims to enhance students’ learning motivation and create a stronger sense of presence compared to traditional instruction.
Conventional tools such as slideshows and textbooks often fail to immerse learners, especially when teaching abstract scientific concepts. These limitations can increase cognitive load, thereby hindering comprehension. In contrast, virtual environments viewed through mobile devices enable learners to control their viewpoints, creating a more interactive and immersive experience. Such tools reduce cognitive load by making learning more intuitive and engaging [4]. For instance, Araiza-Alba et al. used VR360 videos to teach children water safety skills through interactive scenarios [5], while Ou et al. employed VR simulations of tree frog habitats to reduce practical barriers and enhance learning outcomes in environmental education [6].
Current classroom practices in rural and Indigenous regions continue to rely on textbooks, projected media, and pre-recorded videos. By incorporating VR content, educators can offer more interactive and effective learning environments, especially compared to passive, paper-based materials [7]. This study aims to integrate Amis Indigenous culture with local environments by combining VR360 and drone technology to create an accessible digital learning tool. Using a smartphone with low-cost, portable Cardboard goggles, learners can engage in an immersive experience. Interactive feedback and autonomous exploration further enhance learner motivation and engagement.
A drone, or Unmanned Aerial Vehicle (UAV), is an aircraft that operates without an onboard pilot. It can be controlled remotely by an operator or autonomously and is widely used in civilian, commercial, and military applications [8]. Modern drones are typically equipped with high-resolution cameras, multiple sensors, and GPS, enabling applications such as aerial surveillance, terrain mapping, agricultural spraying, and material delivery, thereby enhancing operational efficiency and reducing labor costs [9].
Drones generally operate in two modes: (1) Remote control, where an operator uses radio signals or network systems to manage the drone’s flight path and tasks in real time; and (2) Autonomous flight, in which the drone follows pre-programmed routes and can leverage Artificial Intelligence (AI) to make real-time adjustments, such as avoiding obstacles or adapting flight strategies to environmental changes, offering greater flexibility and autonomy in task execution [10].
With advances in flight control, GPS, and communication technologies, drones are now widely used in disaster rescue, environmental monitoring, film production, and logistics. Modern drones transmit aerial data to support search and rescue, long-term environmental monitoring, and efficient material delivery [11]. In the military domain, drones were initially deployed for reconnaissance and surveillance. Their lightweight, flexible design enables undetected operations deep within enemy territory, delivering real-time battlefield intelligence to support precise tactical decisions [12]. Because of their high mobility, low cost, and autonomous capabilities, drones have become an essential technology across a wide range of contemporary applications.
This study integrated VR360 and drone technology into a VR learning system to teach Amis cultural heritage and local environments within the elementary science curriculum. The system also supports solar observation at the Tropic of Cancer Landmark near the Jingpu Tribe, allowing learners to measure the Sun’s azimuth and altitude angles in real time. Using smartphones paired with portable Cardboard goggles, students can interact with immersive content that enhances motivation and engagement [13]. By combining scientific knowledge with Amis cultural traditions, the VR learning system facilitates solar observation while overcoming the limitations of traditional instruction.
This study aims to reduce educational disparities in remote areas by examining the effectiveness of a VR learning system that integrates Amis Indigenous culture and local environments with elementary science curricula. Based on prior research and the study’s objectives, the following research questions are proposed:
(1) What is the effectiveness of the VR system on students’ learning achievement?
(2) How does the VR system influence students’ learning motivation?
(3) What impact does the VR learning system have on students’ cognitive load?
(4) To what extent are students satisfied with the VR learning system?

2. Literature Review

To establish the theoretical foundation of this study, the literature was reviewed across six domains: (1) VR as an immersive and interactive medium for simulating real-world contexts and supporting conceptual understanding; (2) drone technology as a tool for connecting classroom learning with authentic experiences in science, environmental studies, and cultural exploration; (3) solar trajectory observation as a means to promote hands-on learning and deepen students’ understanding of Earth’s rotation, axial tilt, and the cause of seasonal change; (4) situated learning, which emphasizes authentic tasks, collaboration, and scaffolding to foster meaningful knowledge construction; (5) learning motivation, focusing on the roles of interest, enjoyment, and self-determination in sustaining student engagement; and (6) cognitive load theory, which informs instructional design to minimize extraneous demands and optimize cognitive processing. Collectively, these perspectives guided the design of a VR learning system that integrates cultural and environmental contexts within a robust pedagogical framework.

2.1. Virtual Reality

Virtual reality is an interactive environment created through 3D computer graphics and immersive technologies, bridging the physical and digital worlds. Through computers or mobile devices, users can interact with virtual objects and experience highly realistic simulated environments [14]. VR also provides strong sensory stimulation and feedback, allowing users to immerse themselves in vivid interactions with the synthesized environment [15]. The earliest VR device, the head-mounted display (HMD), originated in the 1960s and was later promoted by the U.S. Department of Defense and NASA in 1989, which accelerated the commercialization of VR. Over the years, VR technology has matured and diversified in form and definition. Today, VR has been applied widely in many disciplines, such as firefighting, medicine, and the military, providing learners with safe environments to practice skills required in dangerous real-world situations.
VR hardware can be classified into several types based on their presentation formats. Desktop VR provides relatively low immersion but is cost-effective and easy to set up, relying on interaction through a keyboard, mouse, joystick, or touchscreen. Simulation VR, among the earliest systems, integrates hardware with authentic or virtual environments to replicate real-world operations, such as medical training [16]. Projection VR employs large screens and surround sound to deliver a panoramic experience, with learners wearing 3D glasses to enhance authenticity [17]. Immersive VR offers the highest level of presence, requiring head-mounted displays (HMDs) or data gloves to stimulate multiple senses and generate perceptions that closely approximate reality [18].
VR has been widely implemented in science education. For example, Chen et al. [19] developed a desktop VR system simulating Earth’s motion to help students understand astronomical concepts. Allison et al. [20] used HMDs to enable learners to explore gorilla habitats and observe social dominance interactions. Tarng et al. [21] applied VR to simulate light-speed space travel, aiding comprehension of physical concepts such as time dilation and length contraction in learning special relativity. Sun et al. [22] created a desktop VR model of the solar-lunar system to address misconceptions about astronomical phenomena, including day-night cycles, seasons, lunar phases, and solar eclipses.
Applications of Cardboard VR in science education have also been reported. For instance, Ou et al. [23] combined Cardboard VR with 360-degree panoramic views to promote inquiry-based learning in ecological conservation. Their findings demonstrated that immersive experiences not only enhanced learning performance but also fostered motivation, creativity, and critical thinking. These studies highlight VR’s potential to enhance science education by providing interactive and learner-centered experiences that actively engage students and foster deeper conceptual understanding.

2.2. Educational Applications of Drones

Over the past decade, the use of drones in education has grown rapidly, with schools, universities, and training centers adopting them as innovative tools to foster hands-on learning, student engagement, and interdisciplinary connections. Commonly known as unmanned aerial vehicles (UAVs), drones are aircraft that operate without an onboard pilot, controlled remotely or through automated flight systems. Equipped with high-resolution cameras, sensors, and GPS, modern drones perform tasks such as aerial photography, terrain mapping, environmental monitoring, agricultural spraying, and logistics. Their efficiency, versatility, and cost-effectiveness make drones valuable across civilian, industrial, military, disaster response, and research applications. These features also establish drones as a powerful educational tool, enabling students to engage with authentic, real-world technology applications across interdisciplinary contexts.
Drones contribute to the study of history, culture, and the arts. Aerial imagery can document archaeological sites, cultural landscapes, and Indigenous territories, supporting heritage preservation and cultural education. Students in social studies courses can use drone-generated maps to analyze land use, settlement patterns, and cultural geography, while visual arts and media students benefit from the novel perspectives that drones provide for storytelling, film production, and photography. In vocational and professional training, drones are increasingly applied in construction management, surveying, logistics, public safety, and disaster management, helping students acquire skills aligned with industry demands and workforce expectations.
Despite their potential, educational drones are limited by high costs, the need for specialized training, and legal restrictions such as licensing and no-fly zones (NFZ) [24]. These barriers have led to virtual drone systems, which offer safe, cost-effective, and scalable alternatives. Delivered via simulation or VR, virtual drones allow students to practice navigation, mission planning, and aerial mapping without financial, safety, or regulatory constraints. Research shows that they enhance operational skills, situational awareness, and learning outcomes [25,26,27]. Combined with VR360 systems, they provide immersive learning that integrates ecological, cultural, and technological perspectives.
This study adopts an approach that integrates virtual drones with VR360 explorations of Indigenous communities and natural landscapes. Specifically, it focuses on the Amis people, Taiwan’s largest Indigenous group, whose cultural practices and ecological knowledge offer rich resources for science education. Amis society has distinctive rituals, such as the Harvest Festival, which serve as social structures and educational processes [28]. Indigenous science, transmitted orally across generations, embodies the accumulated wisdom of interactions with the natural environment [29]. Incorporating this cultural knowledge into science education helps bridge the gap between formal curricula and students’ lived experiences, making learning more relevant and meaningful [30].
Previous studies indicate that embedding Indigenous perspectives in science education enhances student understanding, motivation, and performance [31]. This study uses virtual drones and VR360 to explore Amis landscapes and local environments, integrating technological innovation with cultural heritage and science education. Students can navigate virtual aerial missions over tribal territories, observe environmental features, and analyze ecological species while engaging with the cultural narratives embedded in the landscape. This approach builds technical skills in drone operation and geospatial analysis, while fostering appreciation for Indigenous heritage and environmental stewardship. The safe, accessible virtual platform also overcomes the cost, training, and regulatory barriers that often limit the use of physical drones in schools.
Drones—both physical and virtual—can serve as powerful educational tools that transcend disciplinary boundaries, supporting STEM learning, environmental and cultural education, vocational training, and creative expression. Virtual drones facilitate access to drone technology, making it feasible for schools with limited resources or strict aviation regulations. This study combines virtual drones with VR360 representations of Amis culture and local environments to enhance scientific understanding and cultural awareness. The approach equips learners with 21st-century skills—problem-solving, collaboration, and technological literacy—while promoting educational equity by integrating Indigenous perspectives into formal science education.

2.3. Solar Trajectory Observation

Astronomy naturally inspires curiosity among elementary school students, yet its complex spatial concepts often present challenges for learners. Due to limited class time, teachers often assign astronomical observations as homework without sufficient guidance, which may lead students to complete tasks superficially and gradually lose interest. To resolve this issue, educators can adopt hands-on, inquiry-based strategies supported by teaching aids such as celestial models, images, videos, and computer simulations to convey accurate astronomical concepts. Experiential learning allows students to “learn by doing,” enhancing observational skills, experimental techniques, and active engagement with both teachers and the natural environment.
In elementary astronomy education, a common misconception is that winter is colder because the Sun is farther from Earth, and summer is hotter because it is closer. In fact, measurements show the opposite: during the Northern Hemisphere’s winter, Earth is slightly nearer to the sun, while in summer it is slightly farther away. The primary cause of the seasons is Earth’s revolution around the Sun, together with the 23.5° tilt of its rotational axis. This tilt changes the location of direct sunlight on Earth’s surface throughout the year. At the summer solstice, sunlight falls directly on the Tropic of Cancer; during the spring and autumn equinoxes, it strikes the equator; and at the winter solstice, it reaches the Tropic of Capricorn (Figure 1).
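The geometry described above can be approximated numerically. The sketch below is a minimal illustration, not part of the study’s system: it uses the standard 23.44° obliquity and the common cosine approximation of solar declination (accurate to roughly ±1°) to estimate the latitude receiving direct sunlight on a given day of the year.

```python
import math

def solar_declination(day_of_year: int) -> float:
    """Approximate solar declination (degrees) for a given day of the year.

    The declination equals the latitude of direct (overhead) sunlight:
    about +23.44 deg at the June solstice (Tropic of Cancer), ~0 deg at
    the equinoxes (equator), and -23.44 deg at the December solstice
    (Tropic of Capricorn). Uses the common cosine approximation.
    """
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

for label, day in [("June solstice", 172),
                   ("September equinox", 266),
                   ("December solstice", 355)]:
    print(f"{label}: {solar_declination(day):+.1f} deg")
```

Evaluating the function near the June and December solstices returns approximately +23.4° and −23.4°, matching the Tropic of Cancer and Tropic of Capricorn boundaries described above.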
Earth’s rotation influences the apparent position of the Sun in the sky, which changes both throughout the day and across seasons. During the summer solstice in Taiwan, the Sun appears nearly overhead at noon, causing a person standing outdoors to cast almost no shadow. In contrast, around late December at noon, the Sun is about 45° south of overhead, and a person’s shadow is nearly as long as their height. The Sun’s apparent path also varies with latitude. At higher northern latitudes during the summer solstice, the Sun’s arc shifts farther north, and within the Arctic Circle, it remains above the horizon for 24 h, creating the phenomenon known as the “White Night.”
Because the Sun is dazzlingly bright and lacks clear reference points in the sky, ancient observers used vertical sticks to measure shadows, allowing them to track the Sun’s position and determine the four seasons and the 24 solar terms. In modern science education, solar observation activities often begin with flashlight models to demonstrate that shadows always appear on the side away from the light source. Students then record the direction of a stick’s shadow to infer the Sun’s position, which is 180° opposite the shadow, and measure its length and angle with a protractor to calculate solar altitude. Plotting these observations on a hemispherical model helps them identify the regular patterns of the Sun’s apparent motion across the sky (Figure 2).
Earth’s rotation causes day and night and creates the Sun’s apparent daily motion. Along the horizontal (east–west) axis, the Sun moves rapidly due to rotation, while vertical (north–south) changes related to the seasons occur more gradually. In classroom activities, students record the Sun’s position on a hemispherical model to determine azimuth and altitude angles, helping them recognize patterns of solar motion. Diagrams or animations of Earth’s orbit are often used to link the Sun’s position with seasonal temperature variations. In Taiwan, elementary astronomy lessons are scheduled in September or October, but typhoons, seasonal rains, and urban structures often impede observations. High summer temperatures further restrict outdoor activities. Ideally, students would track solar paths throughout all four seasons, yet practical constraints make this challenging. These limitations highlight the need for innovative teaching strategies and tools that provide accurate learning while sustaining student engagement in astronomy.

2.4. Situated Learning

Brown et al. [32] proposed situated learning theory, suggesting that educators design curricula using cognitive principles to help learners engage with cultural practices. In this framework, knowledge acts as a cognitive tool internalized through guided participation, collaborative problem-solving, and exploration in authentic contexts [33], thereby enhancing learning effectiveness. Situated learning transforms learners from passive recipients into active constructors of knowledge. By emphasizing authentic environments, guided practice, and expert mentorship, it supports knowledge construction, critical reflection, and the development of autonomous learning skills [34,35].
Young identified four core components of instructional design in situated learning environments [36]. First, learners engage with authentic exemplars while evaluating information from multiple perspectives [37]. Second, educators provide context-appropriate scaffolding in real-world settings to support complex cognitive tasks [38,39]. Third, teachers continuously assess progress, monitor information processing, and design contextualized activities to enhance skills [40]. Fourth, the effectiveness and relevance of situated learning interventions must be systematically evaluated and adapted to individual needs. Empirical research in science education emphasizes its practical efficacy. For example, Bursztyn et al. [41] implemented a situated learning approach augmented with virtual field simulations in geology education, which improved learner motivation, increased engagement, and lowered instructional costs.
Situated learning theory emphasizes authentic contexts, guided practice, and scaffolding to transform learners into active constructors of knowledge. Empirical evidence shows VR-supported situated learning enhances motivation, engagement, and cost-effectiveness in science education. Applying these principles, this study integrates VR360 and drone technology to develop a VR learning system that facilitates immersive, interactive learning, promotes cultural appreciation, and advances scientific understanding through autonomous, inquiry-based experiences.

2.5. Cognitive Load

Cognitive load theory describes the mental effort involved in learning [42]. Sweller emphasized that the limited capacity of working memory can impede learning when information-processing demands are excessive [43]. Effective instructional design therefore minimizes extraneous load and promotes the efficient transfer of information into long-term memory, for example, by segmenting complex content, using visual aids, and providing guided practice that gradually increases learner autonomy.
Cognitive load is typically divided into three categories. Intrinsic cognitive load is determined by the complexity of the material itself and varies with task difficulty [44]. Extraneous cognitive load arises from poor instructional design or irrelevant tasks, adding unnecessary mental effort that does not contribute to schema construction. Germane cognitive load refers to the mental resources devoted to schema building, which, although increasing cognitive demand, supports meaningful learning. Cognitive load can be assessed through multiple approaches. Subjective measures rely on learners’ self-reports, often using Likert-type scales. Physiological measures capture changes in learners’ biometric indicators, such as EEG signals [45], during tasks. Performance-based measures evaluate learners’ task outcomes to estimate the cognitive effort involved.
Recent research has examined cognitive load in technology-enhanced learning, particularly in virtual reality and augmented reality (AR) environments. Wu et al. [46] investigated learners’ cognitive load using portable Cardboard VR in science education. Huang et al. [47] combined AR and board games to explore programming instruction and its effects on cognitive load. Similarly, Armougum et al. simulated train station environments through VR storytelling tasks, while Andersen et al. analyzed cognitive load in VR training systems [48]. Together, these studies suggest that VR- and AR-based instruction generally does not impose excessive cognitive load on learners, emphasizing its potential to support engaging and effective learning experiences.

2.6. Learning Motivation

Motivation is crucial for shaping student engagement, learning outcomes, and long-term achievement. In the context of modern educational technologies, such as VR, AR, and drones, understanding the theoretical foundations of motivation is essential in leveraging these tools to enhance learning. This review examines key literature on motivation and its implications for applying VR and drones in education.
Stipek [49] emphasized that motivation is a dynamic construct influenced by instructional practices. She proposed that students’ motivation to engage with learning activities depends on their perceptions of ability, material relevance, and feedback received. In the case of VR and drones, these technologies provide opportunities for students to engage with learning content interactively, which can increase perceived relevance and challenge to foster motivation. For instance, the immersive environments created by VR technology make learning more engaging, encouraging deeper exploration and fostering a sense of competence, which is a key element in motivation.
Maehr and Meyer [50] further argued that motivation is context-dependent and influenced by both individual and social factors. Student motivation is shaped by their educational environment. VR and drones, offering personalized and authentic learning experiences, address diverse needs and integrate cultural and scientific knowledge, thereby fostering intrinsic motivation driven by curiosity and personal interest.
Ryan and Deci [51] introduced self-determination theory (SDT), which distinguishes between intrinsic and extrinsic motivation. According to SDT, fostering autonomy, competence, and relatedness is essential for enhancing intrinsic motivation. In virtual environments, students can explore content independently, supporting their motivation. For instance, interacting with VR simulations or controlling drones for scientific exploration can give students a sense of autonomy, which further drives their motivation.
In summary, the above literature demonstrates that motivation is essential for learning, particularly in technology-enhanced environments. By fostering autonomy, competence, and engagement through innovative tools such as VR and drones, educators can create motivating experiences that enhance learning outcomes.

3. Materials and Methods

This study integrates VR360 and drone technology to develop a VR system for improving effectiveness in learning Amis culture and science education. A quasi-experimental design was employed, with data collected through achievement tests and questionnaires. This section is organized into seven parts: Section 3.1 introduces the learning content of the VR system; Section 3.2 describes the scene models and virtual drone design; Section 3.3 presents the VR learning system; Section 3.4 illustrates virtual scene exploration with examples; Section 3.5 details the Sun path simulation model; Section 3.6 outlines the experimental design; and Section 3.7 presents the research instruments.

3.1. Learning Content

In this study, the learning content was designed based on Indigenous cultural and environmental resources of the Amis communities in Hualien, including the Kiwit and Jingpu tribes, the Tropic of Cancer Landmark, XinShe terraced fields, and the Xiouguluan River (Figure 3). To support exploratory learning, the VR system incorporated multiple observation points, each with interactive tasks integrating cultural, environmental, and astronomical knowledge (Table 1). The content was organized into three dimensions:
  • Amis Indigenous culture: traditional customs, festivals, and artifacts
  • Local environments: Xiuguluan River and XinShe terraced fields
  • Solar observation: acquiring astronomical concepts through recording the Sun’s path

3.2. Scene Models and Virtual Drone Design

This section outlines the construction of physical scene models, the development of the virtual drone, and a description of its operation within the VR learning system.

3.2.1. Using a Physical Drone for Constructing Scene Models

In this study, aerial imagery of the Amis Tribe region was captured using a DJI Mavic 2 Pro drone (Da-Jiang Innovations, Shenzhen, China) with a high-resolution camera (Figure 4). Its flight paths were planned in DJI GS Pro 2.0, and the obtained images were processed with Bentley iTwin Capture Modeler 1.8 for 3D virtual scene creation. The survey area spanned approximately 1000 × 1000 m, with flight altitudes between 100 and 120 m above ground. To ensure comprehensive coverage of the whole observation region and key landmarks, a multi-angle circular flight strategy combining top-down and oblique imagery was employed. Adjacent images had 70–80% overlap, enhancing photogrammetry accuracy and the realism of reconstructed virtual scenes. In areas with significant terrain variation, multiple passes from different angles were required to capture the full complexity of the landscape, accurately reproducing the Amis Tribe and local environmental features.
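The geometry linking flight altitude, camera field of view, and the 70–80% image overlap can be sketched as follows (a simplified flat-terrain, nadir-view approximation; the ~77° field of view assumed for the Mavic 2 Pro camera and the helper names `ground_footprint`/`photo_spacing` are illustrative, not part of DJI GS Pro):

```python
import math

def ground_footprint(altitude_m: float, fov_deg: float) -> float:
    """Width of terrain covered by a single nadir image over flat ground."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def photo_spacing(altitude_m: float, fov_deg: float, overlap: float) -> float:
    """Distance between successive exposures that yields the target overlap."""
    return ground_footprint(altitude_m, fov_deg) * (1.0 - overlap)

# At the mid-range altitude of 110 m with an assumed ~77 degree FOV,
# one image spans roughly 175 m of ground, so 75% overlap requires an
# exposure roughly every 44 m along the flight line.
footprint = ground_footprint(110, 77)
spacing = photo_spacing(110, 77, 0.75)
```

Higher overlap shortens the exposure spacing and lengthens the flight, which is the practical trade-off behind the 70–80% figure.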
This study employed an image-based Structure-from-Motion (SfM) algorithm combined with Multi-View Stereo (MVS) to construct a high-resolution 3D model of the Tropic of Cancer Landmark and neighboring regions. The SfM–MVS approach has been widely used for 3D modeling [52,53], as it can automatically estimate camera poses and reconstruct both sparse and dense 3D structures from multiple overlapping images without requiring a specific sequence, ultimately generating measurable point clouds and mesh models. The main steps of this method are described as follows:
  • Image acquisition: Images were captured around the landmark using a circular shooting strategy to ensure sufficient forward, backward, and lateral overlap, covering multiple viewing angles (Figure 5). This supports stable and reliable estimation of camera positions and orientations through aerial triangulation [54,55].
  • Feature detection and description: The Scale-Invariant Feature Transform (SIFT) algorithm was used to detect and describe key points in the images, capturing features that are invariant to scale and rotation.
  • Aerotriangulation (AT): Corresponding feature points were used for AT to estimate the interior and exterior orientation parameters of each image, while simultaneously reconstructing the sparse 3D structure of the scene (Figure 6).
  • Dense reconstruction and model generation: Based on the estimated exterior orientations and sparse structure, MVS was applied to perform dense depth estimation, generating high-density point clouds. These were then meshed and textured to produce a high-precision 3D model (Figure 7).
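The reprojection error that aerotriangulation drives toward zero can be illustrated with a minimal pinhole-camera sketch (a hypothetical nadir-view simplification with made-up coordinates and focal length, not the Bentley iTwin Capture implementation):

```python
import math

def project(point, cam_pos, focal):
    """Project a 3D ground point into a nadir-looking pinhole camera.

    The optical axis points straight down, so image coordinates are the
    horizontal offsets scaled by focal length over height above the point.
    """
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = cam_pos[2] - point[2]  # camera height above the point
    return (focal * dx / dz, focal * dy / dz)

def reprojection_error(point, cam_pos, focal, observed):
    """Pixel distance between predicted and measured image positions --
    the quantity minimized across all images during aerotriangulation."""
    u, v = project(point, cam_pos, focal)
    return math.hypot(u - observed[0], v - observed[1])

# A landmark corner at (10, 5, 2) m viewed from a camera at 110 m altitude:
err = reprojection_error((10.0, 5.0, 2.0), (0.0, 0.0, 110.0),
                         3000.0, (277.8, 138.9))
```

Bundle adjustment jointly refines camera poses and 3D points so that this error stays small in every overlapping image.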

3.2.2. Creating Virtual Drone and Controller Models

This study employed Blender 4.0 to create detailed 3D models of a virtual drone and its controller. The drone model was designed to resemble commercially available civilian drones, incorporating key components such as the fuselage, camera lens, propellers, and landing gear. Symmetrical elements were generated using Blender’s mirror function to optimize development efficiency. The controller was modeled with geometric primitives, such as cubes and cylinders, to represent buttons, joysticks, and shell recesses.
Texture maps for both the drone and controller were created using Adobe Photoshop 23.0 and applied within Blender to enhance visual realism (Figure 8). The completed 3D models were exported in the FBX format and imported into Unity Game Engine 2022.3.8f1, which supports cross-platform development for Windows and macOS as well as multiplayer network applications. In Unity, C# programming was used to implement the user interface and operational functions of the virtual drone. Upon completion, the system could be compiled into an APK file for deployment on Android devices, enabling interactive, immersive experiences for drone operation and landscape observation.
The virtual drone and controller models were imported into the Unity game engine and integrated with the 3D terrain model, serving as interactive assets within the user interface and learning environment. This modeling process balanced realism, efficiency, and usability, providing a foundation for immersive educational applications such as drone operation training and observation of local environments (Figure 9). At the development stage, the virtual drone was designed and tested on a desktop computer, integrated with the virtual scenes and user interface for refinement, and subsequently exported to Android smartphones, enabling students to experience immersive interaction through Google Cardboard headsets within the VR360 learning system.

3.2.3. Virtual Drone Operation

Users can freely control the virtual drone to explore and observe the landscape, for example, the Tropic of Cancer Landmark. When the drone exceeds the preset altitude limit, the VR system automatically switches to a first-person perspective, displaying the flight path over the reconstructed 3D landscape model. This design not only provides an intuitive and immersive operational experience but also enhances the realism of landscape visualization, improving learners’ spatial perception, situational awareness, and overall learning outcomes. For example, Figure 10a shows the drone hovering in mid-air, along with the shadow of the landmark. Figure 10b demonstrates that moving the left joystick downward causes the drone to retreat. Figure 10c shows that moving the right joystick upward raises the drone, revealing the top of the landmark. Figure 10d illustrates that moving the right joystick to the right rotates the drone clockwise, providing a comprehensive view of a house on the ground and surrounding features.
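The control mapping described for Figure 10, together with the altitude-triggered view switch, can be sketched as follows (an illustrative Python reconstruction; the actual system is implemented in Unity C#, and the 50 m threshold, speeds, and function names are assumptions):

```python
from dataclasses import dataclass

FIRST_PERSON_ALT = 50.0  # assumed preset altitude limit (m)

@dataclass
class DroneState:
    x: float = 0.0        # forward (+) / backward (-) position, m
    alt: float = 0.0      # altitude above ground, m
    heading: float = 0.0  # yaw, degrees clockwise

def update(state, left_y, right_y, right_x, dt,
           speed=5.0, climb=2.0, yaw_rate=45.0):
    """Advance one frame from joystick axes in [-1, 1].

    Mapping follows Figure 10: left stick down retreats, right stick up
    climbs, right stick right rotates the drone clockwise.
    """
    state.x += left_y * speed * dt
    state.alt += right_y * climb * dt
    state.heading = (state.heading + right_x * yaw_rate * dt) % 360.0
    return state

def view_mode(state):
    """Switch to first-person view once the drone exceeds the preset limit."""
    return "first_person" if state.alt > FIRST_PERSON_ALT else "third_person"

# Holding the right stick fully up for 30 s climbs past the threshold:
drone = DroneState()
for _ in range(30):
    update(drone, 0.0, 1.0, 0.0, dt=1.0)
```

In the real system the same per-frame update runs in Unity, with the camera re-parented when the view mode changes.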

3.3. VR Learning System

The VR learning system was developed for execution on smartphones using Cardboard headsets to explore Amis culture and local environments. From a VR360 panoramic perspective, learners can view real-world landscapes and interact with objects in the virtual space. By triggering actions such as rotation, observation, and information browsing, learners can engage with content to gain knowledge about Amis culture, local environments, and astronomical concepts such as the Sun and four seasons.
In the design process, panoramic images were captured using a 360-degree camera, while photographs and videos of local environments were taken with mobile devices. These resources were processed and integrated into the Unity game engine, where additional objects—including ecological images, texts, videos, 3D models, and UI/UX components—were embedded. Interactive feedback was provided via the Google VR SDK for Unity v1.200.0. The system was then exported to Android smartphones for use with Cardboard headsets, creating a lightweight and portable VR learning platform (Figure 11).
The VR system for exploring Amis culture and local environments was developed in a Microsoft Windows 10 environment using Visual Studio 2019, with programming and integration in C#. The system employed the Google VR SDK for Unity v1.200.0 (Figure 12), enabling trigger events and interactive functions for virtual objects. Figure 13 illustrates the system architecture, including the main themes and learning content. The virtual scene was created with Cinema 4D (C4D), images with Adobe Photoshop 23.0 and Illustrator 28.0, and videos edited using Adobe Premiere CS6 6.0.5. The development process involved:
  • Designing learning content focused on Amis culture and science education
  • Preparing photos, videos, and textual materials to support knowledge points
  • Developing virtual scenes and user interfaces within the VR learning system
  • Exporting the VR learning system and installing it on smartphones

3.4. System Operation

Prior to instruction, participants received a 15-minute training session to ensure familiarity with the system operation. During the learning process, instructional prompts guided learners across five thematic modules, while worksheets were employed to record observations from each site. Interactive icons embedded within virtual scenes provided explanations of plants, animals, cultural features, or navigation cues (Figure 14). Task completion was tracked through status icons displayed in the interface, and the worksheets functioned as a structured review activity.
The VR learning system was designed based on the principles of situated learning. After entering the system, learners first accessed the main interface and then an exploration hall displaying the available observation sites and knowledge points (Figure 15). To complete the instructional sequence, learners were required to visit all sites, upon which the system unlocked assessment activities.
To enhance the authenticity of the situated learning experience, multiple interaction triggers were integrated into the system (Figure 16). Within each virtual scene, learners could freely navigate natural environments, access explanatory media, and complete observation tasks. This design seamlessly combined cultural and ecological knowledge with interactive VR experiences, thereby fostering engagement and supporting deeper learning. Moreover, the situated learning framework ensured that learners interacted with content in contextually meaningful ways, directly aligning with the study’s objectives of enhancing motivation, reducing cognitive load, and improving learning achievement.
The Xiuguluan River (Figure 17) originates from Xiuguluan Mountain in Taiwan’s Central Mountain Range and is the largest river in eastern Taiwan. It is also the only river that cuts through the Coastal Mountain Range. The section from Ruisui to Dagangkou stretches about 24 km, passing through more than twenty rapids and shoals, with majestic gorges and unique rock formations along the way, making it a popular destination for white-water rafting. It is one of Taiwan’s few unpolluted rivers, home to over 30 species of freshwater fish, among which the Formosan sucker (Acrossocheilus paradoxus), a protected species, is the most valuable. The lower valley preserves patches of tropical rainforest, where Formosan macaques are commonly seen, and more than 100 bird species have been recorded, reflecting the river’s rich biodiversity.
The Jingpu Tropic of Cancer Landmark (Figure 18), located in Fengbin Township, Hualien, is a significant geographical and cultural site marking the latitude of 23.5° N, where the Tropic of Cancer crosses Taiwan. Symbolizing the boundary between subtropical and tropical climate zones, the landmark provides educational value in both astronomy and geography. Designed as a sleek white tower, it highlights seasonal variations in solar altitude and serves as an ideal venue for observing the Sun’s trajectory, particularly during the summer solstice. The site attracts visitors for scientific exploration, cultural tourism, and its scenic coastal views. In the virtual scene, learners can observe the Tropic of Cancer Landmark rising prominently from the ground. By focusing on the designated information point, they can access a Sun path simulation model to explore variations in solar trajectories and the causes of the four seasons.

3.5. Sun Path Simulation Model

This study developed a Sun path simulation model to examine the solar trajectory within the virtual scene of the Jingpu Tropic of Cancer Landmark. The model integrates a virtual Sun synchronized with local time, a shadow-casting sundial for calculating solar azimuth and altitude angles, and a hemispherical sky model for recording the solar trajectory [56]. Based on the local position, date, and time, the model displays the Sun according to its azimuth and altitude angles, together with the corresponding shadow of the sundial. For learners, the solar altitude angle can also be derived as the arctangent of the sundial’s height divided by its shadow length. They can determine the Sun’s position by measuring the sundial’s shadow length and azimuth, and record the position, current date, and time on the hemispherical model. By adjusting observation times, students can explore how the Sun’s trajectory varies across the seasons (Figure 19).
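The sundial relation can be sketched as a short helper (illustrative Python, not the system’s C# code): since tan(altitude) = gnomon height / shadow length, the altitude angle is the arctangent of that ratio.

```python
import math

def solar_altitude_deg(gnomon_height_m: float, shadow_length_m: float) -> float:
    """Solar altitude from a sundial: tan(altitude) = height / shadow."""
    return math.degrees(math.atan2(gnomon_height_m, shadow_length_m))

# A 1 m gnomon casting a 1 m shadow puts the Sun 45 degrees above the
# horizon; a longer shadow means a lower Sun.
alt_equal = solar_altitude_deg(1.0, 1.0)
```

This is the same computation learners perform by hand when recording positions on the hemispherical model.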
The Sun path observation model was developed on a Windows 10 platform, using Java Development Kit (JDK) 21, Android Software Development Kit (SDK) 35.0, Eclipse IDE 4.34, and the Android Development Tools (ADT) 23.0.7 plug-in for Eclipse as the Android development environment. The Unity game engine was the primary development tool, with C# programming for system functionality. The ADT Bundle and Android SDK Tools 26.1.1 were installed to access mobile device sensors and display functionalities, supplementing Unity’s built-in capabilities. The model can provide timely and location-specific instructional content on the solar trajectory and transforms recorded data into organized and meaningful knowledge. For example, it illustrates the relationship between the Sun–Earth relative position and changes in solar azimuth and altitude, integrating observational data with spatial cognition to build accurate concepts of the solar trajectory.
Teachers can guide students to explore the relationship between the Sun and the causes of four seasons by explaining that Earth completes one revolution around the Sun in a year and that its axis is tilted at 23.5°. This tilt causes the position of direct sunlight on Earth’s surface to shift with the seasons. At the summer solstice, sunlight falls directly on the Tropic of Cancer; at the spring and autumn equinoxes, it strikes the equator; and at the winter solstice, it falls on the Tropic of Capricorn. Students can also learn from this model that Earth’s rotation produces the Sun’s apparent daily motion and is responsible for the alternation of day and night.
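This seasonal reasoning can be sketched with a standard first-order approximation of solar declination (a simplified illustration, not the simulation model’s implementation; taking day 81 as the March equinox is an assumption of the approximation):

```python
import math

TILT = 23.5  # Earth's axial tilt, degrees

def declination_deg(day_of_year: int) -> float:
    """Approximate solar declination: 0 at the equinoxes, +/-23.5 at the solstices."""
    return TILT * math.sin(2.0 * math.pi * (day_of_year - 81) / 365.0)

def noon_altitude_deg(latitude_deg: float, day_of_year: int) -> float:
    """Approximate solar altitude at local solar noon."""
    return 90.0 - abs(latitude_deg - declination_deg(day_of_year))

# At the Jingpu landmark (23.5 N) the noon Sun is nearly overhead at the
# June solstice (around day 172) but only about 66.5 degrees high at the
# equinoxes -- the seasonal variation learners record on the model.
solstice_alt = noon_altitude_deg(23.5, 172)
equinox_alt = noon_altitude_deg(23.5, 81)
```

The near-90° solstice value is exactly why the Tropic of Cancer Landmark is an ideal observation site at the summer solstice.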

3.6. Experimental Design

This study adopted a quasi-experimental design to compare the learning effectiveness of the VR learning system with that of traditional teaching methods in the context of cultural and science education. Participants were assigned to either the experimental or control group. The experimental group received instruction using the VR learning system, whereas the control group was taught with PowerPoint slides and instructional videos. All instructional materials were designed by the researchers and delivered by the same teacher to ensure consistency in content and instructional style. Prior to instruction, both groups completed a pre-test to assess prior knowledge. The instructional process was divided into three sequential phases: preparation, instruction, and summary. During the preparation phase, both groups were introduced to solar-related knowledge through teacher explanations to stimulate learning motivation.
During the instructional phase, the two groups followed different approaches: the experimental group engaged in exploratory learning with the VR system and used worksheets to record observations, whereas the control group learned with PowerPoint slides and worksheets to reinforce understanding through observation and note-taking. After instruction, both groups completed a post-test to assess learning effectiveness. At the end of the experiment, all participants filled out learning motivation and cognitive load scales, and the experimental group additionally completed a system satisfaction questionnaire to evaluate their user experience with the VR learning system. The collected data were then analyzed to derive findings and draw research conclusions (Figure 20).
The VR group engaged in exploratory learning by navigating immersive 360° scenes and virtual drone operation to explore local cultural and ecological sites. They identified natural and Indigenous features, observed solar movements using the Sun path simulation model, calculated solar orientation and altitude angles, and recorded observations on worksheets. These tasks emphasized interactive exploration, spatial reasoning, and inquiry-based reflection. In contrast, the control group completed similar content-based tasks using PowerPoint slides and instructional videos without interactive components. Their activities focused on passive observation and note-taking rather than active manipulation, spatial analysis, or experiential engagement within a virtual environment.

3.7. Research Instruments

This study employed four research instruments to evaluate the effectiveness of different instructional approaches: an achievement test, a learning motivation scale, a cognitive load scale, and a system satisfaction questionnaire. The design rationale, item structure, and analytical methods for each instrument are described below.

3.7.1. Achievement Test

The achievement test was developed based on instructional content related to Amis culture and the local environment, including the solar observation activity. It comprised 16 multiple-choice questions, each with four options. To ensure both reliability and validity, the test items were reviewed by two science teachers with professional expertise. The test was administered before and after the instruction to allow pre–post comparison. For statistical analysis, this study adopted a one-way analysis of covariance (ANCOVA), using pre-test scores as the covariate, post-test scores as the dependent variable, and instructional methods as the independent variable, to examine whether different instructional approaches significantly affected students’ learning outcomes.
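The extra-sum-of-squares logic behind this ANCOVA can be sketched in plain Python (illustrative data only, not the study’s raw scores; `ols_rss` and `ancova_f` are hypothetical helpers, and a real analysis would use a statistics package such as SPSS):

```python
def ols_rss(y, columns):
    """Residual sum of squares from ordinary least squares, solving the
    normal equations (X'X)b = X'y by Gaussian elimination (no numpy)."""
    k, n = len(columns), len(y)
    xtx = [[sum(columns[i][r] * columns[j][r] for r in range(n))
            for j in range(k)] for i in range(k)]
    xty = [sum(columns[i][r] * y[r] for r in range(n)) for i in range(k)]
    for c in range(k):  # forward elimination with partial pivoting
        p = max(range(c, k), key=lambda r: abs(xtx[r][c]))
        xtx[c], xtx[p] = xtx[p], xtx[c]
        xty[c], xty[p] = xty[p], xty[c]
        for r in range(c + 1, k):
            f = xtx[r][c] / xtx[c][c]
            for cc in range(c, k):
                xtx[r][cc] -= f * xtx[c][cc]
            xty[r] -= f * xty[c]
    b = [0.0] * k
    for c in reversed(range(k)):  # back substitution
        b[c] = (xty[c] - sum(xtx[c][j] * b[j] for j in range(c + 1, k))) / xtx[c][c]
    return sum((y[r] - sum(b[j] * columns[j][r] for j in range(k))) ** 2
               for r in range(n))

def ancova_f(pre, post, group):
    """F for the group effect with pre-test as covariate: compare the full
    model (intercept + pre + group) against the reduced model (intercept + pre)."""
    n = len(post)
    ones = [1.0] * n
    rss_full = ols_rss(post, [ones, pre, group])
    rss_reduced = ols_rss(post, [ones, pre])
    return (rss_reduced - rss_full) / (rss_full / (n - 3))

# Illustrative scores: the second group gains about 20 extra points at
# every pre-test level, so the adjusted group effect should be large.
pre = [30.0, 35.0, 25.0, 40.0, 32.0, 28.0, 31.0, 36.0]
grp = [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0]
post = [61.0, 64.0, 57.0, 68.0, 81.0, 80.0, 79.0, 87.0]
f_stat = ancova_f(pre, post, grp)
```

Conceptually, a large F means the instructional method explains post-test variance beyond what prior knowledge already accounts for.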

3.7.2. Learning Motivation Scale

To assess students’ motivation under different instructional modes, this study employed the learning motivation scale developed by Hwang, Yang, and Wang [57]. The scale used a five-point Likert format (1 = strongly disagree to 5 = strongly agree), with higher scores indicating greater motivation. It measured multiple dimensions, including learning interest, preference for challenge, and willingness to sustain effort. Data were analyzed using SPSS Statistics 25, calculating group means and conducting independent-samples t-tests to examine significant differences between the two groups.

3.7.3. Cognitive Load Scale

Cognitive load was assessed using the scale originally developed by Thomashow [58] and later revised by Hwang, Yang, and Wang [57]. The scale included eight items rated on a five-point Likert scale (1 = strongly disagree to 5 = strongly agree) and comprised two dimensions: mental load, assessing the difficulty of the learning content or task, and mental effort, evaluating the cognitive resources invested, with lower scores indicating lower cognitive load. Data were analyzed using SPSS Statistics 25 by calculating mean scores and performing independent-samples t-tests to determine whether the instructional methods produced significant differences in cognitive load.

3.7.4. System Satisfaction Questionnaire

The system satisfaction questionnaire was administered exclusively to the experimental group to assess their perceptions and user experiences with the VR learning system. Based on the Technology Acceptance Model (TAM) proposed by Davis [59], the questionnaire used a five-point Likert scale (1 = strongly disagree to 5 = strongly agree), with higher scores indicating greater acceptance and satisfaction. Descriptive statistics, including means and standard deviations, were calculated to evaluate perceived usefulness and ease of use, providing insights for future system improvements.

4. Results

The participants were 47 third-grade Indigenous students from an elementary school in Taoyuan County, Taiwan. Written informed consent was obtained from parents or legal guardians to ensure ethical compliance. Students were assigned to an experimental group (n = 26), which used the VR learning system, or a control group (n = 21), which received slide-based instruction (Figure 21). A pre-test confirmed no significant differences in prior knowledge between the two groups before the intervention. After instructional treatment using different approaches, both groups completed post-tests and scales to assess learning effectiveness, motivation, and cognitive load, with the experimental group additionally completing a system satisfaction questionnaire.

4.1. Learning Achievement

As shown in Table 2, the experimental group, which used the VR learning system to explore Amis culture and local environments, achieved a pre-test mean of 32.88 (SD = 11.85) and a post-test mean of 87.31 (SD = 6.20). The control group, which received slide-based instruction, scored 29.05 (SD = 9.57) on the pre-test and 63.57 (SD = 14.16) on the post-test. These results indicate that the experimental group not only outperformed the control group but also demonstrated greater improvement, with a smaller post-test standard deviation suggesting more consistent overall performance. The Shapiro–Wilk test confirmed that the data were normally distributed; therefore, parametric tests (paired and independent-samples t-tests) were applied to examine within- and between-group differences.
Paired-samples t-tests were conducted to examine within-group improvements in learning achievement following different instructional approaches. As shown in Table 3, the experimental group demonstrated a mean progress of 54.42 (SD = 13.29), t = 20.87, p < 0.001, indicating a highly significant improvement. The control group also exhibited a significant gain, with a mean increase of 34.52 (SD = 15.57), t = 10.16, p < 0.001. These results confirm that both instructional methods enhanced learning achievement, with the effect being stronger for the VR-based approach.
To further examine the effect of instructional methods, a one-way ANCOVA was conducted, using pre-test scores as the covariate, instructional methods as the independent variable, and post-test scores as the dependent variable. Prior to analysis, assumptions of normality, homogeneity, and linearity were tested. Levene’s test for homogeneity of variance was not significant (p = 0.383 > 0.05), confirming equal variances. As shown in Table 4, the ANCOVA revealed a significant effect of instructional methods on post-test scores, F = 54.64, p < 0.001, partial η2 = 0.56, indicating a strong effect size. These results demonstrate that the VR-based instructional method had a significantly greater positive impact on learning achievement compared with slide-based instruction.

4.2. Learning Motivation Results

According to the results in Table 5, the experimental group reported a mean learning motivation score of 4.24 (SD = 0.53), whereas the control group scored 3.49 (SD = 0.79). An independent-samples t-test revealed a significant difference between the two groups, t = 3.90, p < 0.001, indicating that students in the experimental group reported significantly higher learning motivation after using the VR system to explore Amis culture and local environments than those in the control group.
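As a rounding check, the reported statistic can be recomputed from the published summary values with the pooled-variance formula (a sketch; the small gap from the reported t = 3.90 reflects rounding of the published means and SDs):

```python
import math

def pooled_t(m1, sd1, n1, m2, sd2, n2):
    """Independent-samples t statistic (pooled variance) from summary stats."""
    sp2 = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2)
    se = math.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
    return (m1 - m2) / se

# Motivation means from Table 5: VR group 4.24 (SD 0.53, n = 26) versus
# control 3.49 (SD 0.79, n = 21); the rounded summaries give t of about 3.88.
t_motivation = pooled_t(4.24, 0.53, 26, 3.49, 0.79, 21)
```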
As shown in Table 6, the experimental group consistently scored higher than the control group, with most items exceeding 4 on the five-point scale, corresponding to ratings between “satisfied” and “very satisfied.” These results suggest that the VR learning system enhanced learners’ motivation by providing an engaging experience through Cardboard VR and interactive virtual drone operation. Notably, the largest difference was observed in Item 1, “During this learning activity, I like challenging content since it helps me acquire new knowledge,” suggesting that VR instruction particularly enhanced engagement with challenging science content about the Sun, which third graders often find difficult. In contrast, Item 6, “I want to perform well in this activity because demonstrating my abilities to family, teachers, and friends matters to me,” showed a weaker effect. This suggests that learners’ extrinsic expectations—such as seeking approval from others—were less affected by the instructional method. Overall, the results indicate that the VR learning system, supplemented with worksheets on Amis culture and local environments, effectively increased students’ motivation and interest in learning.

4.3. Cognitive Load Results

The experimental group (n = 26) reported a mean cognitive load score of 2.09 (SD = 0.63), whereas the control group reported a mean score of 3.44 (SD = 0.88). An independent-samples t-test revealed a significant difference between the two groups, t = −6.152, p < 0.001, indicating that students in the experimental group experienced significantly lower cognitive load after receiving immersive instruction with the VR learning system (Table 7).
As shown in Table 8, the mean scores of the experimental group were consistently lower than those of the control group, with most items scoring below 2, falling between “strongly disagree” and “disagree.” However, Items 5, 6, 7, and 8 scored above 2, indicating responses between “neutral” and “disagree.” Analysis of these items provides further insights. Item 5 and Item 8 suggest that some students found it challenging to operate the VR system and keep pace with the lesson at the same time, which occasionally prevented them from viewing all available information in the virtual scene. Some students became deeply immersed in the VR environment, focusing on observation to the extent that they lagged behind in answering questions or completing worksheets. Item 6 indicates that, while third-grade students engaged with science content on solar observation, some perceived the material as relatively difficult. Item 7 suggests that unfamiliarity with the operation of Cardboard VR also contributed to increased effort, as students needed time to adapt to the hardware before fully focusing on learning.

4.4. System Satisfaction

As shown in Table 9, most items on the system satisfaction questionnaire exceeded a score of 4, indicating that learners’ satisfaction ranged between “satisfied” and “very satisfied.” The overall mean score of the system satisfaction questionnaire was 4.31, reflecting a high level of satisfaction with the VR system for exploring Amis culture and local environments. Among the subscales, “Perceived Usefulness” had a mean score of 4.40, while “Perceived Ease of Use” scored 4.23. One item in the latter subscale, Item 8, “I was able to master the use of the system in a short time,” scored slightly below 4, suggesting that some learners initially found the system unfamiliar and required more time to become proficient. This finding indicates that providing additional training before learning could help minimize initial difficulties and further enhance the overall user experience.

5. Discussion

This study integrates VR360 and drone technology to develop a VR learning system for elementary science education on Amis culture and local environments. Learning achievement results confirmed the effectiveness of VR-based instruction, with the experimental group showing substantially greater post-test gains. The integration of immersive exploration with structured worksheets likely facilitated deeper processing by enabling embodied interaction with abstract concepts, thereby enhancing understanding through sensorimotor grounding and constructivist learning processes.
The VR learning system significantly enhanced student motivation compared with traditional slide-based instruction (M = 4.24 vs. 3.49, Cohen’s d = 1.16), particularly by stimulating intrinsic interest and willingness to engage with challenging tasks. This effect suggests that the VR system reduces perceived difficulty through multimodal feedback that makes abstract content more concrete, thereby reinforcing the competence dimension of Self-Determination Theory. In contrast, improvements in extrinsic motivation were less pronounced, indicating that the VR system primarily strengthens learners’ intrinsic drive rather than externally driven expectations. This distinction highlights the need to foster sustained engagement and effective knowledge transfer.
Contrary to common concerns about cognitive overload, the experimental group reported reduced mental effort, consistent with Cognitive Load Theory when multimedia principles are effectively applied. The VR system appears to alleviate extraneous load through coherent presentation and to reduce intrinsic load for spatial concepts via 3D visualization, thereby enabling deeper processing. This finding reinforces the perspective that immersive environments can reduce cognitive burden by providing contextualized and intuitive representations. Minor challenges related to time constraints and device familiarity were noted, reflecting typical patterns of technology adoption. However, these issues were outweighed by sustained cognitive benefits, emphasizing the importance of adequate orientation and practice prior to full classroom implementation.
Learners reported high satisfaction with the VR learning system (M = 4.31), particularly emphasizing its usefulness (M = 4.40) and ease of use (M = 4.23), thereby validating the Technology Acceptance Model and supporting its sustainable adoption. Although some students required additional guidance in mastering the VR device, as reflected in lower ratings for acquisition speed (M = 3.96), the overall positive feedback demonstrates the feasibility of integrating VR and drone technology into elementary science education and emphasizes the importance of providing adequate training guidelines.
The findings suggest that the VR system enhances learning through multiple mechanisms, including increased intrinsic motivation, effective cognitive load management, and embodied engagement with content. These results offer evidence-based guidance for instructional design by emphasizing established learning principles rather than relying solely on technological novelty. However, this study has several limitations: its short duration precludes assessment of long-term retention, the small sample size restricts generalizability, and individual differences were not examined.

6. Conclusions and Future Work

This study integrated VR360 and drone technology to develop a VR system for teaching Amis culture and local environments within elementary natural science curricula. The results revealed that, when combined with task-based worksheets and observational activities, the VR system significantly enhanced students’ learning motivation, reduced cognitive load, and improved learning achievement compared with slide-based instruction. Learners also reported high satisfaction, indicating that the system enhanced their understanding of cultural and astronomical concepts while fostering curiosity and engagement. Nonetheless, some usability challenges related to device familiarity were identified, suggesting the need for further training or design refinements.
Future research could proceed in several directions. One promising approach is to explore the system’s cultural scalability by adapting it for other Indigenous communities in Taiwan and beyond, thereby evaluating its effectiveness across diverse cultural and educational contexts. Another primary direction is to enrich the system with advanced content formats, including interactive 3D celestial models, dynamic simulations, and augmented-reality overlays, to enhance immersion and provide more authentic, engaging learning experiences. Longitudinal studies are also needed to assess knowledge retention, learning transfer, and the sustained impact of VR on motivation and engagement. Furthermore, investigating individual differences—such as prior knowledge, spatial ability, and learning styles—is essential for refining the system to support differentiated instruction and maximize its effectiveness for diverse learners.
In addition, comparative studies could examine the system’s effectiveness relative to other immersive technologies, such as augmented reality, mixed reality, or desktop VR, to clarify its unique strengths and limitations. From a practical perspective, research on classroom management strategies, teacher training models, and large-scale deployment is also needed to support real-world adoption. Finally, given the affordability and potential of Cardboard VR and virtual drones in education, the VR system could be applied to informal learning, cultural preservation, tourism, and professional training, thereby extending its impact beyond the classroom.
In Taiwan, Indigenous students represent a small population in elementary schools, limiting participant recruitment and the broader applicability of findings. Subsequent studies could include multiple schools across regions, increasing participant numbers and diversity, to enhance representativeness and validate the VR learning system’s effectiveness across broader Indigenous and non-Indigenous student populations.
Future work should explore the sustained effects of VR instruction across diverse populations, evaluate cost-effectiveness, and investigate improvements in scalability and usability to realize its educational potential and support sustainable classroom implementation. In addition, the research could actively involve Amis community members in co-designing VR content, providing feedback, and guiding cultural interpretation to ensure authentic representation and avoid one-sided perspectives.

Author Contributions

Conceptualization, Kuo-Liang Ou; methodology, Wernhuar Tarng; software, Tsu-Jen Ding; formal analysis, Yu-Jung Wu; writing—original draft preparation, Yu-Jung Wu; writing—review and editing, Jen-Chu Hsu. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Science and Technology Council (NSTC), Taiwan, under grant number 112-2410-H-007-043-MY2.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Research Ethics Committee of National Tsing Hua University, Taiwan (REC No. 11112HT130, 10 February 2022).

Informed Consent Statement

Written informed consent has been obtained from the participant(s) to publish this paper.

Data Availability Statement

Data available on request due to restrictions.

Conflicts of Interest

Author Jen-Chu Hsu is employed by the company Coretronic Corporation. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figure 1. The causes of four seasons: Earth’s tilt and orbit around the Sun.
Figure 2. Observing the Sun’s motion using stick shadows.
Figure 3. Observation points and their locations on Google Maps.
Figure 4. Aerial imagery of the Amis Tribe using a DJI Mavic 2 Pro drone.
Figure 5. On-site drone photography of the Tropic of Cancer Landmark.
Figure 6. Results of aerotriangulation (AT) for sparse 3D scene reconstruction.
Figure 7. Results of dense reconstruction and 3D model generation.
Figure 8. Designing 3D models of a virtual drone and its controller.
Figure 9. Integration of the virtual drone and 3D terrain models in Unity game engine.
Figure 10. Exploration and observation of the Tropic of Cancer Landmark using the virtual drone: (a) hovering, (b) retreating, (c) ascending, and (d) rotating clockwise.
Figure 11. Development process of the VR learning system.
Figure 12. Developing virtual scenes using the Unity game engine.
Figure 13. Main themes and virtual scenes of the VR learning system.
Figure 14. Instructional prompts and interactive icons within the virtual scene.
Figure 15. VR system interface showing observation sites and associated knowledge points.
Figure 16. Observation point presenting Amis singing and dancing activities.
Figure 17. Virtual scene illustrating the ecological environment of the Xiuguluan River.
Figure 18. Virtual scene of the Jingpu Tropic of Cancer Landmark.
Figure 19. Three-dimensional model of the Tropic of Cancer Landmark for observing the solar trajectory.
Figure 20. Flowchart of the teaching experiment for both groups.
Figure 21. Experimental group (left): exploratory learning with the VR system; control group (right): traditional instruction with PowerPoint slides and worksheets.
Table 1. Virtual scenes, categories, and content descriptions in the VR learning system.

| Virtual Scene | Category | Content Description |
|---|---|---|
| Jingpu Tribe | Attraction | The Jingpu Tribe is situated in Fengbin, Hualien County, at the mouth of the Xiuguluan River. Known for its rich coastal ecology and abundant fishing grounds, the tribe’s primary industry is fishing. |
| | Sun Tribe | To the east lies the Pacific Ocean, making the Jingpu community closest to the sunrise. A prominent rock within the tribe, known as the “Sun Stone” by the elders, has given the community its alternative name, the Sun Tribe. |
| Tropic of Cancer Landmark | Attraction | The Tropic of Cancer lies at 23.5° N latitude, marking the northern boundary of the Sun’s direct rays on Earth. The Jingpu Tropic of Cancer Landmark has a narrow slit; at noon on the summer solstice, sunlight shines directly through it, allowing users to experience the phenomenon of no shadow under the Sun. |
| | Natural Ecology | Facing the Pacific Ocean to the east, Jingpu is the Amis community closest to the sunrise. A notable rock within the tribe, called the “Sun Stone” by the elders, has inspired the community’s alternative name, the Sun Tribe. |
| Xiouguluan River | Attraction | Originating from Xiouguluan Mountain, the Xiouguluan River stretches 103 km, with abundant water year-round, making it the largest river in eastern Taiwan. Its winding course passes through Ruisui and cuts across the Coastal Mountain Range. From May to October during the high-water season, visitors can enjoy the thrill of river rafting. |
| | Natural Ecology | The Xiouguluan River is one of Taiwan’s few unpolluted rivers, nurturing over 30 species of freshwater fish, including the rare Formosan Shoveljaw Carp (Varicorhinus alticorpus), a protected species. Along its lower gorges, tropical rainforests remain intact. Among mammals, Formosan macaques are most commonly seen, and more than 100 bird species have been recorded, reflecting the river’s rich biodiversity. |
| Amis Harvest Festival | Cultural Features | The Harvest Festival is held after the millet harvest, when the Amis people give thanks to the spirits. With its grand scale and multifaceted activities, the festival embodies diverse cultural characteristics. |
| | Festivals | Although named for harvest, the festival includes thanksgiving, social gatherings, inter-age group promotion ceremonies, and military training inspections. It is a comprehensive event encompassing economic, religious, social, political, and cultural aspects. |
| Traditional Arts | Handicrafts | The Amis people use plants from the natural environment to create household utensils through handicraft skills. Wooden items include musical drums, clappers, and spoons; bamboo items include water containers, bird scarers, and bamboo cannons; rattan is woven into fish baskets; shell ginger leaves and hollow grasses are woven into mats; and Amis pottery is also renowned. |
| | Song and Dance Performance | The Amis are not only naturally fond of singing but also passionate about dancing. During performances, they wear brightly colored costumes, with bronze bells tied to wrists and ankles. The rhythm of the bells combines with dance steps and songs, creating performances that are both graceful and spectacular, one of the most distinctive features of Amis culture. |
Table 2. Descriptive statistics of the pre-test and post-test scores for both groups.

| Group | Pre-Test Mean | Pre-Test SD | Post-Test Mean | Post-Test SD |
|---|---|---|---|---|
| Experimental Group (n = 26) | 32.88 | 11.85 | 87.31 | 6.20 |
| Control Group (n = 21) | 29.05 | 9.57 | 63.57 | 14.16 |
Table 3. Results of paired samples t-tests in learning achievement for both groups.

| Group | Number | Mean | SD | t |
|---|---|---|---|---|
| Experimental Group | 26 | 54.42 | 13.29 | 20.87 *** |
| Control Group | 21 | 34.52 | 15.57 | 10.16 *** |

*** p < 0.001.
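As a numerical sanity check (not part of the original article), the t values in Table 3 can be reproduced from the reported statistics: the Mean column equals each group's post-test mean minus its pre-test mean (e.g., 87.31 − 32.88 ≈ 54.42), so the paired t-test reduces to a one-sample t on the gain scores. A minimal Python sketch, with a helper name of our own choosing:

```python
from math import sqrt

def paired_t_from_gain_stats(mean_gain, sd_gain, n):
    """One-sample t on gain scores (post minus pre), equivalent to a paired t-test."""
    return mean_gain / (sd_gain / sqrt(n))

t_exp = paired_t_from_gain_stats(54.42, 13.29, 26)   # experimental group, ~20.9
t_ctrl = paired_t_from_gain_stats(34.52, 15.57, 21)  # control group, ~10.2
```

Both values match the reported 20.87 and 10.16 up to rounding of the published means and standard deviations.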
Table 4. One-way ANCOVA results of learning effectiveness for the two groups.

| Source | SS | df | F | p | η² |
|---|---|---|---|---|---|
| Pre-test Score | 453.81 | 1 | 4.04 | 0.051 | 0.086 |
| Group | 6142.64 | 1 | 54.64 | 0.000 *** | 0.56 |
| Error | 4917.39 | 44 | | | |
| Total | 288025 | 47 | | | |

*** p < 0.001.
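The ANCOVA entries in Table 4 are internally consistent: F is the effect mean square divided by the error mean square, and the reported η² of 0.56 corresponds to partial eta squared, SS_group / (SS_group + SS_error). A minimal sketch (the helper name is ours; small deviations from the printed F values reflect rounding of the published sums of squares):

```python
def ancova_from_ss(ss_effect, df_effect, ss_error, df_error):
    """F ratio and partial eta squared from ANCOVA sums of squares."""
    ms_effect = ss_effect / df_effect
    ms_error = ss_error / df_error
    return ms_effect / ms_error, ss_effect / (ss_effect + ss_error)

f_group, eta_group = ancova_from_ss(6142.64, 1, 4917.39, 44)  # ~54.96, ~0.56
f_pre, _ = ancova_from_ss(453.81, 1, 4917.39, 44)             # ~4.06
```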
Table 5. Independent samples t-test results of learning motivation for both groups.

| Group | Number | Mean | SD | t |
|---|---|---|---|---|
| Experimental Group | 26 | 4.24 | 0.53 | 3.90 *** |
| Control Group | 21 | 3.49 | 0.79 | |

*** p < 0.001.
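Similarly, the independent-samples t values in Tables 5 and 7 follow from the group means, standard deviations, and sizes under the usual pooled-variance formula. A minimal Python sketch (our own helper, assuming equal variances; the small gap to the printed −6.152 in Table 7 comes from rounding of the reported means):

```python
from math import sqrt

def independent_t(m1, s1, n1, m2, s2, n2):
    """Pooled-variance (equal-variance) independent-samples t statistic."""
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))

t_motivation = independent_t(4.24, 0.53, 26, 3.49, 0.79, 21)  # Table 5, ~3.9
t_cogload = independent_t(2.09, 0.63, 26, 3.44, 0.88, 21)     # Table 7, ~-6.1
```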
Table 6. Descriptive statistics of learning motivation for the two groups.

| Item | Experimental Mean | Experimental SD | Control Mean | Control SD | p |
|---|---|---|---|---|---|
| 1. During this learning activity, I like challenging content since it helps me acquire new knowledge. | 4.50 | 0.91 | 3.62 | 0.87 | 0.001 *** |
| 2. I prefer materials that stimulate my curiosity in this activity, even when they are demanding. | 4.12 | 0.86 | 3.43 | 1.12 | 0.022 * |
| 3. If given the choice, I would select courses that allow me to learn, even if it affects my grades. | 4.19 | 0.75 | 3.29 | 1.52 | 0.011 * |
| 4. Earning a good grade in this activity gives me the greatest sense of satisfaction. | 4.31 | 0.88 | 3.62 | 1.32 | 0.038 * |
| 5. Whenever possible, I aim to achieve higher grades in this activity compared to my peers. | 4.19 | 0.85 | 3.48 | 1.12 | 0.017 * |
| 6. I want to perform well in this activity because demonstrating my abilities to family, teachers, and friends matters to me. | 4.15 | 0.83 | 3.52 | 1.44 | 0.066 |

* p < 0.05, *** p < 0.001.
Table 7. Independent samples t-test results of cognitive load for both groups.

| Group | Number | Mean | SD | t |
|---|---|---|---|---|
| Experimental Group | 26 | 2.09 | 0.63 | −6.152 *** |
| Control Group | 21 | 3.44 | 0.88 | |

*** p < 0.001.
Table 8. Descriptive statistics of cognitive load for the two groups.

| Dimension | Item | Experimental Mean | Experimental SD | Control Mean | Control SD | p |
|---|---|---|---|---|---|---|
| Mental Load | 1. The content presented in this activity was difficult for me to understand. | 1.88 | 1.18 | 3.90 | 1.14 | 0.000 *** |
| | 2. Considerable effort was required to respond to the questions in this activity. | 1.96 | 1.04 | 3.29 | 1.27 | 0.000 *** |
| | 3. Answering the questions in this activity was a demanding task for me. | 1.88 | 0.95 | 3.24 | 1.18 | 0.000 *** |
| | 4. I experienced a sense of frustration when attempting to answer the questions. | 1.85 | 0.88 | 3.19 | 1.37 | 0.000 *** |
| | 5. I did not have enough time to answer the questions in this learning activity. | 2.12 | 1.28 | 3.00 | 1.18 | 0.019 * |
| Mental Effort | 6. The way the content and instruction were delivered required a high level of mental effort. | 2.23 | 1.18 | 3.86 | 1.28 | 0.000 *** |
| | 7. Completing the tasks to meet the learning objectives demanded substantial effort from me. | 2.73 | 1.34 | 3.43 | 1.36 | 0.085 |
| | 8. The instructional strategy used in this activity was not easy for me to follow or grasp. | 2.04 | 1.15 | 3.67 | 1.11 | 0.000 *** |

* p < 0.05, *** p < 0.001.
Table 9. Means and standard deviations of system satisfaction.

| Dimension | Item | Mean | SD |
|---|---|---|---|
| Perceived Usefulness | 1. The learning approach enriched the learning activity. | 4.62 | 0.90 |
| | 2. The learning system was helpful to me in acquiring new knowledge. | 4.65 | 0.69 |
| | 3. The learning mechanisms provided by the learning system smoothed the learning process. | 4.27 | 0.96 |
| | 4. The learning system helped me obtain useful information when needed. | 4.31 | 0.93 |
| | 5. This learning method improved my understanding. | 4.38 | 0.90 |
| | 6. This learning method proved more effective than traditional computer-based learning approaches. | 4.19 | 0.90 |
| | Dimension of Usefulness | 4.40 | 0.72 |
| Perceived Ease of Use | 7. I found it easy to learn how to operate the system. | 4.15 | 1.05 |
| | 8. I was able to master the use of the system in a short time. | 3.96 | 1.25 |
| | 9. The activity carried out through the system was easy to understand. | 4.27 | 1.00 |
| | 10. I picked up how to use the system very quickly. | 4.27 | 0.92 |
| | 11. I experienced little difficulty using the system while engaging in the activity. | 4.35 | 0.89 |
| | 12. I found the system’s interface user-friendly. | 4.35 | 0.89 |
| | Dimension of Ease of Use | 4.23 | 1.00 |
| | Overall Satisfaction | 4.31 | 0.86 |