Article

Adaptive AR Navigation: Real-Time Mapping for Indoor Environment Using Node Placement and Marker Localization

by Bagas Samuel Christiananta Putra 1, I. Kadek Dendy Senapartha 2,*, Jyun-Cheng Wang 3,*, Matahari Bhakti Nendya 2, Dan Daniel Pandapotan 4, Felix Nathanael Tjahjono 2 and Halim Budi Santoso 5

1 Department of Management, Faculty of Economics and Business, Universitas Kristen Duta Wacana, Yogyakarta 55224, Indonesia
2 Department of Informatics, Faculty of Information Technology, Universitas Kristen Duta Wacana, Yogyakarta 55224, Indonesia
3 Institute of Service Science, College of Technology Management, National Tsing Hua University, Hsinchu 300044, Taiwan
4 Department of Product Design, School of Architecture and Design, Universitas Kristen Duta Wacana, Yogyakarta 55224, Indonesia
5 Department of Information Systems, Faculty of Information Technology, Universitas Kristen Duta Wacana, Yogyakarta 55224, Indonesia
* Authors to whom correspondence should be addressed.
Information 2025, 16(6), 478; https://doi.org/10.3390/info16060478
Submission received: 31 March 2025 / Revised: 31 May 2025 / Accepted: 5 June 2025 / Published: 7 June 2025

Abstract

Indoor navigation remains a challenge due to the limitations of GPS-based systems in enclosed environments. Current approaches, such as marker-based systems, have been developed for indoor navigation. However, they require extensive manual mapping, making indoor navigation time-consuming and difficult to scale. To enhance current approaches, this study proposes node-based mapping for indoor navigation, allowing users to dynamically construct navigation paths using a mobile device. The system leverages NavMesh and the A* algorithm for pathfinding, integrated with ARCore for real-time AR guidance. Nodes are placed within the environment to define walkable paths, which can be stored and reused without requiring a full system rebuild. Once the prototype had been developed, usability testing was conducted using the Handheld Augmented Reality Usability Scale (HARUS) to evaluate manipulability, comprehensibility, and overall usability. This study finds that node-based mapping for indoor navigation enhances flexibility in mapping new indoor spaces and offers an effective AR-guided navigation experience. However, some areas for improvement, including interface clarity and system scalability, can be considered in future research. This study contributes practically to improving current practices in adaptive indoor navigation systems using AR-based dynamic mapping techniques.

1. Introduction

Indoor mapping supports applications such as in-building navigation, asset management, and security systems. An indoor navigation system tracks the user’s position, plots a route that can be taken, and guides them through a predetermined path to reach their destination [1]. In environments such as shopping centers, hospitals, campuses, and office buildings, accurate room mapping can improve operational efficiency and make it faster and easier to navigate within the indoor environment [2]. In emergency situations, indoor mapping plays an important role in supporting evacuation planning and providing real-time guidance, potentially saving lives. Augmented Reality (AR)-based evacuation systems enable the delivery of optimal exit path directions via smartphone devices, with the help of virtual agents that intuitively guide users to exit complex buildings quickly and safely [3]. In addition, facility management teams rely on detailed indoor maps to monitor assets, manage space utilization, and support efficient maintenance workflows. In the era of smart buildings and digital twin technology, accurate and dynamic indoor maps are a crucial element in improving safety, operational efficiency, and overall user experience.
Compared to outdoor navigation, indoor navigation in enclosed spaces requires different techniques to enhance the individual experience. Based on experimental results, a previous study revealed that the Global Positioning System (GPS), on which open-space mapping relies, is not suitable for enclosed areas, as satellite signals are attenuated by more than 1 dB per meter of structure [4]. Enclosed space mapping therefore requires a specialized approach using sensors, spatial data processing, and artificial intelligence-based technologies.
While GPS and Global Navigation Satellite Systems (GNSS) are effective for outdoor environments, their accuracy degrades indoors due to signal attenuation, multipath interference, and lack of direct line of sight to the satellites. Previous studies have shown that GPS is unable to provide accurate indoor tracking because of factors beyond line-of-sight limitations alone [1]. To handle such problems, many studies have been conducted to provide various alternative methods and approaches for indoor navigation [1,5,6]. Augmented Reality (AR) is a technology that incorporates computer-generated images into a real environment [7]. The virtual objects are augmented in the real world and present accurate 3D identification. One example of AR usage involves enhancing the user’s perceived field of vision by projecting a photo in a limited area or displaying additional information on the AR screen of a smartphone or tablet [8]. Hence, AR can provide real-time data based on alternative routes and locations. Multimodal interaction provided by immersive technologies, such as AR, VR, and XR, can enable digital sensory experience and enhance users’ interaction [9]. With the importance of multimodal interactions of immersive technology and the ability of AR to display information accordingly, AR is expected to advance indoor mapping navigation.
Extant studies have explored various alternative methods for indoor navigation using Wi-Fi fingerprinting, Bluetooth Low Energy (BLE) beacons, and Simultaneous Localization and Mapping (SLAM) techniques. However, prior studies still leave open questions that should be addressed in future work to advance indoor navigation systems. Further, some researchers combine the interactivity of immersive technology, such as AR, to develop AR-based indoor navigation systems [1,6]. A wearable Augmented Reality system has been presented that is capable of recognizing locations and displaying captions in the user’s view with geographic information. The system consists of various separate sensors and portable computing devices, although it is still relatively complicated to use. The development aims to create a system that can display digital location- or object-based information, both indoors and outdoors, while allowing users to augment and extend that information [10]. AR-based navigation systems provide an intuitive and immersive approach by presenting digital information integrated into the physical environment. However, many AR navigation methods rely on predefined landmarks, static 3D models, or complex SLAM-based localization [11]. These approaches can limit adaptability and require significant pre-processing effort [12].
Various approaches have been developed to improve indoor navigation based on Augmented Reality (AR). These include system design using the Unified Modeling Language (UML) [13], including a framework that allows editing and viewing UML models in 3D space through Microsoft HoloLens AR glasses [14]. Marker-based tracking using QR codes, mapping supported by point clouds with LiDAR sensors, and pathfinding based on NavMesh in pre-designed virtual models are methods that have proven effective [15]. However, each of these methods has limitations. Tracking methods using QR codes or fiducial markers are often challenging, especially in large or frequently changing areas, as they require physical installation and complicated maintenance [15]. SLAM systems require high computing power, making real-time navigation challenging on limited devices such as smartphones [16].
Additionally, the use of LiDAR sensors increases complexity and power consumption. Systems using NavMesh require time-consuming manual modeling [16]. Methods based on 3D models and static markers are less adaptive to changes, often requiring updates. Although SLAM is more adaptive, continuous mapping still has efficiency and accuracy issues [17]. Although some methods and algorithms have been developed, challenges remain in providing navigation that is compelling, interactive, and flexible at scale.
This study aims to propose a novel indoor navigation method based on AR mobile applications and answer the following research questions: (1) What are the advantages of node-based AR-based indoor navigation? (2) What are the users’ responses to the developed prototype regarding manipulability and comprehensibility? Our approach is expected to enhance flexibility and scalability and reduce time consumption. Users can create navigation paths by placing virtual nodes on mobile devices without needing an existing environment model. The system uses Google ARCore for real-time spatial recognition, NavMesh, and the A* algorithm for pathfinding. The generated nodes show walkable areas and can be freely connected. In addition, the resulting map can be saved and reused without the need to rebuild the system.
The researchers then conducted a system evaluation using the Handheld Augmented Reality Usability Scale (HARUS) [18], which showed that the system has high usability with an average score of 81.98. This score is considered excellent. Furthermore, this study found that the proposed node-based mapping technique can improve AR-based indoor navigation by easing usage, improving adaptability, and providing real-time pathfinding capabilities. This research contributes significantly to AR-based indoor navigation and wayfinding systems by offering customizable and scalable mapping solutions for various indoor environments. However, future research should address key issues to extend the current approach, including real-time processing efficiency, multi-user collaboration in AR navigation, and the integration of AI-based devices to support autonomous navigation.
This article is arranged as follows: the next section delivers the materials and methods used to develop the AR-based indoor navigation system, including the design principles, the development of the node-based mapping system, the prototype approach, the research methodology, and the statistical analysis. Section 3 presents the system functionality and the usability testing results. Section 4 discusses the findings, and Section 5 concludes the study and proposes future research directions to advance current methods in developing AR-based indoor navigation.

2. Materials and Methods

This study adopts a prototype-based, user-centered design methodology that emphasizes iterative development, user involvement, and evaluation. The research process was conducted in four main stages: (1) methodological framework design, (2) system design and development, (3) usability testing, and (4) statistical analysis. This section outlines each stage in detail to provide a clear and structured understanding of the methodology employed. As shown in Table 1, the research process consisted of three key phases: requirement definition, prototyping, and evaluation.

2.1. Research Methodology Overview

The research follows a user-centered prototyping approach structured in three key phases.
This iterative approach ensured alignment between technical features and user needs and allowed system refinement before wider deployment or performance testing.

2.2. System Design and Development

The proposed AR-based indoor navigation system is built upon four core components: (1) an Augmented Reality (AR) Framework that enables interactive visual guidance, (2) a pathfinding algorithm to determine the most efficient routes, (3) a node-based mapping structure that represents the indoor layout, and (4) a marker-based position initialization method to accurately identify the user’s starting location within the space. The system was designed to overcome the limitations of static NavMesh-based navigation, enabling faster deployment in new building environments without the need for complete remodeling. The AR-based navigation system was developed based on four key components:
  • Augmented Reality (AR) Framework—The system uses Google ARCore, enabling real-time plane detection, spatial anchoring, and camera-based tracking. This framework provides real-time plane detection, enabling the system to dynamically recognize floors and surfaces where navigation nodes can be placed. The framework also supports Android and iOS, allowing wider adoption and deployment and motion tracking features to ensure accurate user positioning without external hardware (e.g., LiDAR sensors or BLE beacons). ARCore is integrated into Unity3D as part of the AR foundation package, which allows seamless development of cross-platform AR applications. The system continuously updates the user’s position and surroundings, ensuring a responsive navigation experience.
  • Pathfinding Algorithm—The navigation paths are generated using a Navigation Mesh (NavMesh), while route optimization is performed using the A* algorithm, which computes the shortest path between user-defined nodes. NavMesh provides a walkable area representation, eliminating manual 3D modeling of indoor spaces. This framework also supports multi-floor navigation by connecting staircase nodes and is optimized for real-time use in mobile applications, requiring less computational power than SLAM-based navigation. For pathfinding, the A* algorithm is widely recognized for fast and optimal route calculations in graph-based navigation. It guarantees the shortest path with minimal computation, making it suitable for mobile AR applications where efficiency is critical. NavMesh and A* work together to dynamically compute the best navigation path based on user-defined nodes, ensuring scalability and adaptability to different indoor environments.
  • Node-Based Mapping System—The node-based mapping system is built in Unity3D, a real-time 3D development platform that provides native support for ARCore and NavMesh navigation. This platform offers extensive optimizations for mobile AR applications, ensuring efficient performance even on mid-range smartphones. JSON-based data storage is used to store node data for the system. JSON is a lightweight and human-readable format that allows easy import/export of navigation maps. It stores node positions, connectivity data, and marker GUIDs, allowing navigation maps to be reused without rebuilding or remapping. Using Unity3D with JSON, the system minimizes storage overhead, allowing users to load, edit, and share their navigation maps across multiple devices.
  • Marker-Based Position Initialization—While the use of QR codes for localization is nothing new, this method remains a practical and affordable solution. Without the need for additional hardware, QR codes allow users to find their starting position easily, making them well suited to lightweight and efficient mobile AR systems. For the implementation, ZXing.Net is used because it is an open-source QR code generation and scanning library optimized for C# and Unity3D. QR codes are hardware-independent, requiring only a mobile camera for scanning and avoiding additional Bluetooth or RFID-based localization solutions. Each Marker Node in the system generates a unique QR code associated with a GUID. When a user scans a QR code, their position is instantly initialized to the corresponding marker location, ensuring an accurate navigation starting point.
The application was developed in Unity3D with C# scripting and tested on Android devices running Android SDK 24 or higher. The AR functionalities rely on ARCore’s session management, which enables real-time scene understanding. As detailed in Table 2, the ARCore session includes steps such as mapping initialization, map saving, and navigation process.
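The paper states that the JSON files store node positions, connectivity data, and marker GUIDs, but does not give the exact schema; the field names below are therefore illustrative assumptions. This Python sketch shows how a saved map could be serialized and how a scanned QR code GUID could be resolved to a starting position:

```python
import json

def save_map(path_nodes, marker_nodes, filename):
    """Serialize path nodes (positions + links) and marker nodes
    (GUID + label + position) to a JSON map file."""
    data = {
        "pathNodes": [
            {"id": n["id"], "position": n["position"], "links": n["links"]}
            for n in path_nodes
        ],
        "markerNodes": [
            {"guid": m["guid"], "label": m["label"], "position": m["position"]}
            for m in marker_nodes
        ],
    }
    with open(filename, "w") as f:
        json.dump(data, f, indent=2)

def locate_by_guid(filename, scanned_guid):
    """Return the position of the marker matching a scanned QR code
    GUID, or None if no marker matches."""
    with open(filename) as f:
        data = json.load(f)
    for m in data["markerNodes"]:
        if m["guid"] == scanned_guid:
            return m["position"]
    return None
```

Loading the same file on another device restores the node graph without remapping, which is how the system avoids a full rebuild when a map is reused.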

2.3. Usability Testing Procedure

A Handheld Augmented Reality Usability Scale (HARUS) [18] assessment was conducted to evaluate the system’s usability and effectiveness. HARUS is a standardized usability instrument that evaluates user experiences in mobile or handheld AR applications. HARUS is developed to address some limitations of general usability scales, such as the System Usability Scale (SUS), in the context of spatial interaction, real-time feedback, and physical constraints in AR systems. This instrument measures users’ perceived ease of interaction (manipulability) and clarity of information (comprehensibility) when using handheld AR applications.
The research teams recruited the participants. Participation was voluntary. Informed consent was received before the data collection. No private data is stored, and individual responses are based on anonymity. A total of 30 participants, consisting of undergraduate informatics students, were invited to test the system. As shown in Table 3, participants completed a series of tasks simulating real-world usage of the NavARNode application.
After completing the tasks, participants completed the HARUS questionnaire, which contains 16 statements rated on a 7-point Likert scale. The statements were grouped into:
  • Manipulability (Questions 1–8): assessing comfort, ease of interaction, and physical effort.
  • Comprehensibility (Questions 9–16): assessing clarity, information design, and visual consistency.

2.4. Statistical Analysis

HARUS scores were computed using the methodology proposed by Santos [18]. The HARUS score is scaled using the same formula as the System Usability Scale (SUS). Each of the 16 items is first adjusted: odd-numbered (negatively worded) items as 7 − Qi and even-numbered (positively worded) items as Qi − 1. The adjusted scores are summed and then scaled to a 0–100 range using the formula shown in Equation (1).

HARUS Score = ( Σ_{i=1}^{16} s_i ) ÷ 0.96, where s_i = 7 − Q_i if i is odd (negatively worded) and s_i = Q_i − 1 if i is even (positively worded)   (1)
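As a concrete illustration of the HARUS scoring scheme, the calculation can be sketched in a few lines (Python here for clarity; the study's own analysis tooling is not specified). Each raw 7-point response contributes 0-6 after adjustment, so the maximum raw sum is 96, and dividing by 0.96 rescales to 0-100.

```python
def harus_score(responses):
    """Compute the HARUS score from 16 Likert ratings Q1..Q16 (each 1-7).
    Odd-numbered items (negatively worded) score 7 - Qi;
    even-numbered items (positively worded) score Qi - 1."""
    assert len(responses) == 16, "HARUS has exactly 16 items"
    total = sum(
        (7 - q) if (i % 2 == 1) else (q - 1)
        for i, q in enumerate(responses, start=1)
    )
    return total * 100 / 96  # equivalent to dividing the raw sum by 0.96

# Best possible responses: strongly disagree (1) with every negatively
# worded item, strongly agree (7) with every positively worded item.
print(harus_score([1, 7] * 8))  # 100.0
```

A neutral response of 4 on every item yields 50.0, the midpoint of the scale.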
Descriptive statistics, including mean, minimum, maximum, and standard deviation, were used to analyze the individual scores for each HARUS question. These statistical measures provided insights into response variability and consistency across participants. Additionally, average scores were calculated separately for the manipulability and comprehensibility subscales, as well as for the overall usability score, offering a more detailed understanding of user experience with the system. HARUS employs the same calculation method as the System Usability Scale (SUS) and subsequently uses the SUS score interpretation scale to analyze the results similarly [19]. HARUS uses the SUS scoring method to ensure consistency, familiarity, and comparability across usability studies. While addressing HAR-specific issues like ergonomics and perception, it retains SUS’s 0–100 scale to allow benchmarking, feature comparison, and iteration tracking, making it both specialized and broadly interpretable [18].
Figure 1 presents an interpretation scale for the System Usability Scale (SUS) [20], illustrating how SUS scores (ranging from 0 to 100) can be interpreted in terms of acceptability, adjective ratings, and grade scales. Scores below 70 are categorized as “Not Acceptable,” while scores above 70 fall into the “Acceptable” range. Adjective ratings provide qualitative labels such as “Awful” for very low scores, “Poor” around 50, “Okay” around 65, “Good” near 75, and “Excellent” above 85. In terms of grade scale, scores are classified from F (below 50) to A (above 85), with intermediate scores earning D, C, or B grades. The scale also highlights key statistics, including a minimum score of 55.0, a mean of 76.8, and a maximum of 92.5, resulting in a usability range of 37.5. This interpretation model helps evaluate and communicate usability levels in a more intuitive and standardized way.

3. Results

This section provides some system functionality and the results of usability testing for our mobile application prototype. Usability testing was evaluated using HARUS, mainly covering manipulability and comprehensibility [18].

3.1. System Functionality

This study developed a mobile application prototype, NavARNode, which provides indoor mapping and navigation features that run on mobile devices using runtime node placement. The prototype system is shown in Figure 2, which also illustrates how users flow through the NavARNode application during the node-based mapping process for indoor navigation. The process begins on the “Pindai Lantai (Scan Floor)” screen, where the user selects or adds a building and proceeds by tapping the “Mulai (Start)” button. This process shows how users add maps of floors and rooms available in the building or indoor environment. It also shows the building detail screen, where users input the building name and a description of the locations. Users are also able to define location markers.
Then, users start to map the building location into the AR application. They select the menu “Pemetaan (mapping).” Users should define the floor surfaces using plane detection and begin to place path nodes on the detected surface using tap gestures. Each node should connect with the other nodes. Connecting those nodes can help provide pathfinding and directions to specific rooms/locations. Users should be able to place marker nodes at particular locations, naming every node accordingly. These markers also serve as starting points or final destinations for indoor navigation.
Following the mapping tasks, the system generates a NavMesh overlay automatically (see Figure 2, connected nodes). The NavMesh helps visualize the traversable areas and differentiates them from the surrounding floor. The application then stores the map, enabling future navigation using the predefined nodes. The system can start functioning once the node navigation has been set up, as shown in Figure 3.
Figure 3 demonstrates the navigation workflow of the NavARNode application. Users begin on the building selection screen, choosing from a list of previously saved buildings, or adding a new one. After selecting a building, users access the building’s detail page, which provides options for starting the mapping process or initiating navigation. To start the navigation features, users tap the “Marker” button to view a list of available destinations using QR codes. Upon selecting a marker, the user is directed to the scanner interface to scan the printed QR code in the physical environment. By scanning the QR code, users initiate the navigation process and provide the starting point to navigate using AR mobile applications. Once initialized, the navigation interface displays a virtual guide path leading to the selected destination, rendered using AR overlays such as directional arrows (Please see Figure 3). Further details on the system’s operation are available in the Supplementary Materials.

3.2. Usability Testing Using HARUS

A total of 30 participants took part in the usability evaluation. All participants were undergraduate students aged between 20 and 25 years, consisting of 16 males (66.67%) and 8 females (33.33%). Most respondents were familiar with mobile devices and AR-based applications but had varying experiences with indoor navigation systems. This demographic was selected to reflect the target user group—university students navigating unfamiliar campus environments.
Before the usability evaluation, participants were asked to enter a building name and description into the system. They were then instructed to explore the NavARNode application and complete a set of five predefined task scenarios designed to simulate the real-world use of the indoor navigation system. Table 4 presents the demographic breakdown of participants based on gender and their familiarity with AR applications.
The evaluation results highlight the functionality and usability performance of the NavARNode application. The system enabled users to perform real-time, node-based mapping using mobile AR technologies. Key features included surface detection via AR plane recognition, placement and linking of path nodes to define navigable routes, creation of marker nodes representing destination points, and the ability to save or load mapping sessions using serialized JSON files.
Usability testing was conducted using the Handheld Augmented Reality Usability Scale (HARUS). Participants responded to 16 statements on a 7-point Likert scale, grouped into two core dimensions: Manipulability (Q1–Q8) and Comprehensibility (Q9–Q16). The responses were transformed into values ranging from 0 to 100 using the standard HARUS formula. Table 5 summarizes the average scores obtained for each usability dimension.
As shown in Table 5, the average score for Manipulability was 80.90, while Comprehensibility achieved a slightly higher score of 83.06, resulting in an overall usability score of 81.98. Based on the System Usability Scale (SUS) interpretation framework in Figure 1, this overall score falls within the “Excellent” category and corresponds to a Grade B. It also surpasses the commonly accepted usability benchmark of 68 [21], indicating that the system has high functionality and is acceptable to users. These results suggest that the NavARNode application’s marker-based AR approach is practical and acceptable, offering a satisfactory user experience for mobile indoor navigation tasks.

4. Discussion

4.1. Node-Based Indoor Navigation Using AR Mobile Applications

This study demonstrates how a node-based approach can be integrated into AR mobile applications, offering flexible and adaptive indoor navigation. Users can dynamically place and connect nodes at runtime, defining navigation directly without prebuilt 3D models or other virtual objects. Although the A* algorithm is well known, its application to real-time node placement in the ARCore environment opens up opportunities for practical pathfinding solutions. This approach is efficient and easy to run on mobile devices and does not require a pre-existing 3D map.
This model offers a higher degree of flexibility, particularly where the layout frequently changes or detailed floor plans are unavailable. A node-based AR approach to indoor navigation also introduces modularity in the navigation logic (Figure 2), where each node acts as a discrete spatial data point that can be easily updated, removed, or reconnected to support different routes. Indeed, the proposed system enables individual participants to map the indoor environment and reduces dependency on operators.
Moreover, using location markers for initialization and destination targeting simplifies user orientation and reduces dependency on complex tracking systems like SLAM or GPS, which are often unreliable indoors. Although ARCore uses visual SLAM behind the scenes, the proposed system simplifies the process for users. Users can create mappings intuitively, while the application is running by simply placing nodes, without the need to understand or set up advanced modeling or complex SLAM configurations.
The system architecture is also designed to enhance the scalability and accessibility of AR navigation through connected nodes. Hence, this system is expected to ease and empower non-technical users, such as building managers, educators, or non-IT staff.

4.2. User Response Regarding Indoor Navigation Using AR Mobile Applications

The findings from the HARUS usability evaluation demonstrate that the NavARNode application presents a promising solution for mobile indoor navigation using Augmented Reality (AR) technologies. With an overall usability score of 81.98, the system falls into the “Excellent” category according to the System Usability Scale (SUS) interpretation and exceeds the benchmark score of 68. This indicates that the application is functional and well-received by its target users—university students navigating unfamiliar indoor environments.
The dimension of comprehensibility achieved the highest average score at 83.06, indicating that most users found the system easy to understand and interact with. However, one HARUS item under this dimension, “I found the information displayed on the screen to be confusing” (Q13), received the lowest individual score within its category, with a minimum score of 1 and a relatively low average. This suggests that some users had trouble interpreting on-screen instructions, likely due to the absence of clear visual cues or descriptive labels, especially in the mapping interface. To improve the comprehensibility of the system, it is recommended that tooltips, button labels, and step-by-step interactive guides be introduced to support first-time users and reduce cognitive load.
Meanwhile, the manipulability dimension scored 80.90, which reflects strong performance in terms of ease of interaction. However, items related to physical comfort during use received notably lower scores. Specifically, Q1 (“Interacting with this application requires a lot of physical effort”) and Q3 (“I found the device difficult to hold while using the application”) reported minimum scores of 0 and 1, respectively. These responses highlight the physical strain some users experienced, likely due to the need for continuous device orientation and floor scanning during the mapping process. In addition, relying on QR code scanning to determine the starting position can be cumbersome for users, especially in crowded or changing environments. In the future, hybrid methods that combine visual markers with ambient signals, such as BLE beacons or Wi-Fi fingerprinting, could provide a more seamless and convenient localization experience. This challenge is common in handheld AR applications and underscores the need to optimize the interaction flow. Future improvements could include implementing voice commands, automatic plane detection, or gesture-based input to reduce the need for prolonged physical effort.
A key innovation of NavARNode lies in its ability to support runtime environment mapping, enabling users to place dynamically and link navigation nodes and markers during application use—rather than relying on pre-modeled, static environments. This flexibility represents a major advancement over earlier AR indoor navigation systems. However, increased flexibility also introduces added complexity, which may pose challenges for novice users. Simplifying the user interface and integrating an onboarding wizard or contextual hints could help bridge this gap and enhance usability for less experienced users.
Overall, the HARUS results indicate that the system delivers a high level of user satisfaction, supported by evidence from both quantitative scores and qualitative feedback. While the application demonstrates strong usability in its current form, targeted improvements—particularly in user guidance, interface clarity, and interaction design—will ensure broader adoption and improved user comfort in real-world deployment scenarios.

5. Conclusions

NavARNode, a mobile-based indoor navigation system that combines node-based mapping with Augmented Reality (AR) technology, has been successfully developed, and its usefulness has been evaluated in this study. The program offers more flexibility than traditional AR navigation systems restricted to pre-modeled surroundings by allowing users to dynamically generate navigation environments during runtime by positioning and connecting nodes and markers. The Handheld Augmented Reality Usability Scale (HARUS) usability test indicated an excellent degree of usability, which yielded an overall score of 81.98. High average ratings of 80.90 and 83.06 for the manipulability and comprehensibility categories indicate that users perceived the system as user-friendly and intuitive.
Notwithstanding these encouraging outcomes, the assessment also highlighted areas for improvement. Some users reported physical discomfort during prolonged use, especially during the scanning and mapping procedure, as well as confusion regarding the on-screen interface. These results show that the user interface and interaction design must be further refined to improve comfort and usability. Future developments will add more detailed visual cues, interactive onboarding, and more user-friendly interface components to enhance the user experience. Physical strain during use may be reduced by streamlining the interaction flow with voice commands or gesture-based controls. Enhancing system robustness and generalizability will also require extending usability testing to more varied user groups and performing performance assessments in diverse indoor environments. With these enhancements, NavARNode has the potential to become a reliable and user-friendly indoor navigation platform for various applications, including education, tourism, and public infrastructure.

Supplementary Materials

The following supporting information can be accessed at: Demo Video: https://youtu.be/NUJHHOnJKt0?si=9x-gXAhQNW25UFC3 (accessed on 1 June 2025).

Author Contributions

Conceptualization, I.K.D.S., J.-C.W. and M.B.N.; Methodology, B.S.C.P. and H.B.S.; Software Development, M.B.N., D.D.P. and F.N.T.; Data Collection and Analysis, F.N.T., I.K.D.S. and M.B.N.; Writing—Original Draft Preparation, all authors; Review and editing—all authors. All authors have read and agreed to the published version of the manuscript.

Funding

The research team gratefully acknowledges the financial support provided by the Institute of Research and Community Service (LPPM) at Universitas Kristen Duta Wacana through Research Grant Scheme No. 74/D.01/LPPM/2024.

Institutional Review Board Statement

This study was reviewed and approved by the Health Research Ethics Committee of the Faculty of Medicine, Universitas Kristen Duta Wacana, Yogyakarta, Indonesia. The ethical clearance was granted under approval number 1749/C.16/FK/2024, and is valid for one year from the date of approval (29 April 2024).

Informed Consent Statement

Informed Consent was obtained from all subjects involved in the study and participation in this study was voluntary.

Data Availability Statement

The datasets used and/or analyzed during the current study are available from the corresponding authors upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Alkady, Y.; Rizk, R.; Alsekait, D.M.; Alluhaidan, A.S.; Abdelminaam, D.S. SINS_AR: An Efficient Smart Indoor Navigation System Based on Augmented Reality. IEEE Access 2024, 12, 109171–109183. [Google Scholar] [CrossRef]
  2. Hořejší, P.; Macháč, T.; Šimon, M. Reliability and Accuracy of Indoor Warehouse Navigation Using Augmented Reality. IEEE Access 2024, 12, 94506–94519. [Google Scholar] [CrossRef]
  3. Gan, Q.; Liu, Z.; Liu, T.; Chai, Y. An indoor evacuation guidance system with an AR virtual agent. Procedia Comput. Sci. 2022, 213, 636–642. [Google Scholar] [CrossRef]
  4. Yi, T.-H.; Li, H.-N.; Gu, M. Effect of Different Construction Materials on Propagation of GPS Monitoring Signals. Measurement 2012, 45, 1126–1139. [Google Scholar] [CrossRef]
  5. Valizadeh, M.; Ranjgar, B.; Niccolai, A.; Hosseini, H.; Rezaee, S.; Hakimpour, F. Indoor Augmented Reality (AR) Pedestrian Navigation for Emergency Evacuation Based on BIM and GIS. Heliyon 2024, 10, e32852. [Google Scholar] [CrossRef] [PubMed]
  6. Ahn, Y.; Choi, H.; Choi, R.; Ahn, S.; Kim, B.S. BIM-Based Augmented Reality Navigation for Indoor Emergency Evacuation. Expert Syst. Appl. 2024, 255, 124469. [Google Scholar] [CrossRef]
  7. Rakkolainen, I.; Farooq, A.; Kangas, J.; Hakulinen, J.; Rantala, J.; Turunen, M.; Raisamo, R. Technologies for Multimodal Interaction in Extended Reality—A Scoping Review. Multimodal Technol. Interact. 2021, 5, 81. [Google Scholar] [CrossRef]
  8. Bibbo, L.; Bramanti, A.; Sharma, J.; Cotroneo, F. AR Platform for Indoor Navigation: New Potential Approach Extensible to Older People with Cognitive Impairment. BioMedInformatics 2024, 4, 1589–1619. [Google Scholar] [CrossRef]
  9. Bhowmik, A.K. Virtual and Augmented Reality: Human Sensory-Perceptual Requirements and Trends for Immersive Spatial Computing Experiences. J. Soc. Inf. Disp. 2024, 32, 605–646. [Google Scholar] [CrossRef]
  10. Mahapatra, T.; Tsiamitros, N.; Rohr, A.M.; K, K.; Pipelidis, G. Pedestrian Augmented Reality Navigator. Sensors 2023, 23, 1816. [Google Scholar] [CrossRef] [PubMed]
  11. Ayyanchira, A.; Mahfoud, E.; Wang, W.; Lu, A. Toward Cross-Platform Immersive Visualization for Indoor Navigation and Collaboration with Augmented Reality. J. Vis. 2022, 25, 1249–1266. [Google Scholar] [CrossRef]
  12. Zollmann, S.; Langlotz, T.; Grasset, R.; Lo, W.H.; Mori, S.; Regenbrecht, H. Visualization Techniques in Augmented Reality: A Taxonomy, Methods and Patterns. IEEE Trans. Vis. Comput. Graph. 2020, 27, 3808–3825. [Google Scholar] [CrossRef] [PubMed]
  13. Yigitbas, E.; Gorissen, S.; Weidmann, N.; Engels, G. Design and Evaluation of a Collaborative UML Modeling Environment in Virtual Reality. Softw. Syst. Model. 2023, 22, 1397–1425. [Google Scholar] [CrossRef] [PubMed]
  14. Pratomo, D.G.; Khomsin; Aditya, N. MARS: An Augmented Reality-Based Marine Chart Display System. Int. J. Geoinform. 2023, 19, 21–31. [Google Scholar] [CrossRef]
  15. Nendya, M.B.; Mahastama, A.W.; Setiadi, B. Augmented Reality Indoor Navigation Using NavMesh. In Proceedings of the 2023 1st IEEE International Conference on Smart Technology (ICE-SMARTec), Bandung, Indonesia, 17–19 July 2023; IEEE: New York, NY, USA, 2023; pp. 134–139. [Google Scholar] [CrossRef]
  16. Zhou, Z.; Feng, X.; Di, S.; Zhou, X. A LiDAR Mapping System for Robot Navigation in Dynamic Environments. IEEE Trans. Intell. Veh. 2023, 1–20. [Google Scholar] [CrossRef]
  17. Wen, T.; Liu, Z.; Lu, B.; Fang, Y. Scaffold-SLAM: Structured 3D Gaussians for Simultaneous Localization and Photorealistic Mapping. arXiv 2025. [Google Scholar] [CrossRef]
  18. Santos, M.E.C.; Taketomi, T.; Sandor, C.; Polvi, J.; Yamamoto, G.; Kato, H. A Usability Scale for Handheld Augmented Reality. In Proceedings of the 20th ACM Symposium on Virtual Reality Software and Technology (VRST ’14), Edinburgh, UK, 11–13 November 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 167–176. [Google Scholar]
  19. Nendya, M.B.; Mahastama, A.W.; Setiadi, B. A Usability Study of Augmented Reality Indoor Navigation Using Handheld Augmented Reality Usability Scale (HARUS). CogITo Smart J. 2024, 10, 326–338. [Google Scholar] [CrossRef]
  20. Nik Ahmad, N.A.; Hasni, N.S. ISO 9241-11 and SUS Measurement for Usability Assessment of Dropshipping Sales Management Application. In Proceedings of the 2021 10th International Conference on Software and Computer Applications, Kuala Lumpur, Malaysia, 23–26 February 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 70–74. [Google Scholar]
  21. Gullà, F.; Menghi, R.; Papetti, A.; Carulli, M.; Bordegoni, M.; Gaggioli, A.; Germani, M. Prototyping Adaptive Systems in Smart Environments Using Virtual Reality. Int. J. Interact. Des. Manuf. 2019, 13, 597–616. [Google Scholar] [CrossRef]
Figure 1. Interpretation scale for System Usability Scale (SUS) scores [19].
Figure 2. User flow of the NavARNode application during the indoor mapping process.
Figure 3. The navigation flow of the NavARNode application during indoor navigation.
Table 1. Phases of Research Methodology.
Phase | Activities | Description
1. Requirement Definition | (a) Observational analysis of existing AR indoor navigation limitations; (b) user needs analysis; (c) functional requirements | Identified challenges in current systems and defined essential features such as real-time node placement, map storage, and AR path visualization.
2. System Prototyping and Implementation | (a) Prototype development and internal testing; (b) iterative UI/UX refinement based on pilot feedback | Prototypes were created using Unity, tested internally for AR tracking and NavMesh, and refined through feedback from pilot users.
3. Evaluation Phase (Usability Testing) | (a) Formal testing with 30 students; (b) real-world usage simulation; (c) usability and interaction pattern analysis | Conducted structured usability tests using HARUS to assess manipulability and comprehensibility in actual indoor environments.
Table 2. ARCore Session Management for Real-Time Scene Understanding.
No | Stage | Description
1 | Node Placement and Map Construction |
1a | Mapping initialization | The user scans the floor using ARCore plane detection.
1b | Path node placement | Users tap the screen to place nodes at important locations.
1c | Node linking | Path nodes are connected to each other to form a navigation graph.
1d | Marker node placement | A marker node is placed and generates a QR code with a unique GUID.
1e | Map storage | The map is saved in JSON format for reuse.
2 | Contents of the Stored JSON File |
2a | Node position | Position coordinates of each node in X, Y, and Z.
2b | Node type | Whether the node is a Path node or a Marker node.
2c | Connectivity data | An adjacency list representing connections between nodes for pathfinding.
2d | Marker GUID | A unique ID for each marker node.
3 | Navigation System |
3a | Navigation initialization | The user scans a QR code to determine the starting position.
3b | Map retrieval | The system retrieves the stored map.
3c | Route calculation | The system calculates the optimal route using the A* (A star) algorithm over the NavMesh graph.
3d | AR navigation guidance | The app displays virtual arrows directing the user to the destination in the AR view.
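The route-calculation stage described above can be illustrated with a small A* search over a node graph. This is a hedged sketch under stated assumptions: the node ids, positions, and adjacencies are hypothetical, and the actual system runs A* over Unity's NavMesh rather than this hand-rolled graph.

```python
# Illustrative A* pathfinding over a stored node graph, in the spirit of
# the route-calculation stage. Euclidean distance serves as both the edge
# cost and the admissible heuristic. The graph below is made up.

import heapq
import math

def euclidean(a, b):
    return math.dist(a, b)

def a_star(positions, neighbors, start, goal):
    """positions: id -> (x, y, z); neighbors: id -> list of adjacent ids."""
    open_heap = [(0.0, start)]
    g = {start: 0.0}          # best known cost from start to each node
    came_from = {}            # back-pointers for path reconstruction
    while open_heap:
        _, current = heapq.heappop(open_heap)
        if current == goal:
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for nxt in neighbors[current]:
            tentative = g[current] + euclidean(positions[current], positions[nxt])
            if tentative < g.get(nxt, float("inf")):
                g[nxt] = tentative
                came_from[nxt] = current
                f = tentative + euclidean(positions[nxt], positions[goal])
                heapq.heappush(open_heap, (f, nxt))
    return None  # goal unreachable from start

positions = {"n1": (0, 0, 0), "n2": (2, 0, 0), "n3": (2, 0, 2), "m1": (4, 0, 2)}
neighbors = {"n1": ["n2"], "n2": ["n1", "n3"], "n3": ["n2", "m1"], "m1": ["n3"]}
print(a_star(positions, neighbors, "n1", "m1"))  # ['n1', 'n2', 'n3', 'm1']
```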
Table 3. Tasks performed by participants using NavARNode.
Step | Action | Description
1 | Initialize AR session | Start an AR session and scan the floor surface using ARCore's plane detection capabilities.
2 | Place path nodes | Tap the screen to place navigation nodes at key locations.
3 | Add marker nodes | Insert and label marker nodes to represent target destinations.
4 | Generate and scan QR codes | Generate QR codes for marker nodes and scan them to initialize the user's starting position.
5 | Follow AR navigation | Use the AR interface with virtual arrows to follow the optimal route toward the selected destination.
Table 4. Demographic characteristics of participants involved in the usability testing of the NavARNode application.
No | Gender | Familiar with AR (Yes) | Not Yet Familiar | Percentage (%)
1 | Male | 16 | 4 | 66.67
2 | Female | 8 | 2 | 33.33
Total | | 24 | 6 | 100
Table 5. Summary of HARUS usability evaluation results for the NavARNode application.
No | Usability Dimension | Average Score
1 | Manipulability | 80.90
2 | Comprehensibility | 83.06
3 | Overall Usability Score | 81.98
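For readers unfamiliar with how HARUS dimension scores such as those above are produced, the aggregation can be sketched as follows. This is a hedged sketch assuming a 16-item, 7-point (0–6) scale with negatively worded items reverse-scored and the result normalized to 0–100; the exact item wording and scoring procedure follow Santos et al. [18], and the response values below are invented for illustration.

```python
# Hedged sketch of HARUS-style scoring: 16 statements rated 0-6,
# negatively worded items reverse-scored, mean normalized to 0-100.
# Exact procedure per Santos et al. [18]; these ratings are made up.

def harus_score(responses, negative_items):
    """responses: list of 16 ratings in 0..6; negative_items: 0-based indices."""
    adjusted = [6 - r if i in negative_items else r
                for i, r in enumerate(responses)]
    return sum(adjusted) / (6 * len(adjusted)) * 100

# One hypothetical participant; items 0 and 2 (Q1 and Q3) are treated as
# negatively worded ("requires a lot of physical effort", "difficult to hold").
ratings = [1, 5, 2, 5, 6, 5, 4, 5, 5, 6, 5, 4, 5, 5, 6, 5]
score = harus_score(ratings, negative_items={0, 2})
print(round(score, 2))  # 83.33
```

Per-dimension scores (manipulability, comprehensibility) would be computed the same way over each dimension's subset of items, then averaged across participants.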
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Putra, B.S.C.; Senapartha, I.K.D.; Wang, J.-C.; Nendya, M.B.; Pandapotan, D.D.; Tjahjono, F.N.; Santoso, H.B. Adaptive AR Navigation: Real-Time Mapping for Indoor Environment Using Node Placement and Marker Localization. Information 2025, 16, 478. https://doi.org/10.3390/info16060478

