Search Results (3)

Search Parameters:
Keywords = Location-based Augmented Reality (LAR)

22 pages, 5549 KiB  
Article
A Proposal of In Situ Authoring Tool with Visual-Inertial Sensor Fusion for Outdoor Location-Based Augmented Reality
by Komang Candra Brata, Nobuo Funabiki, Yohanes Yohanie Fridelin Panduman, Mustika Mentari, Yan Watequlis Syaifudin and Alfiandi Aulia Rahmadani
Electronics 2025, 14(2), 342; https://doi.org/10.3390/electronics14020342 - 17 Jan 2025
Cited by 2 | Viewed by 1459
Abstract
In location-based augmented reality (LAR) applications, a simple and effective authoring tool is essential for creating immersive AR experiences in real-world contexts. Unfortunately, most current tools are desktop-based, requiring manual location acquisition, software development kits (SDKs), and advanced programming skills, which poses significant challenges for novice developers and leads to imprecise LAR content alignment. In this paper, we propose an intuitive in situ authoring tool with visual-inertial sensor fusion that simplifies creating and storing LAR content directly on a smartphone at the point of interest (POI) location. The tool localizes the user's position using smartphone sensors and maps it with the captured smartphone movement and the surrounding environment data in real time. Thus, an AR developer can place a virtual object on-site intuitively without complex programming. By leveraging the combined capabilities of Visual Simultaneous Localization and Mapping (VSLAM) and Google Street View (GSV), the tool enhances localization and mapping accuracy during AR object creation. For evaluation, we conducted extensive user testing with 15 participants, assessing the task success rate and completion time of the tool in practical pedestrian navigation scenarios. The Handheld Augmented Reality Usability Scale (HARUS) was used to evaluate overall user satisfaction. The results showed that all participants successfully completed the tasks, taking 16.76 s on average to create one AR object within a 50 m radius, while common desktop-based methods in the literature need 1–8 min on average, depending on the user's expertise. Usability scores reached 89.44 for manipulability and 85.14 for comprehensibility, demonstrating the tool's effectiveness in simplifying outdoor LAR content creation.
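The localization step the abstract describes — propagating a pose with locally consistent VSLAM motion and correcting it against a coarse GPS fix — can be sketched as a simple complementary filter. This is an illustrative approximation, not the authors' implementation; the function names, the small-offset conversion, and the blending weight are all hypothetical.

```python
import math

METRES_PER_DEG_LAT = 111_320.0  # approximate length of one degree of latitude

def propagate(lat, lon, dx_east_m, dy_north_m):
    """Advance a geodetic position by a local VSLAM displacement in metres
    (small-offset approximation, valid over the ~50 m working radius)."""
    dlat = dy_north_m / METRES_PER_DEG_LAT
    dlon = dx_east_m / (METRES_PER_DEG_LAT * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

def fuse(pred_lat, pred_lon, gps_lat, gps_lon, alpha=0.8):
    """Complementary filter: weight the VSLAM-propagated prediction (alpha)
    against the latest raw GPS fix (1 - alpha)."""
    return (alpha * pred_lat + (1 - alpha) * gps_lat,
            alpha * pred_lon + (1 - alpha) * gps_lon)
```

A real pipeline would fuse full 6-DoF poses (e.g. with an extended Kalman filter) and add a heading correction from the GSV visual reference; the sketch only shows the position-blending idea.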

20 pages, 2787 KiB  
Article
Performance Investigations of VSLAM and Google Street View Integration in Outdoor Location-Based Augmented Reality under Various Lighting Conditions
by Komang Candra Brata, Nobuo Funabiki, Prismahardi Aji Riyantoko, Yohanes Yohanie Fridelin Panduman and Mustika Mentari
Electronics 2024, 13(15), 2930; https://doi.org/10.3390/electronics13152930 - 24 Jul 2024
Cited by 5 | Viewed by 2411
Abstract
The growing demand for Location-based Augmented Reality (LAR) experiences has driven the integration of Visual Simultaneous Localization and Mapping (VSLAM) with Google Street View (GSV) to enhance accuracy. However, the impact of ambient light intensity on accuracy and reliability is underexplored, posing significant challenges for outdoor LAR implementations. This paper investigates the impact of lighting conditions on the accuracy and reliability of the VSLAM/GSV integration approach in outdoor LAR implementations. The study fills a gap in the current literature and offers valuable insights into vision-based approach implementation under different lighting conditions. Extensive experiments were conducted at five Point of Interest (POI) locations under various lighting conditions, with a total of 100 datasets. Descriptive statistics were employed to analyze the data and assess performance variation. Additionally, an analysis of variance (ANOVA) was used to assess the impact of different lighting conditions on the accuracy metric and horizontal tracking time, determining whether there are significant differences in performance across varying levels of light intensity. The experimental results revealed a significant correlation (p < 0.05) between ambient light intensity and the accuracy of the VSLAM/GSV integration approach. Confidence interval estimation indicates that a minimum illuminance of 434 lx is needed to provide feasible and consistent accuracy. Variations in visual references, such as wet surfaces in the rainy season, also affect the horizontal tracking time and accuracy.
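The ANOVA step the abstract mentions reduces to a one-way F statistic over samples grouped by illuminance level. A minimal stdlib sketch under that assumption (the grouping of error measurements by lux level is illustrative, not taken from the paper):

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic for a list of sample groups, e.g. anchor
    placement errors measured at several illuminance (lux) levels."""
    k = len(groups)                     # number of lighting conditions
    n = sum(len(g) for g in groups)     # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group variability: how far each group mean sits from the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group variability: spread of observations around their own group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)
```

In practice one would use `scipy.stats.f_oneway`, which also returns the p-value; the hand-rolled version just makes the between/within decomposition explicit.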
(This article belongs to the Special Issue Perception and Interaction in Mixed, Augmented, and Virtual Reality)

24 pages, 6694 KiB  
Article
An Enhancement of Outdoor Location-Based Augmented Reality Anchor Precision through VSLAM and Google Street View
by Komang Candra Brata, Nobuo Funabiki, Yohanes Yohanie Fridelin Panduman and Evianita Dewi Fajrianti
Sensors 2024, 24(4), 1161; https://doi.org/10.3390/s24041161 - 9 Feb 2024
Cited by 5 | Viewed by 3584
Abstract
Outdoor Location-Based Augmented Reality (LAR) applications require precise positioning for seamless integration of virtual content into immersive experiences. However, common solutions in outdoor LAR applications rely on traditional smartphone sensor-fusion methods, such as the Global Positioning System (GPS) and compass, which often lack the accuracy needed for precise AR content alignment. In this paper, we introduce an approach to enhance LAR anchor precision in outdoor environments. We leveraged Visual Simultaneous Localization and Mapping (VSLAM) technology, in combination with cloud-based methodologies, and harnessed the extensive visual reference database of Google Street View (GSV) to address these accuracy limitations. For the evaluation, 10 Point of Interest (POI) locations were used as anchor-point coordinates in the experiments. We comprehensively compared our approach with a common sensor-fusion LAR solution through accuracy benchmarking and running-load performance testing. The results demonstrate substantial improvements in overall positioning accuracy compared to conventional GPS-based approaches for aligning AR anchor content in the real world.
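A common way to score anchor precision in benchmarks like this is the great-circle (haversine) distance between the placed anchor and the ground-truth POI coordinate. A small sketch under that assumption — the paper's exact error metric is not stated in this listing:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 coordinates,
    e.g. a placed AR anchor vs. its ground-truth POI location."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
```

Averaging this distance over the 10 POI anchors for each method would yield the kind of per-approach accuracy comparison the abstract describes.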
(This article belongs to the Section Navigation and Positioning)
