Open Access Article

An Indoor Scene Recognition-Based 3D Registration Mechanism for Real-Time AR-GIS Visualization in Mobile Applications

by W. Ma 1, H. Xiong 1,2,*, X. Dai 1,2,*, X. Zheng 1,2 and Y. Zhou 1
1 State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing (LIESMARS), Wuhan University, 129 Luoyu Road, Wuhan 430079, China
2 Collaborative Innovation Center of Geospatial Technology, Wuhan University, Wuhan 430079, China
* Authors to whom correspondence should be addressed.
ISPRS Int. J. Geo-Inf. 2018, 7(3), 112;
Received: 6 February 2018 / Revised: 10 March 2018 / Accepted: 14 March 2018 / Published: 15 March 2018
Mobile Augmented Reality (MAR) systems are becoming ideal platforms for visualization, permitting users to better comprehend and interact with spatial information. This technological development has, in turn, prompted efforts to enhance mechanisms for registering virtual objects in real-world contexts. Most existing AR 3D registration techniques lack the scene recognition capabilities needed to accurately describe the positioning of virtual objects in scenes representing reality. Moreover, the application of such registration methods in indoor AR-GIS systems is further impeded by the limited capacity of these systems to detect the geometry and semantic information of indoor environments. In this paper, we propose a novel method for fusing virtual objects and indoor scenes, based on indoor scene recognition technology. To accomplish scene fusion in AR-GIS, we first detect key points in reference images. Then, we perform interior layout extraction using a Fully Connected Networks (FCN) algorithm to acquire layout coordinate points for the tracking targets. We detect and recognize the target scene in a video frame image to track targets and estimate the camera pose. In this method, virtual 3D objects are fused precisely to a real scene, according to the camera pose and the previously extracted layout coordinate points. Our results demonstrate that this approach enables accurate fusion of virtual objects with representations of real-world indoor environments. Based on this fusion technique, users can better grasp virtual three-dimensional representations on an AR-GIS platform.
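The final step the abstract describes — fusing virtual 3D objects to the real scene according to the estimated camera pose and the extracted layout coordinate points — can be illustrated with a standard pinhole-camera projection. The sketch below is not the paper's implementation; the intrinsics, pose values, and function name are illustrative assumptions, and it only shows how a virtual anchor point would be mapped into a video frame once a pose (R, t) is available.

```python
def project_point(K, R, t, X):
    """Project a 3D world point X into pixel coordinates via x = K (R X + t)."""
    # Transform the world point into camera coordinates: Xc = R X + t
    Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # Apply the intrinsic matrix, then divide by depth (perspective division)
    x = [sum(K[i][j] * Xc[j] for j in range(3)) for i in range(3)]
    return (x[0] / x[2], x[1] / x[2])

# Assumed intrinsics: focal length 500 px, principal point at (320, 240)
K = [[500, 0, 320],
     [0, 500, 240],
     [0,   0,   1]]
# Assumed pose: identity rotation, camera 5 units in front of the anchor
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0, 0, 5]

# A layout anchor at the world origin lies on the optical axis,
# so it projects to the principal point.
u, v = project_point(K, R, t, [0.0, 0.0, 0.0])
print(u, v)  # -> 320.0 240.0
```

In a full MAR pipeline, every vertex of a virtual model would be projected this way for each frame, with (R, t) re-estimated from the tracked target so the rendered object stays locked to the indoor scene.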
Keywords: AR-GIS; FCN; mobile phone; pose tracking; scene fusing
MDPI and ACS Style

Ma, W.; Xiong, H.; Dai, X.; Zheng, X.; Zhou, Y. An Indoor Scene Recognition-Based 3D Registration Mechanism for Real-Time AR-GIS Visualization in Mobile Applications. ISPRS Int. J. Geo-Inf. 2018, 7, 112.

