Open Access Article

Comparing Human Versus Machine-Driven Cadastral Boundary Feature Extraction

1 Department of Geography and Urban Planning, School of Architecture and Built Environment (SABE), College of Science and Technology (CST), University of Rwanda, Kigali City B.P 3900, Rwanda
2 Faculty of Geo-Information Science and Earth Observation (ITC), University of Twente, 7500 AE Enschede, The Netherlands
3 Department of Business Technology and Entrepreneurship, Swinburne Business School, BA1231 Hawthorn Campus, Melbourne, VIC 3122, Australia
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(14), 1662; https://doi.org/10.3390/rs11141662
Received: 27 May 2019 / Revised: 8 July 2019 / Accepted: 9 July 2019 / Published: 12 July 2019
(This article belongs to the Special Issue Remote Sensing: 10th Anniversary)
The objective of fast-tracking the mapping and registration of large numbers of unrecorded land rights globally has led to the experimental application of Artificial Intelligence in the domain of land administration, and specifically the application of automated visual cognition techniques to cadastral mapping tasks. In this research, we applied and compared the ability of rule-based systems within Object-Based Image Analysis (OBIA), as opposed to human analysis, to extract visible cadastral boundaries from very high-resolution WorldView-2 imagery, in both rural and urban settings. In our experiments, machine-based techniques automatically delineated a good proportion of rural parcels with explicit polygons: the correctness of the automatically extracted boundaries was 47.4% against 74.24% for humans, and completeness was 45% for the machine compared to 70.4% for humans. In the urban area, by contrast, the automatic results were counterintuitive: even though urban plots and buildings are clearly marked with visible features such as fences and roads, and are readily perceptible to the human eye, automation produced geometrically and topologically poorly structured data. These results could therefore neither be geometrically compared with human digitisation, nor with actual cadastral data from the field. The results of this study provide an updated snapshot of the performance of contemporary machine-driven feature extraction techniques compared with conventional manual digitising. In our methodology, using an iterative approach of segmentation and classification, we demonstrated how to overcome the weakness of undesirable segments caused by intra-parcel and inter-parcel variability when using segmentation approaches for cadastral feature delineation. We also demonstrated how a geometric comparison framework can easily be implemented within Esri's ArcGIS software environment, and we firmly believe the developed methodology can be reproduced.
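The correctness and completeness figures reported above come from a buffer-overlay style comparison of extracted lines against reference boundaries. The following plain-Python sketch illustrates the idea only; it is a hypothetical, heavily simplified stand-in for the paper's ArcGIS workflow, and the function names, sampling step, and tolerance are our own assumptions:

```python
import math

def _pt_seg_dist(p, a, b):
    """Distance from point p to segment ab (all 2-tuples of coordinates)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0:                      # degenerate segment: a point
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length_sq))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def matched_fraction(lines, reference, tol, step=0.5):
    """Fraction of the total length of `lines` lying within `tol` map units
    of any `reference` segment. Polylines are lists of 2-tuple vertices;
    each segment is sampled roughly every `step` units."""
    matched = total = 0.0
    for line in lines:
        for a, b in zip(line, line[1:]):
            seg_len = math.hypot(b[0] - a[0], b[1] - a[1])
            n = max(1, int(seg_len / step))
            w = seg_len / n                 # length represented by each sample
            for i in range(n):
                t = (i + 0.5) / n
                p = (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
                total += w
                if any(_pt_seg_dist(p, r0, r1) < tol
                       for ref in reference for r0, r1 in zip(ref, ref[1:])):
                    matched += w
    return matched / total if total else 0.0

# Correctness: how much of the *extracted* linework matches the reference.
# Completeness: how much of the *reference* linework was recovered.
extracted = [[(0, 0), (10, 0)]]
reference = [[(0, 0.5), (10, 0.5)], [(0, 5), (10, 5)]]
correctness = matched_fraction(extracted, reference, tol=1.0)   # → 1.0
completeness = matched_fraction(reference, extracted, tol=1.0)  # → 0.5
```

Swapping the roles of the extracted and reference datasets is what distinguishes the two metrics: the same matching routine yields correctness in one direction and completeness in the other.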
Keywords: cadastral intelligence; manual digitisation; expert parameterisation; land administration; land management; automatic feature extraction; Object-Based Image Analysis

Nyandwi, E.; Koeva, M.; Kohli, D.; Bennett, R. Comparing Human Versus Machine-Driven Cadastral Boundary Feature Extraction. Remote Sens. 2019, 11, 1662.

