Article

Urban Building Type Mapping Using Geospatial Data: A Case Study of Beijing, China

1 Department of Geological and Atmospheric Sciences, Iowa State University, Ames, IA 50011, USA
2 Department of Geography, University of Tennessee, Knoxville, TN 37996, USA
3 Laboratory for Remote Sensing and Environmental Change (LRSEC), Department of Geography and Earth Sciences, University of North Carolina at Charlotte, Charlotte, NC 28223, USA
4 School of Remote Sensing and Information Engineering, Wuhan University, Wuhan 430079, China
5 State Key Laboratory of Information Engineering in Surveying, Mapping, and Remote Sensing, Wuhan University, Wuhan 430079, China
6 School of Geographic Sciences, East China Normal University, Shanghai 200241, China
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(17), 2805; https://doi.org/10.3390/rs12172805
Received: 2 July 2020 / Revised: 20 August 2020 / Accepted: 27 August 2020 / Published: 29 August 2020
(This article belongs to the Special Issue Urban Land Use Mapping and Analysis in the Big Data Era)
Information on building types is in high demand for urban planning and management, especially for high-resolution building modeling in which buildings are the basic spatial unit. However, in many parts of the world, this information is still missing. In this paper, we propose a framework to derive building type information from geospatial data, including point-of-interest (POI) data, building footprints, land use polygons, and roads, obtained from Gaode and Baidu Maps. First, we used natural language processing (NLP)-based approaches (i.e., text similarity measurement and topic modeling) to automatically reclassify POI categories into ones that can be used to directly infer building types. Second, based on the relationship between building footprints and POIs, we identified building types using two indicators: type ratio and area ratio. The proposed framework was tested using over 440,000 building footprints in Beijing, China. Our NLP-based approaches and building type identification methods show overall accuracies of 89.0% and 78.2%, and kappa coefficients of 0.83 and 0.71, respectively. The proposed framework is transferable to other Chinese cities for deriving building type information from web mapping platforms. The data products generated from this study are of great use for quantitative urban studies at the building level.
Keywords: urban building type; point-of-interest data; POI; Beijing; natural language processing
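The second step of the framework, assigning a type to each footprint from the POIs it contains, can be sketched in Python. This is a minimal illustration only: the function name, the 0.5 threshold, and the use of a simple majority share as the "type ratio" are assumptions for demonstration, not the paper's exact definitions, and the area-ratio indicator is omitted.

```python
from collections import Counter

def infer_building_type(poi_categories, type_ratio_threshold=0.5):
    """Assign a building type from the reclassified POIs inside one footprint.

    poi_categories: list of POI category labels for a single building.
    type_ratio_threshold: hypothetical cutoff on the dominant category's share;
    the paper's actual indicator definitions and thresholds may differ.
    """
    if not poi_categories:
        return "unclassified"  # no POIs fall inside this footprint
    counts = Counter(poi_categories)
    top_type, top_count = counts.most_common(1)[0]
    type_ratio = top_count / len(poi_categories)  # share of the dominant category
    if type_ratio >= type_ratio_threshold:
        return top_type
    return "mixed"  # no single category dominates

# A footprint containing mostly residential POIs
print(infer_building_type(["residential", "residential", "commercial"]))
```

In practice the POI-to-footprint assignment would come from a point-in-polygon test against the building footprint layer before this per-building aggregation step.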
MDPI and ACS Style

Chen, W.; Zhou, Y.; Wu, Q.; Chen, G.; Huang, X.; Yu, B. Urban Building Type Mapping Using Geospatial Data: A Case Study of Beijing, China. Remote Sens. 2020, 12, 2805. https://doi.org/10.3390/rs12172805

AMA Style

Chen W, Zhou Y, Wu Q, Chen G, Huang X, Yu B. Urban Building Type Mapping Using Geospatial Data: A Case Study of Beijing, China. Remote Sensing. 2020; 12(17):2805. https://doi.org/10.3390/rs12172805

Chicago/Turabian Style

Chen, Wei, Yuyu Zhou, Qiusheng Wu, Gang Chen, Xin Huang, and Bailang Yu. 2020. "Urban Building Type Mapping Using Geospatial Data: A Case Study of Beijing, China" Remote Sensing 12, no. 17: 2805. https://doi.org/10.3390/rs12172805

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
