Open Access Article

A Hierarchical Urban Forest Index Using Street-Level Imagery and Deep Learning

1 Data Science Campus, Office for National Statistics, Newport NP10 8XG, UK
2 Geographic Data Science Lab, Department of Geography & Planning, University of Liverpool, Liverpool L69 7ZT, UK
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(12), 1395; https://doi.org/10.3390/rs11121395
Received: 29 April 2019 / Revised: 1 June 2019 / Accepted: 5 June 2019 / Published: 12 June 2019
(This article belongs to the Special Issue Remote Sensing of Urban Forests)
Abstract

We develop a method based on computer vision and a hierarchical multilevel model to derive an Urban Street Tree Vegetation Index which aims to quantify the amount of vegetation visible from the point of view of a pedestrian. Our approach unfolds in two steps. First, areas of vegetation are detected within street-level imagery using a state-of-the-art deep neural network model. Second, information from several images is combined to derive an aggregated indicator at the area level using a hierarchical multilevel model. The comparative performance of our proposed approach is demonstrated against a widely used image segmentation technique based on a pre-labelled dataset. The approach is deployed to a real-world scenario for the city of Cardiff, Wales, using Google Street View imagery. Based on more than 200,000 street-level images, an urban tree street-level indicator is derived to measure the spatial distribution of tree cover, accounting for the presence of obstructing objects in images, at the Lower Layer Super Output Area (LSOA) level, the administrative geography most commonly used for policy-making in the United Kingdom. The results show a high degree of correspondence between our street-level tree score and aerial tree cover estimates. They also show that our tree score yields more accurate estimates from a pedestrian perspective by more appropriately capturing tree cover in areas with large burial, woodland, formal open and informal open spaces where shallow trees are abundant, in high-density residential areas with backyard trees, and along street networks with a high density of tall trees. The proposed approach is scalable and automatable. It can be applied to cities across the world and provides robust estimates of urban trees to advance our understanding of the links between mental health, well-being, green space and air pollution.
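
To make the two-step pipeline concrete, the sketch below is a simplification, not the authors' implementation. It assumes a pretrained semantic segmentation network with a vegetation class has already labelled each image; it then computes a per-image vegetation fraction and pools images within LSOAs using a random-intercept multilevel model fitted with statsmodels. The vegetation label id, the LSOA codes and the simulated fractions are hypothetical placeholders.

# Minimal sketch (not the authors' implementation). Assumes a pretrained semantic
# segmentation network has already produced a per-pixel class mask for each image;
# the vegetation label id, LSOA codes and simulated fractions are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

VEGETATION_CLASS = 8  # assumed label id for the "vegetation" class in the mask


def vegetation_fraction(label_mask: np.ndarray) -> float:
    """Share of pixels classified as vegetation in one street-level image."""
    return float((label_mask == VEGETATION_CLASS).mean())


# Step 1 would apply vegetation_fraction() to every segmented image; here we
# simulate per-image fractions for three hypothetical Cardiff LSOAs.
rng = np.random.default_rng(0)
lsoas = ["W01001945", "W01001946", "W01001947"]
images = pd.DataFrame({
    "lsoa": np.repeat(lsoas, 20),
    "veg_frac": np.clip(
        rng.normal(loc=[0.30, 0.15, 0.05], scale=0.05, size=(20, 3)).T.ravel(), 0, 1
    ),
})

# Step 2: random-intercept multilevel model with images nested in LSOAs. The
# area-level score is the fixed intercept plus each LSOA's random effect, which
# shrinks estimates from areas with few usable images towards the overall mean.
model = smf.mixedlm("veg_frac ~ 1", images, groups=images["lsoa"]).fit()
area_score = {
    lsoa: model.fe_params["Intercept"] + effect["Group"]
    for lsoa, effect in model.random_effects.items()
}
print(area_score)

In the full pipeline described in the abstract, image-level covariates (for example, flags for obstructing vehicles or street furniture) could enter the fixed-effects part of the model so that the area-level score is adjusted for obstructions rather than being a simple mean of raw vegetation fractions.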
Keywords: urban forestry; green space; street-level imagery; deep learning; image segmentation
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style

Stubbings, P.; Peskett, J.; Rowe, F.; Arribas-Bel, D. A Hierarchical Urban Forest Index Using Street-Level Imagery and Deep Learning. Remote Sens. 2019, 11, 1395.
