Open Access Article
Remote Sens. 2018, 10(10), 1626; https://doi.org/10.3390/rs10101626

Towards Operational Satellite-Based Damage-Mapping Using U-Net Convolutional Network: A Case Study of 2011 Tohoku Earthquake-Tsunami

International Research Institute of Disaster Science, Tohoku University, Aoba 468-1-E301, Aramaki, Aoba-ku, Sendai 980-8572, Japan
* Author to whom correspondence should be addressed.
Received: 1 September 2018 / Revised: 3 October 2018 / Accepted: 5 October 2018 / Published: 12 October 2018
(This article belongs to the Section Remote Sensing Image Processing)

Abstract

Satellite remote-sensing-based damage mapping has played an indispensable role in rapid disaster response, yet current practice remains limited by low damage-assessment accuracy and a lag in timeliness, which greatly reduces the feasibility of extending present methods to operational applications. A highly efficient and intelligent remote-sensing image-processing framework is therefore urgently required. In this article, a deep learning algorithm for the semantic segmentation of high-resolution remote-sensing images using the U-Net convolutional network is proposed to map damage rapidly. The algorithm was implemented with the Microsoft Cognitive Toolkit on the GeoAI platform provided by Microsoft. The study takes the 2011 Tohoku Earthquake-Tsunami as a case study, using pre- and post-disaster high-resolution WorldView-2 images. The performance of the proposed U-Net model is compared with that of a deep residual U-Net; the comparison highlights the superiority of U-Net for tsunami damage mapping in this work. The proposed method achieves an overall accuracy of 70.9% in classifying damage into "washed away," "collapsed," and "survived" at the pixel level. In future disaster scenarios, the model can generate a damage map in approximately 2–15 min once the preprocessed remote-sensing datasets are available. The proposed framework substantially improves the value of damage mapping in operational disaster response by reducing the manual steps required during an actual disaster. Moreover, the framework is flexible enough to extend to other scenarios and disaster types, which can further accelerate operational disaster response.
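To make the segmentation setup concrete, the sketch below shows a minimal U-Net of the kind the abstract describes: an encoder-decoder with skip connections producing per-pixel logits over the three damage classes. This is an illustration only, not the authors' implementation (the paper used the Microsoft Cognitive Toolkit); the layer widths, network depth, and the assumption of an 8-band input formed by stacking 4-band pre- and post-disaster WorldView-2 tiles are hypothetical choices.

```python
# Minimal, illustrative U-Net for 3-class damage mapping ("washed away",
# "collapsed", "survived"). A PyTorch sketch, NOT the authors' CNTK code;
# channel counts, depth, and the 8-band pre/post input stack are assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    )

class MiniUNet(nn.Module):
    def __init__(self, in_channels=8, num_classes=3):
        super().__init__()
        self.enc1 = conv_block(in_channels, 32)
        self.enc2 = conv_block(32, 64)
        self.bottleneck = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = conv_block(128, 64)   # 64 skip channels + 64 upsampled
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)    # 32 skip channels + 32 upsampled
        self.head = nn.Conv2d(32, num_classes, kernel_size=1)  # per-pixel logits

    def forward(self, x):
        e1 = self.enc1(x)                    # full resolution
        e2 = self.enc2(self.pool(e1))        # 1/2 resolution
        b = self.bottleneck(self.pool(e2))   # 1/4 resolution
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                 # (N, 3, H, W) class scores

# Example: one 256x256 tile of stacked pre/post-event imagery.
model = MiniUNet()
logits = model(torch.randn(1, 8, 256, 256))
damage_map = logits.argmax(dim=1)  # per-pixel damage-class labels
```

The skip connections are what make this architecture suited to damage mapping at the pixel level: they carry fine spatial detail from the encoder directly to the decoder, so building-scale boundaries survive the downsampling that the deeper layers need for context.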
Keywords: semantic segmentation; U-net convolutional neural network; operational damage-mapping; 2011 Tohoku earthquake and tsunami; Microsoft Cognitive Toolkit

Graphical abstract

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite This Article

MDPI and ACS Style

Bai, Y.; Mas, E.; Koshimura, S. Towards Operational Satellite-Based Damage-Mapping Using U-Net Convolutional Network: A Case Study of 2011 Tohoku Earthquake-Tsunami. Remote Sens. 2018, 10, 1626.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
