Open Access Article
An Effective High-Performance Multiway Spatial Join Algorithm with Spark
ISPRS Int. J. Geo-Inf. 2017, 6(4), 96; doi:10.3390/ijgi6040096
Abstract
Multiway spatial join plays an important role in GIS (Geographic Information Systems) and their applications. With the increase in spatial data volumes, the performance of multiway spatial join has hit a computational bottleneck in the context of big data. Parallel and distributed computing platforms, such as MapReduce and Spark, are promising for resolving this computation-intensive issue. Previous approaches have focused on developing single-threaded join algorithms and partition strategies optimized for parallel computing. In this paper, we present an effective high-performance multiway spatial join algorithm with Spark (MSJS) to overcome the multiway spatial join bottleneck. MSJS handles the problem through cascaded pairwise joins. Using the power of Spark, the formerly inefficient cascaded pairwise spatial join is transformed into a high-performance approach. Experiments using massive real-world data sets show that MSJS outperforms the existing parallel approaches to multiway spatial join described in the literature.
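As a rough illustration of the cascaded-pairwise idea, the sketch below joins three sets of minimum bounding rectangles (MBRs) in memory, requiring each newly joined MBR to overlap all previously joined ones (a clique-style join condition, assumed here for simplicity). This is a stand-in, not the paper's method: MSJS partitions the data and runs each pairwise step as a distributed Spark job, and a real spatial join would follow the MBR filter with an exact-geometry refinement step.

```python
# Minimal in-memory sketch of a cascaded pairwise multiway spatial join.
# Rectangles are (xmin, ymin, xmax, ymax) MBRs; all names are illustrative.

def intersects(a, b):
    """MBR intersection test (the filter step of a spatial join)."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def pairwise_join(left, right):
    """Join partial result tuples against one dataset (naive nested loop)."""
    out = []
    for tup in left:
        for r in right:
            if all(intersects(m, r) for m in tup):  # clique join condition
                out.append(tup + (r,))
    return out

def multiway_join(datasets):
    """Cascade pairwise joins: ((A >< B) >< C) >< ..."""
    result = [(m,) for m in datasets[0]]
    for ds in datasets[1:]:
        result = pairwise_join(result, ds)
    return result
```

In MSJS the intermediate `result` never materializes on one machine; each cascade step is itself a partitioned Spark join.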

Open Access Article
A Procedural Construction Method for Interactive Map Symbols Used for Disasters and Emergency Response
ISPRS Int. J. Geo-Inf. 2017, 6(4), 95; doi:10.3390/ijgi6040095
Abstract
The timely and accurate mapping of dynamic disasters and emergencies is an important task for supporting the decision-making that can improve the efficiency of rescue and response efforts. Existing emergency symbol libraries are primarily composed of point symbols and simple line symbols, focusing on the representation of disasters, related facilities, and operations. However, other response factors (e.g., the distribution and types of emergency forces) are also important for further decision-making and emergency response; there is a need to design complex and diverse symbols to represent this rich information. Moreover, traditional mapping systems only provide static map symbols that cannot be easily edited after creation, making it difficult to support interactive editing after the symbols are mapped and thus hindering the representation of dynamic disasters and response factors. This article addresses these issues by proposing a procedural construction method for interactive map symbols for dynamic disasters and emergency response. There are two primary contributions. First, an emergency response and decision symbol library was classified and integrated with the existing symbols to form a richer library for comprehensively representing disasters and emergencies. Second, an interactive procedural construction method for map symbols was designed based on (1) primitive geometric compositions and geometric graphics algorithms to construct the map symbol graphics; (2) an interactive graphics control and drawing-attribute configuration method to support user editing of the visual variables of the mapped symbols; and (3) a dynamic updating and drawing strategy to support the real-time refreshing of changing visual variables. The experiment was conducted using the Wenchuan earthquake as a case study, and the results demonstrate the strong capability of the produced interactive map symbols, which contribute to improving the mapping efficiency and representational power for disasters and emergency response.
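The procedural construction idea can be sketched as a symbol object that regenerates its geometry from editable drawing attributes; changing a visual variable triggers a redraw, which is what makes the symbol interactive rather than static. All class, method, and attribute names below are illustrative, not the article's actual API:

```python
# Hypothetical sketch of a procedurally constructed, editable map symbol.
import math

def circle(cx, cy, r, n=16):
    """Approximate a circle primitive by n polygon vertices."""
    return [(cx + r * math.cos(2 * math.pi * k / n),
             cy + r * math.sin(2 * math.pi * k / n)) for k in range(n)]

class Symbol:
    def __init__(self):
        self.attrs = {"radius": 1.0, "color": "red"}  # visual variables
        self.refresh()

    def refresh(self):
        """Dynamic redraw: rebuild geometry from current attributes."""
        self.geometry = circle(0.0, 0.0, self.attrs["radius"])

    def set_attr(self, key, value):
        """Interactive editing entry point: change an attribute, redraw."""
        self.attrs[key] = value
        self.refresh()
```

A richer symbol would compose several primitives (circles, arrows, hatched polygons) in `refresh`, but the update-on-edit cycle is the same.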

Open Access Article
Integration of GIS and Moving Objects in Surveillance Video
ISPRS Int. J. Geo-Inf. 2017, 6(4), 94; doi:10.3390/ijgi6040094
Abstract
This paper discusses the integration of a geographic information system (GIS) and moving objects in surveillance videos (“moving objects” hereinafter) by using motion detection, spatial mapping, and fusion representation techniques. This integration aims to overcome the limitations of conventional video surveillance systems, such as low efficiency in video searching, redundancy in video data transmission, and insufficient capability to position video content in geographic space. Furthermore, a model for integrating GIS and moving objects is established. The model includes a moving object extraction method and a fusion pattern for GIS and moving objects. From the established integration model, a prototype GIS and moving objects (GIS–MOV) system is constructed and used to analyze possible applications of the integration of GIS and moving objects.

Open Access Article
A Lightweight CUDA-Based Parallel Map Reprojection Method for Raster Datasets of Continental to Global Extent
ISPRS Int. J. Geo-Inf. 2017, 6(4), 92; doi:10.3390/ijgi6040092
Abstract
Geospatial transformations in the form of reprojection calculations for large datasets can be computationally intensive; as such, finding better, less expensive ways of performing these computations is desirable. In this paper, we report our efforts in developing a Compute Unified Device Architecture (CUDA)-based parallel algorithm to perform map reprojections for raster datasets on personal computers using Graphics Processing Units (GPUs). This algorithm has two unique features: (a) an output-space-based parallel processing strategy to handle transformations more rigorously, and (b) a chunk-based data decomposition method for the projected space in conjunction with an on-the-fly data retrieval mechanism to avoid memory overflow. To demonstrate the performance of our CUDA-based map reprojection approach, we conducted tests against the traditional serial version running on the Central Processing Unit (CPU). The results show speedup ratios ranging from 10 to 100 times across all test scenarios. The lessons learned from the tests are summarized.
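The output-space strategy can be sketched as an inverse-mapping loop: iterate over output pixels, invert each pixel coordinate back into the source raster, and sample there, so every output pixel gets exactly one value. In the paper this per-pixel loop is a CUDA kernel using true map-projection inverses and chunked data retrieval; the serial Python below, with a caller-supplied toy inverse transform, only illustrates the control flow:

```python
# Sketch of output-space (inverse-mapping) reprojection. `inverse` maps an
# output pixel (i, j) to fractional source pixel coordinates; here it is a
# stand-in for a real inverse map projection.

def reproject(src, src_shape, inverse):
    rows, cols = src_shape
    out = [[None] * cols for _ in range(rows)]
    for i in range(rows):            # in CUDA, one thread per output pixel
        for j in range(cols):
            si, sj = inverse(i, j)   # output pixel -> source location
            si, sj = int(round(si)), int(round(sj))
            if 0 <= si < rows and 0 <= sj < cols:  # nearest-neighbour sample
                out[i][j] = src[si][sj]
    return out
```

Because each output pixel is computed independently, the loop body parallelizes trivially across GPU threads.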

Open Access Article
A Novel k-Means Clustering Based Task Decomposition Method for Distributed Vector-Based CA Models
ISPRS Int. J. Geo-Inf. 2017, 6(4), 93; doi:10.3390/ijgi6040093
Abstract
More and more vector-based cellular automata (VCA) models have been built to leverage parallel computing to model rapidly changing cities and urban regions. During parallel simulation, common task decomposition methods based on space partitioning, e.g., grid partitioning (GRID) and recursive binary space partitioning (BSP), do not work well given the heterogeneity of VCA parcel tasks. To solve this problem, we propose a novel task decomposition method for distributed VCA models based on k-means clustering, named KCP. First, the polygon dataset is converted into centroid points that combine parcel size and the distance between parcels. A low-cost recursive quad-partition is then applied to determine the initial cluster centers based on parcel density. Finally, neighboring parcels are allocated to the same subdivision through k-means clustering. As a result, the proposed KCP method takes both the number of tasks and their computational complexity into consideration to achieve a well-balanced local workload. A typical urban VCA growth model was designed to evaluate the proposed KCP method against the traditional spatial partitioning methods, i.e., GRID and BSP. KCP had the shortest total simulation time of the three. During experimental urban growth simulations, the time spent on a single iteration was reduced by 15% relative to BSP and by 25% relative to GRID. The total simulation time with a 120 m neighborhood buffer size was reduced by more than one hour, to around three minutes, with 32 cores.
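The clustering step can be sketched as plain Lloyd-style k-means over parcel centroids, so that spatially neighboring parcels land on the same worker. This is only the core iteration under simplifying assumptions: KCP additionally seeds the initial centers by parcel density (via a recursive quad-partition) and balances computational cost, neither of which is modeled here.

```python
# Minimal k-means (Lloyd iterations) over parcel centroid points.
# `centers` are caller-supplied seeds; KCP would derive them from density.

def kmeans(points, centers, iters=10):
    groups = [[] for _ in centers]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:  # assign each parcel to its nearest center
            d = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers]
            groups[d.index(min(d))].append(p)
        centers = [  # move each center to the mean of its group
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
            if g else c
            for g, c in zip(groups, centers)
        ]
    return groups
```

Each returned group becomes one worker's task set; weighting points by parcel complexity would push the partition toward equal workload rather than equal counts.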

Open Access Article
Efficient Geometric Pruning Strategies for Continuous Skyline Queries
ISPRS Int. J. Geo-Inf. 2017, 6(3), 91; doi:10.3390/ijgi6030091
Abstract
The skyline query processing problem has been well studied for many years. The literature on skyline algorithms so far mainly considers static query points on static attributes. With the popular usage of mobile devices and the increasing number of mobile applications and users, continuous skyline query processing over both static and dynamic attributes has become more pressing. Existing efforts on supporting moving query points assume that the query point moves in only one direction at constant speed. In this paper, we propose continuous skyline computation over an incremental motion model: the query point moves incrementally in discrete time steps, with no restrictions on or predictability of its motion. Geometric properties of the incremental motion, captured by a kinetic data structure, are utilized to prune the data points that cannot appear in the final skyline result. Several geometric pruning strategies are proposed to reduce the querying dataset, and event-driven mechanisms are adopted to process continuous skyline queries. Extensive experiments with different data sets and parameters demonstrate that the proposed method is robust and more efficient than repeated snapshot queries with the I/O-optimal branch-and-bound skyline (BBS) algorithm.
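For reference, a snapshot skyline is defined by dominance: point p dominates q if p is no worse in every dimension and strictly better in at least one (smaller is better here). The naive computation below defines the result that a continuous method must maintain at each time step; the paper's contribution is pruning and event-driven maintenance so this full recomputation is avoided as the query point moves.

```python
# Naive snapshot skyline (O(n^2) dominance check), minimizing all dimensions.

def dominates(p, q):
    """True if p is <= q everywhere and < q somewhere."""
    return (all(a <= b for a, b in zip(p, q))
            and any(a < b for a, b in zip(p, q)))

def skyline(points):
    """Points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```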

Open Access Article
Estimation of 3D Indoor Models with Constraint Propagation and Stochastic Reasoning in the Absence of Indoor Measurements
ISPRS Int. J. Geo-Inf. 2017, 6(3), 90; doi:10.3390/ijgi6030090
Abstract
This paper presents a novel method for the prediction of building floor plans based on sparse observations in the absence of measurements. We derive the most likely hypothesis using a maximum a posteriori probability approach. Background knowledge, consisting of probability density functions of room shape and location parameters, is learned from training data. Relations between rooms and room substructures are represented by linear and bilinear constraints. We perform reasoning on different levels, providing a problem solution that is optimal with regard to the given information. In the first step, the problem is modeled as a constraint satisfaction problem. Constraint Logic Programming derives a solution which is topologically correct but suboptimal with regard to the geometric parameters. The search space is reduced using architectural constraints and browsed by intelligent search strategies which use domain knowledge. In the second step, graphical models are used for updating the initial hypothesis and refining its continuous parameters. We make use of Gaussian mixtures for model parameters in order to represent background knowledge and to get access to established methods for efficient and exact stochastic reasoning. We demonstrate our approach on different illustrative examples. Initially, we assume that floor plans are rectangular and that rooms are rectangles, and we discuss more general shapes afterwards. In a similar spirit, we predict door locations, providing further important components of 3D indoor models.

Open Access Article
Implementation of Algorithm for Satellite-Derived Bathymetry using Open Source GIS and Evaluation for Tsunami Simulation
ISPRS Int. J. Geo-Inf. 2017, 6(3), 89; doi:10.3390/ijgi6030089
Abstract
Accurate and high resolution bathymetric data are a necessity for a wide range of coastal oceanographic research topics. Active sensing methods, such as ship-based soundings and Light Detection and Ranging (LiDAR), are expensive and time-consuming solutions. Therefore, the significance of Satellite-Derived Bathymetry (SDB) has increased in the last ten years due to the availability of multi-constellation, multi-temporal, and multi-resolution remote sensing data as Open Data. Effective SDB algorithms have been proposed by many authors, but there is as yet no ready-to-use software module available in a Geographical Information System (GIS) environment. Hence, this study implements a Geographically Weighted Regression (GWR) based SDB workflow as a Geographic Resources Analysis Support System (GRASS) GIS module (i.image.bathymetry). Several case studies were carried out to examine the performance of the module on multi-constellation and multi-resolution satellite imagery for different study areas. The results indicate a strong correlation between SDB and reference depth. For instance, case study 1 (Puerto Rico, Northeastern Caribbean Sea) showed a coefficient of determination (R2) of 0.98 and a Root Mean Square Error (RMSE) of 0.61 m, case study 2 (Iwate, Japan) showed an R2 of 0.94 and an RMSE of 1.50 m, and case study 3 (Miyagi, Japan) showed an R2 of 0.93 and an RMSE of 1.65 m. The reference depths were acquired using LiDAR for case study 1 and an echo-sounder for case studies 2 and 3. Further, the estimated SDB was used as one of the inputs for the Australian National University and Geoscience Australia (ANUGA) tsunami simulation model. The tsunami simulation results also show close agreement with post-tsunami survey data. The i.image.bathymetry module developed as part of this study is made available as an extension of the open source GRASS GIS to facilitate wide use and future improvements.
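Empirical SDB of this family regresses known depths on a log ratio of two band reflectances and then predicts depth wherever only the imagery is available. The sketch below fits a single global least-squares line, which is a deliberate simplification: i.image.bathymetry fits a geographically weighted regression, i.e., a separate locally weighted fit at each location. Function names and the two-band form are illustrative.

```python
# Simplified (global, ordinary least-squares) band-ratio depth model.
# Real GWR-based SDB refits this relation locally with spatial weights.
import math

def fit_depth_model(blue, green, depth):
    """Fit depth ~ slope * ln(blue/green) + intercept; return a predictor."""
    x = [math.log(b / g) for b, g in zip(blue, green)]
    n = len(x)
    mx, my = sum(x) / n, sum(depth) / n
    slope = (sum((xi - mx) * (di - my) for xi, di in zip(x, depth))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return lambda b, g: slope * math.log(b / g) + intercept
```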

Open Access Article
Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data
ISPRS Int. J. Geo-Inf. 2017, 6(3), 88; doi:10.3390/ijgi6030088
Abstract
Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, their heterogeneity and sheer volume pose challenges to researchers seeking to identify useful information about events in the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace their development in real time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis’ visit to the US) in the New York City and Washington, DC regions. By investigating multiple features of Twitter data (message, author, time, and geographic location), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

Open Access Article
Distributed Temperature Measurement in a Self-Burning Coal Waste Pile through a GIS Open Source Desktop Application
ISPRS Int. J. Geo-Inf. 2017, 6(3), 87; doi:10.3390/ijgi6030087
Abstract
Geographical Information Systems (GIS) are often used to assess and monitor the environmental impacts caused by mining activities. The aim of this work was to develop a new application that produces dynamic maps for monitoring temperature variations in a self-burning coal waste pile within an open source GIS environment: GIS-ECOAL (freely available). The performance of the application was evaluated with distributed temperature measurements gathered in the S. Pedro da Cova (Portugal) coal waste pile. To obtain the temperature data, an optical fiber cable was deployed over the affected area of the pile, with 42 location stakes acting as precisely located control points for the temperature measurements. A one-month data set from July (15-min sampling interval) was fed into the application, and a video composed of several map layouts with temperature measurements was created, allowing two main areas with higher temperatures to be recognized. The field observations also allowed the identification of these zones; however, an area with higher temperatures at the top of the studied area could only be identified through the visualization of the images created by this application. The generated videos enable the dynamic and continuous visualization of the combustion process in the monitored area.

Open Access Article
User-Generated Geographic Information for Visitor Monitoring in a National Park: A Comparison of Social Media Data and Visitor Survey
ISPRS Int. J. Geo-Inf. 2017, 6(3), 85; doi:10.3390/ijgi6030085
Abstract
Protected area management and marketing require real-time information on visitors’ behavior and preferences. Thus far, visitor information has been collected mostly through repeated visitor surveys. Meanwhile, a wealth of content-rich geographic data is produced by users of different social media platforms. These data could potentially provide continuous information about people’s activities and interactions with the environment at different spatial and temporal scales. In this paper, we compare social media data with traditional survey data in order to map people’s activities and preferences, using the most popular national park in Finland, Pallas-Yllästunturi National Park, as a case study. We compare systematically collected survey data with the content of geotagged social media data and analyze: (i) where people go within the park; (ii) what their activities are; (iii) when people visit the park and whether there are temporal patterns in their activities; (iv) who the visitors are; (v) why people visit the national park; and (vi) what complementary information social media can provide in addition to the results from traditional surveys. The comparison of survey and social media data demonstrated that geotagged social media content provides relevant information about visitors’ use of the national park. As social media platforms are a dynamic source of data, they could complement and enrich traditional forms of visitor monitoring by providing more insight into emerging activities, temporal patterns of shared content, and mobility patterns of visitors. Potentially, geotagged social media data could also provide an overview of spatio-temporal activity patterns in other areas where systematic visitor monitoring is not taking place.

Open Access Article
Integration of Traffic Information into the Path Planning among Moving Obstacles
ISPRS Int. J. Geo-Inf. 2017, 6(3), 86; doi:10.3390/ijgi6030086
Abstract
This paper investigates the integration of traffic information (TI) into routing in the presence of moving obstacles. When traffic accidents occur, the incidents can generate different kinds of hazards (e.g., toxic plumes) that make certain parts of the road network inaccessible. At the same time, the first responders, who are responsible for managing the traffic incidents, need to be guided quickly and safely to the incident location. To support navigation in a traffic network affected by moving obstacles, we provide a spatio-temporal data model to structure the traffic-condition information essential for routing, and present an extended path planning algorithm, named MOAAstar–TI (Moving Obstacle Avoiding A* using Traffic Information), to generate routes avoiding the obstacles. A speed adjustment factor is introduced into the routing algorithm, allowing the integration of both vehicle information and traffic situations to generate routes that avoid the moving obstacles caused by the incidents. We applied our system to a set of navigation scenarios, and the results show its potential for future real-life applications.
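The speed adjustment factor can be sketched as follows: the travel time of an edge is its length divided by the maximum speed scaled by a time-dependent factor, and a factor of zero marks the edge as blocked by a hazard at that moment. The uniform-cost search below (Dijkstra-style; no heuristic) is a stand-in for the paper's MOAAstar–TI, which adds an A* heuristic and moving-obstacle prediction; all names and the graph shape are illustrative.

```python
# Time-dependent routing with a speed adjustment factor.
# graph: {node: [(neighbor, edge_length), ...]}
# factor(u, v, t): speed factor on edge (u, v) at time t; 0 means blocked.
import heapq

def route(graph, factor, start, goal, t0=0.0, v_max=1.0):
    pq, seen = [(t0, start, [start])], set()
    while pq:
        t, node, path = heapq.heappop(pq)  # earliest-arrival first
        if node == goal:
            return t, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, length in graph.get(node, []):
            f = factor(node, nxt, t)       # traffic-dependent speed factor
            if f > 0:                      # f == 0: edge currently blocked
                heapq.heappush(
                    pq, (t + length / (v_max * f), nxt, path + [nxt]))
    return None
```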

Open Access Article
A Dynamic Data Structure to Efficiently Find the Points below a Line and Estimate Their Number
ISPRS Int. J. Geo-Inf. 2017, 6(3), 82; doi:10.3390/ijgi6030082
Abstract
A basic question in computational geometry is how to find the relationship between a set of points and a line in the real plane. In this paper, we present multidimensional data structures for N points that allow answering the following queries for any given input line: (1) estimate in O(log N) time the number of points below the line; (2) return in O(log N + k) time the k ≤ N points that are below the line; and (3) return in O(log N) time the point that is closest to the line. We illustrate the utility of this computational question with GIS applications in air defense and traffic control.
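For concreteness, the O(N) brute-force semantics of the three queries for a line y = m·x + c are given below; they define what the paper's logarithmic-time data structure must return, not how it achieves that bound.

```python
# Reference (linear-scan) semantics of the three line queries.

def below(points, m, c):
    """All points strictly below the line y = m*x + c."""
    return [p for p in points if p[1] < m * p[0] + c]

def count_below(points, m, c):
    """Number of points below the line (the paper estimates this)."""
    return len(below(points, m, c))

def closest_to_line(points, m, c):
    """Point minimizing perpendicular distance |m*x - y + c| / sqrt(m^2+1);
    the constant denominator does not affect the argmin, so it is omitted."""
    return min(points, key=lambda p: abs(m * p[0] - p[1] + c))
```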

Open Access Article
A GIS-Based Evaluation of the Effectiveness and Spatial Coverage of Public Transport Networks in Tourist Destinations
ISPRS Int. J. Geo-Inf. 2017, 6(3), 83; doi:10.3390/ijgi6030083
Abstract
This article develops a methodology for evaluating the effectiveness and spatial coverage of public transport in tourist cities. The proposed methodology is applied and validated in the municipality of Cambrils, in the central part of the Costa Daurada in Catalonia, a coastal destination characterised by the concentration of tourism flows during summer. The application of GIS spatial analysis tools allows for the development of a system of territorial indicators that spatially correlate the public transport network and the distribution of the population. The main novelty of our work is that this analysis not only includes the registered resident population, but also incorporates the population that temporarily inhabits the municipality (tourists). The results of the study first permit the detection of unequal spatial accessibility and coverage of public transport in the municipality, with significant differences between central neighbourhoods and peripheral urban areas of lower population density. Second, they show how the degree of public transport coverage differs significantly in areas with a higher concentration of tourist accommodation establishments.

Open Access Article
Elastic Spatial Query Processing in OpenStack Cloud Computing Environment for Time-Constraint Data Analysis
ISPRS Int. J. Geo-Inf. 2017, 6(3), 84; doi:10.3390/ijgi6030084
Abstract
Geospatial big data analysis (GBDA) is extremely significant for time-constrained applications such as disaster response. However, time-constrained analysis is not yet a trivial task in the cloud computing environment. Spatial query processing (SQP) is typically computation-intensive and indispensable for GBDA, and spatial range, join, and nearest neighbor query algorithms are not scalable without MapReduce-like frameworks. Parallel SQP algorithms (PSQPAs) also suffer from skewed processing, a known issue in geoscience. To satisfy time-constrained GBDA, we propose an elastic SQP approach in this paper. First, Spark is used to implement PSQPAs. Second, Kubernetes-managed Core Operation System (CoreOS) clusters provide self-healing Docker containers for running Spark clusters in the cloud. Spark-based PSQPAs are submitted to the Docker containers where Spark master instances reside. Finally, the horizontal pod autoscaler (HPA) scales Docker containers out and in to provide on-demand computing resources. Combined with an auto-scaling group of virtual instances, HPA helps to find the five nearest neighbors for each of 46,139,532 query objects among 834,158 spatial data objects in less than 300 s. The experiments conducted on an OpenStack cloud demonstrate that auto-scaling containers can satisfy time-constrained GBDA in clouds.

Open Access Article
Assessing Crowdsourced POI Quality: Combining Methods Based on Reference Data, History, and Spatial Relations
ISPRS Int. J. Geo-Inf. 2017, 6(3), 80; doi:10.3390/ijgi6030080
Abstract
With the development of location-aware devices and the success and high use of Web 2.0 techniques, citizens are able to act as sensors by contributing geographic information. In this context, data quality is an important aspect that should be taken into account when using this source of data for different purposes. The goal of this paper is to analyze the quality of crowdsourced data and to study its evolution over time. We propose two types of approaches: (1) using the intrinsic characteristics of the crowdsourced datasets; or (2) evaluating crowdsourced Points of Interest (POIs) against external datasets (i.e., authoritative reference data or other crowdsourced datasets), with two different methods for each approach. The potential of combining these approaches is then demonstrated, to overcome the limitations associated with each individual method. In this paper, we focus on POIs and places coming from the very successful crowdsourcing project OpenStreetMap. The results show that the proposed approaches are complementary in assessing data quality. The positive results obtained for data matching show that analyzing data quality through automatic data matching is possible, but considerable effort and attention are needed for schema matching, given the heterogeneity of OSM and the representations used in authoritative datasets. For the features studied, change over time is sometimes due to disagreements between contributors, but in most cases the change improves the quality of the data.
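A toy version of the data matching step pairs a crowdsourced POI with a reference POI when the two are within a distance threshold and their names are similar. The thresholds, tuple layout, and string-similarity measure below are illustrative only; real OSM matching must also resolve schema differences (tags vs. feature classes), which is the hard part the paper emphasizes.

```python
# Toy POI matching: distance threshold + name similarity.
# POIs are (name, x, y) tuples in a common coordinate system (assumed).
import difflib
import math

def match_pois(osm, ref, max_dist=0.001, min_sim=0.6):
    pairs = []
    for name_a, xa, ya in osm:
        for name_b, xb, yb in ref:
            close = math.hypot(xa - xb, ya - yb) <= max_dist
            sim = difflib.SequenceMatcher(
                None, name_a.lower(), name_b.lower()).ratio()
            if close and sim >= min_sim:
                pairs.append((name_a, name_b))
    return pairs
```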

Open Access Article
Assessment of Wetland Ecosystem Health in the Yangtze and Amazon River Basins
ISPRS Int. J. Geo-Inf. 2017, 6(3), 81; doi:10.3390/ijgi6030081
Abstract
As “kidneys of the earth”, wetlands play an important role in ameliorating weather conditions, flood storage, and the control and reduction of environmental pollution. With the development of local economies, the wetlands in both the Amazon and Yangtze River Basins have been affected and threatened by human activities, such as urban expansion, reclamation of land from lakes, land degradation, and large-scale agricultural development. It is necessary and important to develop a wetland ecosystem health evaluation model and to quantitatively evaluate the wetland ecosystem health in these two basins. In this paper, GlobeLand30 land cover maps and socio-economic and climate data from 2000 and 2010 were adopted to assess the wetland ecosystem health of the Yangtze and Amazon River Basins on the basis of a pressure-state-response (PSR) model. A total of 13 indicators were selected to build the wetland health assessment system. Weights of these indicators and PSR model components, as well as normalized wetland health scores, were assigned and calculated based on the analytic hierarchy process method. The results showed that from 2000 to 2010, the value of the mean wetland ecosystem health index in the Yangtze River Basin decreased from 0.482 to 0.481, while it increased from 0.582 to 0.593 in the Amazon River Basin. This indicated that the average status of wetland ecosystem health in the Amazon River Basin is better than that in the Yangtze River Basin, and that wetland health improved over time in the Amazon River Basin but worsened in the Yangtze River Basin.
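The PSR scoring step reduces to nested weighted sums: each component (pressure, state, response) is a weighted sum of its normalized indicators, and the health index is a weighted sum of the three components. The weights below are placeholders; the paper derives the actual weights with the analytic hierarchy process, and each weight set sums to 1.

```python
# PSR health-index aggregation as nested weighted sums (placeholder weights).

def weighted_sum(values, weights):
    assert abs(sum(weights) - 1.0) < 1e-9  # AHP-derived weights sum to 1
    return sum(v * w for v, w in zip(values, weights))

def health_index(psr_indicators, indicator_weights, component_weights):
    """psr_indicators: [pressure, state, response] lists of normalized
    indicator values in [0, 1]; indicator_weights mirrors that shape."""
    components = [weighted_sum(v, w)
                  for v, w in zip(psr_indicators, indicator_weights)]
    return weighted_sum(components, component_weights)
```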

Open Access Article
An Integrated Approach for Monitoring and Information Management of the Guanling Landslide (China)
ISPRS Int. J. Geo-Inf. 2017, 6(3), 79; doi:10.3390/ijgi6030079 -
Abstract
Landslides triggered by earthquakes or rainstorms often result in serious property damage and human casualties. It is therefore necessary to establish an emergency management system to facilitate damage assessment and decision-making. This paper presents an integrated approach for mapping and analyzing the spatial features of a landslide from remote sensing images and Digital Elevation Models (DEMs). Several image interpretation tools are provided for analyzing the spatial distribution and characteristics of the landslide in different dimensions: (1D) terrain variation analysis along the mass movement direction and (3D) morphological analysis. In addition, the results of image interpretation can be further discussed and adjusted on an online collaboration platform, which was built to improve coordination among all players involved in the different phases of emergency management, e.g., hazard experts, emergency managers, and first response organizations. A mobile application has also been developed to enhance data exchange and on-site investigation. Our pilot study of the Guanling landslide shows that the presented approach has the potential to facilitate landslide monitoring and information management, e.g., hazard assessment, emergency preparedness, mitigation planning, and response.
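The 1D terrain-variation analysis mentioned above can be illustrated by sampling a DEM along the mass-movement direction and reading off the elevation profile. A minimal sketch with a tiny synthetic DEM (the grid values, line endpoints, and nearest-neighbour sampling are assumptions for illustration, not the paper's implementation):

```python
# Minimal sketch of 1D terrain-variation analysis along a landslide's
# mass-movement direction: sample a DEM grid along a straight line and
# report the elevation profile. The DEM and line endpoints are hypothetical.

def profile_along_line(dem, start, end, n_samples):
    """Sample elevations (nearest-neighbour) at n_samples evenly spaced
    points from start=(row, col) to end=(row, col) on a 2D DEM grid."""
    samples = []
    for i in range(n_samples):
        t = i / (n_samples - 1)
        r = round(start[0] + t * (end[0] - start[0]))
        c = round(start[1] + t * (end[1] - start[1]))
        samples.append(dem[r][c])
    return samples

# A tiny synthetic DEM (elevations in metres) sloping downhill to the east.
dem = [
    [120, 110, 100, 90],
    [118, 108,  98, 88],
    [115, 105,  95, 85],
]
profile = profile_along_line(dem, start=(0, 0), end=(2, 3), n_samples=4)
total_drop = profile[0] - profile[-1]
print(profile, total_drop)
```

A production version would use bilinear interpolation and the DEM's georeferencing to convert sample spacing into metres along the slope.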

Open Access Article
On Data Quality Assurance and Conflation Entanglement in Crowdsourcing for Environmental Studies
ISPRS Int. J. Geo-Inf. 2017, 6(3), 78; doi:10.3390/ijgi6030078 -
Abstract
Volunteered geographic information (VGI), whether from citizen science or the mining of social media, has proven useful in various domains, including natural hazards, health status, disease epidemics, and biological monitoring. Nonetheless, the variable or unknown data quality that results from crowdsourcing settings is still an obstacle to fully integrating these data sources into environmental studies and, potentially, into policy making. The data curation process, in which quality assurance (QA) is needed, is often driven by the direct usability of the collected data within a process of data conflation or data fusion (DCDF), which combines the crowdsourced data into one view, potentially using other data sources as well. Looking at current practices in VGI data quality and using two examples, namely land cover validation and inundation extent estimation, this paper discusses the close links between QA and DCDF. It aims to help in deciding whether disentangling the two is possible, and whether it is beneficial, for understanding the data curation process and for future usage of crowdsourced data. Analysing situations throughout the data curation process where entanglement between QA and DCDF occurs, the paper explores the various facets of VGI data capture, data quality assessment, and their purposes. Far from rejecting the usability ISO quality criterion, the paper advocates decoupling the QA process from the DCDF step as much as possible, while still integrating them within an approach analogous to a Bayesian paradigm.
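The advocated decoupling can be sketched in the inundation-extent setting: a separate QA step first assigns each contributor a reliability score, and the conflation step then consumes those scores as likelihoods in a Bayesian update, without the two stages being entangled. All names and numbers below are hypothetical illustrations, not the paper's method:

```python
# Hypothetical sketch of decoupling QA from conflation in a Bayesian-style
# fusion: QA assigns each volunteer a reliability score; conflation then
# updates the probability that a location is flooded from their reports,
# using those reliabilities as likelihoods. All values are illustrative.

def fuse_reports(prior, reports):
    """reports: list of (says_flooded: bool, reliability: float).
    Returns the posterior probability that the location is flooded."""
    p = prior
    for says_flooded, reliability in reports:
        # Likelihood of this report given flooded / not flooded.
        like_f = reliability if says_flooded else 1 - reliability
        like_n = 1 - reliability if says_flooded else reliability
        p = p * like_f / (p * like_f + (1 - p) * like_n)
    return p

# QA step (kept separate from fusion): per-volunteer reliabilities,
# e.g. estimated from past agreement with authoritative data.
reports = [(True, 0.9), (True, 0.7), (False, 0.6)]
print(round(fuse_reports(0.5, reports), 3))
```

The design point is that reliability estimation (QA) and evidence combination (DCDF) communicate only through the reliability scores, so either stage can be replaced independently.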

Open Access Article
Applicability Analysis of VTEC Derived from the Sophisticated Klobuchar Model in China
ISPRS Int. J. Geo-Inf. 2017, 6(3), 75; doi:10.3390/ijgi6030075 -
Abstract
Although the Klobuchar model is widely used in single-frequency GPS receivers, it cannot effectively correct the ionospheric delay: it sets the night-time ionospheric delay to a constant and therefore cannot reflect temporal changes at night. In this study, 2011 observation data from seventeen International Global Navigation Satellite System Service (IGS) stations within and around China, provided by the IGS center, are used to calculate Total Electron Content (TEC) values with both the Klobuchar model and the dual-frequency model. The Holt–Winters exponential smoothing model is used to forecast the error between the Klobuchar model and the dual-frequency model on the 7th day from the errors of the preceding six days. These forecasts are used to develop the sophisticated Klobuchar model for the case in which no epochs are missing. Considering that, in practice, some observation data may be missing, which weakens the relationship between epochs, we also study the applicability of the sophisticated Klobuchar model when observation data are missing: we randomly delete the observation data of some epochs, calculate TEC values with the Klobuchar model, and use a cubic spline curve to restore the missing TEC values. Finally, we develop the sophisticated Klobuchar model for the case in which N epochs are missing in China, and compare it with the dual-frequency model.
The experimental results reveal the following: (1) the sophisticated Klobuchar model corrects the ionospheric delay significantly better than the Klobuchar model; (2) the sophisticated Klobuchar model can reflect the temporal evolution of the ionosphere, particularly at night, with the correction results improving with increasing latitude; and (3) the sophisticated Klobuchar model achieves remarkable correction results when N epochs are missing, with the correction results being nearly as good as those of the dual-frequency model when no epochs are missing.
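The forecasting step described above can be sketched with Holt's linear trend method (the non-seasonal member of the Holt–Winters family): smooth six days of Klobuchar-vs-dual-frequency error and extrapolate to day 7. The smoothing parameters and the six-day error series below are hypothetical, not the paper's values:

```python
# Minimal sketch (pure Python, illustrative values) of the forecasting idea:
# smooth six days of model error with Holt's linear trend method and
# forecast the 7th day. Parameters and error series are hypothetical.

def holt_forecast(series, alpha=0.5, beta=0.3, steps=1):
    """Holt's linear trend smoothing; returns the steps-ahead forecast."""
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        new_level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return level + steps * trend

# Six days of Klobuchar-vs-dual-frequency error (TECU, hypothetical);
# forecast the 7th day.
errors = [2.1, 2.4, 2.2, 2.6, 2.8, 2.7]
day7 = holt_forecast(errors)
print(round(day7, 3))
```

A fuller reproduction would add the seasonal Holt–Winters term for the diurnal TEC cycle and, for the missing-epoch case, restore gaps with a cubic spline (e.g., `scipy.interpolate.CubicSpline`) before forecasting.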