Open Access Article
Evolving Spatial Data Infrastructures and the Role of Adaptive Governance
ISPRS Int. J. Geo-Inf. 2017, 6(8), 254; doi:10.3390/ijgi6080254
Abstract
Spatial data infrastructures (SDIs) are becoming more mature worldwide. However, despite this growing maturity, longitudinal research on the governance of SDIs is rare. The current research examines the governance history of two SDIs in the Netherlands and Flanders (Belgium). Both represent decades-long undertakings to create a large-scale base map. During these processes, SDI governance changed, often quite radically. We analyse written accounts from geo-information industry magazines to determine whether the SDI governance of these two base maps can be considered adaptive. We conclude that SDI governance was adaptive, as it changed considerably during the evolution of the two SDIs. However, we also find that most governance models did not hold up very long, as they either did not meet their goals, did not satisfy all stakeholders, or were not in alignment with new visions and ideas. In recent years, the policy instruments governing these base maps have become increasingly diverse; in particular, more hierarchical instruments were introduced. Indeed, governance scholars increasingly agree that governance can respond better to changes when a broader mix of policy instruments is applied. Alas, this does not make SDI governance any less complex.
Open Access Article
Estimation of Travel Time Distributions in Urban Road Networks Using Low-Frequency Floating Car Data
ISPRS Int. J. Geo-Inf. 2017, 6(8), 253; doi:10.3390/ijgi6080253
Abstract
Travel times in urban road networks are highly stochastic. However, most existing travel time estimation methods only estimate mean travel times while ignoring travel time variances. To this end, this paper proposes a robust travel time distribution estimation method that estimates both the mean and variance of travel times using emerging low-frequency floating car data. Unlike existing studies, the path travel time distribution in this study is formulated as the sum of deterministic link travel times and stochastic turning delays at intersections. Using this formulation, distinct travel time delays for different turning movements at the same intersection can be well captured. A speed estimation algorithm is developed to estimate the deterministic link travel times, and a distribution estimation algorithm is proposed to estimate the stochastic turning delays. Considering the low sampling rate of the floating car data, a weighted moving average algorithm is further developed for a robust estimation of the path travel time distribution. A real-world case study in Wuhan, China, is carried out to validate the applicability of the proposed method. The results show that the proposed method obtains a reliable and accurate estimation of path travel time distributions in congested urban road networks.
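The decomposition described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes turning delays are mutually independent (so means and variances simply add) and all numbers are invented for the example.

```python
# Sketch of the decomposition: path travel time = deterministic link travel
# times + stochastic turning delays at intersections. Under an independence
# assumption, the path mean is the sum of means and the path variance is the
# sum of the turning-delay variances (links are deterministic, so they add
# nothing to the variance).

def path_time_distribution(link_times, turn_delays):
    """link_times: deterministic seconds per link.
    turn_delays: (mean, variance) pairs, one per turning movement."""
    mean = sum(link_times) + sum(m for m, _ in turn_delays)
    var = sum(v for _, v in turn_delays)
    return mean, var

mean, var = path_time_distribution(
    link_times=[42.0, 35.5, 58.0],            # three links on the path
    turn_delays=[(12.0, 25.0), (20.0, 64.0)]  # two signalised turns
)
print(mean, var)  # 167.5 89.0
```

The paper additionally distinguishes delays by turning movement and smooths the estimates with a weighted moving average; this sketch only shows how link times and turn delays combine into a path distribution.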
Open Access Article
Template Matching and Simplification Method for Building Features Based on Shape Cognition
ISPRS Int. J. Geo-Inf. 2017, 6(8), 250; doi:10.3390/ijgi6080250
Abstract
This study proposes a template-matching simplification method from the perspective of shape cognition, based on the typical template characteristics of building distributions and representations. The method first formulates a series of templates to abstract building shapes by generalizing their polygons and analyzing their symbolic meanings, then conducts the simplification by searching for and matching the most similar template, which later replaces the original building. While satisfying individual geometric accuracy at smaller scales, the proposed method can enhance the impression of well-known landmarks and reflect the pattern of mapped areas through the symbolic template. The turning function, which describes shape by measuring the change of the tangent angle as a function of arc length, is employed to compute the similarity distance between buildings and template polygons, and a least squares model is used to control the geometric matching of the candidate template. Experiments on real datasets are carried out to assess the usefulness of this method and compare it with two existing methods. The experiments suggest that our method preserves both the main structure of building shapes and geometric accuracy.
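The turning function mentioned above can be made concrete with a short sketch. This is a minimal illustration under simplifying assumptions (a single closed polygon, cumulative turn recorded at each vertex), not the paper's matching pipeline; the square polygon is an invented example.

```python
import math

# Minimal turning-function sketch: for a closed polygon, record the
# cumulative tangent-angle change as a step function of normalised arc
# length. Template matching would then compare these step functions
# between a building outline and each candidate template.

def turning_function(poly):
    """poly: list of (x, y) vertices of a closed polygon (counter-clockwise)."""
    n = len(poly)
    edges = [(poly[(i + 1) % n][0] - poly[i][0],
              poly[(i + 1) % n][1] - poly[i][1]) for i in range(n)]
    lengths = [math.hypot(dx, dy) for dx, dy in edges]
    total = sum(lengths)
    angle, s, steps = 0.0, 0.0, []
    for i in range(n):
        steps.append((s / total, angle))  # (arc-length fraction, cum. turn)
        s += lengths[i]
        a0 = math.atan2(edges[i][1], edges[i][0])
        a1 = math.atan2(edges[(i + 1) % n][1], edges[(i + 1) % n][0])
        # Signed turn at the vertex, wrapped into (-pi, pi].
        angle += (a1 - a0 + math.pi) % (2 * math.pi) - math.pi

    return steps

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(turning_function(square))
```

For the unit square, each vertex contributes a left turn of π/2, so the step function rises by π/2 at arc-length fractions 0.25, 0.5, and 0.75. A similarity distance would integrate the difference between two such step functions.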
Open Access Article
Event-Driven Distributed Information Resource-Focusing Service for Emergency Response in Smart City with Cyber-Physical Infrastructures
ISPRS Int. J. Geo-Inf. 2017, 6(8), 251; doi:10.3390/ijgi6080251
Abstract
The smart city has become a popular topic of investigation. How to focus large amounts of distributed information resources to cope efficiently with public emergencies and support personalized decision-making is a vitally important issue in the construction of smart cities. In this paper, an event-driven focusing service (EDFS) method that uses cyber-physical infrastructures for emergency response in smart cities is proposed. The method consists of a focusing service model at the top level, an informational representation of the model, and a focusing service process that operates the service model during emergency response. The focusing service follows an event-driven mechanism that allows the process to be triggered by public emergencies sensed by wireless sensor networks (WSNs) and mobile crowd sensing, and it integrates the requirements of different societal entities with regard to emergency response and information resources, thereby providing comprehensive and personalized support for decision-making. Furthermore, an EDFS prototype system is designed and implemented based on the proposed method. An experiment using a real-world scenario, the gas leakage of August 2014 in Taiyuan, China, is presented, demonstrating the feasibility of the proposed method for assisting various societal entities in coping with and responding efficiently to public emergencies.
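The event-driven mechanism described above can be illustrated with a toy publish/subscribe sketch. The class, event name, and handler are all invented for illustration; the actual EDFS model is far richer (service model, informational representation, focusing process).

```python
# Toy illustration of the event-driven idea: a sensed emergency event
# triggers whichever focusing-service handlers are registered for it.

class EventBus:
    def __init__(self):
        self.handlers = {}

    def subscribe(self, event, handler):
        # Register a handler to run when `event` is published.
        self.handlers.setdefault(event, []).append(handler)

    def publish(self, event, payload):
        # Run every handler registered for the event; unknown events are no-ops.
        return [h(payload) for h in self.handlers.get(event, [])]

bus = EventBus()
bus.subscribe("gas_leak", lambda p: f"focus resources near {p['district']}")
print(bus.publish("gas_leak", {"district": "Taiyuan"}))
```

In the paper's setting, the publishers would be WSN and mobile crowd-sensing feeds, and the handlers would assemble the information resources each societal entity needs.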
Open Access Article
A Novel Approach for Publishing Linked Open Geodata from National Registries with the Use of Semantically Annotated Context Dependent Web Pages
ISPRS Int. J. Geo-Inf. 2017, 6(8), 252; doi:10.3390/ijgi6080252
Abstract
Many of the standards used to build spatial data infrastructure (SDI), such as Web Map Service (WMS) or Web Feature Service (WFS), have become outdated. They do not follow current web technology development and do not fully exploit its capabilities. Spatial data often remain available only through application programming interfaces (APIs), reflecting the persistence of organizational silos. Exploiting the web's potential for discovering knowledge hidden in data through integration and fusion therefore remains very difficult. This article presents a strategy for taking advantage of newer semantic web technologies in SDI. We describe the implementation of a public registry in the age of Web 3.0. Our goal is to convert existing geographic information systems (GIS) data into explicit knowledge that can easily be used for a variety of purposes. This turns SDI into a framework that utilizes the many advantages of the web. In this paper we present the working prototype system developed for the province of Mazowieckie in Poland and describe the underlying concepts. Further development of this approach comes from combining linked data (LD) with expert systems to support analysis functions and tasks.
Open Access Article
Evaluating the Impact of Meteorological Factors on Water Demand in the Las Vegas Valley Using Time-Series Analysis: 1990–2014
ISPRS Int. J. Geo-Inf. 2017, 6(8), 249; doi:10.3390/ijgi6080249
Abstract
Many factors impact a city’s water consumption, including population distribution, average household income, water prices, water conservation programs, and climate. Of these, however, meteorological effects are considered to be the primary determinants of water consumption. In this study, the effects of climate on residential water consumption in Las Vegas, Nevada, were examined for the period from 1990 to 2014. The investigation found that climatic variables including maximum temperature, minimum temperature, average temperature, precipitation, diurnal temperature range, dew point depression, wind speed, wind direction, and percent of calm wind influenced water use. A multivariate autoregressive integrated moving average (ARIMAX) model showed that historical water consumption and dew point depression explain the highest percentage of variance in water use (98.88%) when dew point depression is used as the explanatory variable. Our results indicate that the ARIMAX model with dew point depression as input, together with average temperature, plays a significant role in predicting long-term water consumption in Las Vegas. The sensitivity analysis also shows that changes in average temperature impact water demand three times more than dew point depression. The forecasting accuracy of the model, measured as the mean absolute percentage error (MAPE), is found to be about 2–3% at five years out. This study can be adapted and utilized for the long-term forecasting of water demand in other regions. By using one significant climate factor together with historical water demand, the ARIMAX model produces forecasts with high accuracy and provides an effective technique for monitoring the effects of climate change on water demand in the area.
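The accuracy metric quoted above, mean absolute percentage error (MAPE), is simple enough to state in code. The demand figures below are invented for illustration and are not from the Las Vegas study.

```python
# Mean absolute percentage error: the average of |actual - forecast| / actual,
# expressed as a percentage. A MAPE of 2-3% means forecasts deviate from the
# observed demand by a few percent on average.

def mape(actual, forecast):
    return 100 * sum(abs((a - f) / a)
                     for a, f in zip(actual, forecast)) / len(actual)

actual = [100.0, 110.0, 120.0]    # hypothetical observed demand
forecast = [98.0, 113.0, 117.0]   # hypothetical model forecast
print(mape(actual, forecast))     # roughly 2.4%
```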
Open Access Article
A Generalized Additive Model Combining Principal Component Analysis for PM2.5 Concentration Estimation
ISPRS Int. J. Geo-Inf. 2017, 6(8), 248; doi:10.3390/ijgi6080248
Abstract
As an extension of traditional Land Use Regression (LUR) modelling, the generalized additive model (GAM) was developed in recent years to explore the non-linear relationships between PM2.5 concentrations and the factors impacting them. However, these studies did not consider the loss of information regarding predictor variables. To address this challenge, a generalized additive model combining principal component analysis (PCA–GAM) is proposed in this study to estimate PM2.5 concentrations. The reliability of PCA–GAM for estimating PM2.5 concentrations was tested in the Beijing-Tianjin-Hebei (BTH) region over a one-year period as a case study. The results showed that PCA–GAM outperforms traditional LUR modelling, with a relatively higher adjusted R2 (0.94) and lower RMSE (4.08 µg/m3). The CV-adjusted R2 (0.92) is high and close to the model-adjusted R2, demonstrating the robustness of the PCA–GAM model. PCA–GAM enhances PM2.5 estimation accuracy by improving the use of the effective predictor variables. We therefore conclude that PCA–GAM is a promising method for air pollution mapping and could be useful for decision makers taking measures to combat air pollution.
Open Access Article
GIS-Based Visibility Network and Defensibility Model to Reconstruct Defensive System of the Han Dynasty in Central Xinjiang, China
ISPRS Int. J. Geo-Inf. 2017, 6(8), 247; doi:10.3390/ijgi6080247
Abstract
The Silk Road opened during the Han Dynasty and is significant in global cultural communication. Along this route in central Xinjiang, archaeological sites with defensive characteristics once provided a safeguard for the area. Reconstructing the defensive system is an important way to explore the propagation of the ancient culture and the organizational structure of these sites. In this research, a compound visibility network derived through complex network analysis (CNA), together with least-cost paths based on defensibility models from linear and logistic regression, constitutes the principal defensive structure. As possible transportation corridors, these paths generally fit each other well and differ from conventional slope-based paths. The sites Kuhne Shahr and Agra play important roles in information control according to the CNA measures, while the sites Kuhne Shahr and Kuyux Shahr are considered crucial cities due to their positions and structural shapes. Some other sites, including Uzgen Bulak, Shah Kalandar, Chuck Castle, Caladar, and Qiuci, as well as some beacons, have important effects on defending the transportation corridors. The method proves efficient for studying the historical role of archaeological sites with defensive characteristics.
Open Access Article
Spatial Variation Relationship between Floating Population and Residential Burglary: A Case Study from ZG, China
ISPRS Int. J. Geo-Inf. 2017, 6(8), 246; doi:10.3390/ijgi6080246
Abstract
With the rapid development of China’s economy, the demand for labor in coastal cities continues to grow. Due to restrictions imposed by China’s household registration system, a large floating population has subsequently appeared. The relationship between floating populations and crime, however, is not well understood. This paper investigates the impact of the floating population on residential burglary at a fine spatial scale. Because of its high heterogeneity, the floating population was divided into the floating population from other provinces (FPFOP) and the floating population from the same province as ZG city (FPFSP). Univariate spatial patterns in residential burglary and the floating population in ZG were explored using Moran’s I and LISA (local indicators of spatial association) models. Furthermore, a geographically weighted Poisson regression model, which addresses the spatial effects in the data, was employed to explore the relationship between the floating population and residential burglary. The results revealed that this impact is complex: the floating population from the same province did not have a significant impact on residential burglary in most parts of the city, while the floating population from other provinces had a significantly positive impact in most of the study area, with a magnitude that varied across it.
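Global Moran's I, the first statistic mentioned above, is compact enough to sketch in full. This is a minimal illustration with invented data (four areal units in a chain with binary contiguity weights), not the ZG analysis, which also uses local indicators (LISA).

```python
# Global Moran's I: (n / S0) * (sum_ij w_ij (x_i - mean)(x_j - mean))
#                              / (sum_i (x_i - mean)^2)
# Values near +1 indicate clustering of similar values among neighbours,
# near -1 alternation, near 0 spatial randomness.

def morans_i(values, w):
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    s0 = sum(sum(row) for row in w)  # sum of all weights
    num = sum(w[i][j] * dev[i] * dev[j] for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# Four units in a chain (unit 0 borders 1, 1 borders 2, 2 borders 3),
# binary contiguity weights; hypothetical burglary counts.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i([10, 12, 30, 35], w))  # positive: low pair next to high pair
```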
Open Access Article
High-Resolution Remote Sensing Data Classification over Urban Areas Using Random Forest Ensemble and Fully Connected Conditional Random Field
ISPRS Int. J. Geo-Inf. 2017, 6(8), 245; doi:10.3390/ijgi6080245
Abstract
As an intermediate step between raw remote sensing data and digital maps, remote sensing data classification has been a challenging and long-standing problem in the remote sensing research community. In this work, an automated and effective supervised classification framework is presented for classifying high-resolution remote sensing data. Specifically, the presented method proceeds in three main stages: feature extraction, classification, and refinement of the classified result. In the feature extraction stage, both multispectral images and 3D geometry data are used, exploiting the complementary information of multisource data. In the classification stage, to handle very large numbers of training samples and take full advantage of the information in the large-scale dataset, a random forest (RF) ensemble learning strategy is proposed that combines several RF classifiers. Finally, an improved fully connected conditional random field (FCCRF) graph model is employed to derive contextual information and refine the classification results. Experiments on the ISPRS Semantic Labeling Contest dataset show that the presented three-stage method achieves 86.9% overall accuracy, a new state of the art among non-CNN (convolutional neural network) classification methods.
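The ensemble strategy in the classification stage can be reduced to its core idea: several classifiers, each trained on a different subset of the sample pool, vote on the final label. The sketch below uses stand-in lambdas as "classifiers" and an invented label set; it illustrates only the majority vote, not random forest training.

```python
from collections import Counter

# Majority-vote combination of an ensemble: each member predicts a label
# for the input pixel/segment, and the most common prediction wins.

def ensemble_predict(classifiers, x):
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]

# Stand-ins for RF classifiers trained on different training subsets.
clfs = [lambda x: "building", lambda x: "building", lambda x: "tree"]
print(ensemble_predict(clfs, x=None))  # building
```

In the paper's pipeline this per-sample vote is followed by the FCCRF refinement, which adjusts labels using spatial context.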
Open Access Communication
Nationwide Point Cloud—The Future Topographic Core Data
ISPRS Int. J. Geo-Inf. 2017, 6(8), 243; doi:10.3390/ijgi6080243
Abstract
Topographic databases maintained by national mapping agencies are currently the most common nationwide data sets in geo-information. The application of laser scanning as source data for surveying is increasing. Along with this development, several analysis methods that utilize dense point clouds have been introduced. We present the concept of producing a dense nationwide point cloud, produced from multiple sensors and containing multispectral information, as the national core data for geo-information. Geo-information products, such as digital terrain and elevation models and 3D building models, are produced automatically from these data. We outline the data acquisition, processing, and application of the point cloud. As a national data set, a dense multispectral point cloud could produce significant cost savings via improved automation in mapping and a reduction of overlapping surveying efforts.
Open Access Article
Wicked Water Points: The Quest for an Error Free National Water Point Database
ISPRS Int. J. Geo-Inf. 2017, 6(8), 244; doi:10.3390/ijgi6080244
Abstract
The Water Sector Development Programme (WSDP) of Tanzania aims to improve the performance of the water sector in general and rural water supply (RWS) in particular. During the first phase of the WSDP (2007 to 2014), implementing agencies developed information systems for attaining management efficiencies. One of these systems, the Water Point Mapping System (WPMS), has now been completed, and the database is openly available to the public, as part of the country’s commitment to the Open Government Partnership (OGP) initiative. The Tanzanian WPMS project was the first attempt to map “wall-to-wall” all rural public water points in an African nation. The complexity of the endeavor led to suboptimal results in the quality of the WPMS database, the baseline of the WPMS. The WPMS database was a means for the future monitoring of all rural water points, but its construction has become an end in itself. We trace the challenges of water point mapping in Tanzania and describe how the WPMS database was initially populated and to what effect. The paper conceptualizes errors found in the WPMS database as material, observational, conceptual and discursive, and characterizes them in terms of type, suspected origin and mitigation options. The discussion focuses on the consequences of open data scrutiny for the integrity of the WPMS database and the implications for monitoring wicked water point data.
Open Access Article
Continuous Scale Transformations of Linear Features Using Simulated Annealing-Based Morphing
ISPRS Int. J. Geo-Inf. 2017, 6(8), 242; doi:10.3390/ijgi6080242
Abstract
This paper presents a new method for performing continuous scale transformations of linear features using Simulated Annealing-Based Morphing (SABM). The study addresses two key problems in the continuous generalization of linear features by morphing: the detection of characteristic points and correspondence matching. First, an algorithm for the robust detection of characteristic points is developed based on the Constrained Delaunay Triangulation (CDT) model. Then, an optimization problem is defined and solved to associate characteristic points between a coarser representation and a finer representation. The algorithm decomposes the input shapes into several pairs of corresponding segments and uses simulated annealing to find the optimal matching. Simple straight-line trajectories define the movements between corresponding points. The experimental results show that the SABM method can be used for continuous generalization and generates smooth, natural, and visually pleasing linear features with gradient effects. In contrast to linear interpolation, the SABM method uses simulated annealing to optimize the correspondence between characteristic points. Moreover, it avoids interior distortions within intermediate shapes and preserves the geographical characteristics of the input shapes.
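The straight-line morphing step mentioned above is easy to sketch: once characteristic points are matched, every intermediate shape is a linear blend of the two representations. The polylines below are invented; the hard part of the paper, finding the matching itself via simulated annealing, is not shown.

```python
# Straight-line morphing between matched characteristic points:
# t = 0 reproduces the coarse representation, t = 1 the fine one,
# and intermediate t values give the in-between shapes used for
# continuous scale transformation.

def morph(coarse, fine, t):
    """coarse, fine: matched lists of (x, y) points; t in [0, 1]."""
    return [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
            for (x0, y0), (x1, y1) in zip(coarse, fine)]

coarse = [(0, 0), (4, 0), (4, 4)]  # hypothetical coarse polyline
fine = [(0, 0), (4, 2), (6, 6)]    # hypothetical fine polyline
print(morph(coarse, fine, 0.5))    # [(0.0, 0.0), (4.0, 1.0), (5.0, 5.0)]
```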
Open Access Article
The IMU/UWB Fusion Positioning Algorithm Based on a Particle Filter
ISPRS Int. J. Geo-Inf. 2017, 6(8), 235; doi:10.3390/ijgi6080235
Abstract
This paper integrates UWB (ultra-wideband) and IMU (Inertial Measurement Unit) data to realize pedestrian positioning through a particle filter in a non-line-of-sight (NLOS) environment. After the acceleration and angular velocity are integrated by the ZUPT-based algorithm, the velocity and orientation of the feet are obtained, and the velocity and orientation of the whole body are then estimated by a virtual odometer method. This information is adopted as the prior information for the particle filter, and the UWB observations act as the basis for weight updating. According to the experimental results, the prior information provided by the IMU can restrain the observation error of UWB under NLOS conditions, reducing the positioning error from 1.6 m with the pure UWB-based algorithm to approximately 0.7 m. Moreover, with its high computational efficiency, the algorithm achieves real-time performance on ordinary embedded devices.
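The predict/update cycle described above can be illustrated with a toy one-dimensional version. This is a minimal sketch, not the paper's algorithm: the IMU-derived velocity serves as the motion prior, a single UWB range serves as the observation, and all noise levels and numbers are invented.

```python
import math, random

# Toy 1-D particle filter step for IMU/UWB fusion:
#  predict - propagate each particle using the IMU-derived velocity;
#  update  - reweight particles by the Gaussian likelihood of the UWB range;
#  estimate - weighted mean of the particles.

def pf_step(particles, weights, imu_velocity, dt, uwb_range, sigma):
    particles = [p + imu_velocity * dt + random.gauss(0, 0.05)
                 for p in particles]
    weights = [w * math.exp(-0.5 * ((p - uwb_range) / sigma) ** 2)
               for p, w in zip(particles, weights)]
    total = sum(weights)
    weights = [w / total for w in weights]
    estimate = sum(p * w for p, w in zip(particles, weights))
    return particles, weights, estimate

random.seed(0)
particles = [random.uniform(0, 2) for _ in range(500)]
weights = [1 / 500] * 500
_, _, est = pf_step(particles, weights, imu_velocity=1.0, dt=1.0,
                    uwb_range=2.0, sigma=0.3)
print(est)  # close to 2.0, the UWB observation
```

In the NLOS case the paper targets, the IMU prior keeps particles near plausible positions so that a biased UWB range cannot drag the estimate far off; this sketch shows only the mechanics of the weight update.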
Open Access Article
Disaster Risk Reduction in Agriculture through Geospatial (Big) Data Processing
ISPRS Int. J. Geo-Inf. 2017, 6(8), 238; doi:10.3390/ijgi6080238
Abstract
Intensive farming represents an increased burden on the environment due to, among other reasons, the usage of agrochemicals. Precision farming can reduce this burden by employing site-specific crop management practices that apply advanced geospatial technologies to account for soil heterogeneity. The objective of this paper is to present frontier approaches to geospatial (Big) data processing, based on satellite and sensor data, that aim at the prevention and mitigation phases of disaster risk reduction in agriculture. Three techniques are presented to demonstrate the possibilities of geospatial (Big) data collection in agriculture: (1) farm machinery telemetry, providing data about machinery operations on fields through the developed MapLogAgri application; (2) agrometeorological observation in the form of a wireless sensor network, together with the SensLog solution for storing, analysing, and publishing sensor data; and (3) remote sensing for monitoring field spatial variability and crop status by means of freely available high-resolution satellite imagery. The benefits of re-using these techniques in disaster risk reduction processes are discussed. The conducted tests demonstrate the transferability of agricultural techniques to the crisis/emergency management domain.
Open Access Article
Farm Level Assessment of Irrigation Performance for Dairy Pastures in the Goulburn-Murray District of Australia by Combining Satellite-Based Measures with Weather and Water Delivery Information
ISPRS Int. J. Geo-Inf. 2017, 6(8), 239; doi:10.3390/ijgi6080239
Abstract
The pasture performance of 924 dairy farms in a major irrigation district of Australia was investigated for water use and water productivity during the 2015–2016 summer, the peak irrigation period. Using satellite images from Landsat-8 and Sentinel-2, estimates of the crop coefficient (Kc) were derived on the basis of a strong linear relationship between crop evapotranspiration (ETc) and the vegetation index (NDVI) of pasture in the region. Utilizing estimates of Kc and crop water requirement (CWR), NDVI-dependent estimates of the Irrigation Water Requirement (IWR) were derived from a soil water balance model. In combination with daily weather information and seasonal irrigation water supply records, IWR was the key component in understanding the current irrigation status at farm level and in deriving two irrigation performance indicators: (1) Relative Irrigation Water Use (RIWU) and (2) Total Irrigation Water Productivity (TIWP). A slightly higher proportion of farm irrigators were found to be either matching the irrigation requirement or under-watering (RIWU ≤ 1.0). According to TIWP, a few dairy farms (3%) were in the category of high yield potential with excess water use, and very few (1%) in the category of limited water supply to pastures of high yield potential. A relatively high number of farms were in the category where excess water was supplied to pastures of low-to-medium yield potential (27%), and in the category where the water supply compromised pastures with sub-maximal vegetation status (15%). The results of this study will assist in objectively identifying where significant improvements in efficient irrigation water use can be achieved.
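The RIWU indicator can be sketched under an assumption about its definition: consistent with RIWU ≤ 1.0 denoting farms that match the requirement or under-water, the sketch takes RIWU as seasonal water delivered divided by the NDVI-derived irrigation water requirement. Both the formula's exact form and the volumes below are assumptions for illustration, not taken from the paper.

```python
# Assumed form of Relative Irrigation Water Use: delivered / required.
# RIWU <= 1.0 -> matching the requirement or under-watering;
# RIWU >  1.0 -> excess water supplied.

def riwu(delivered_ml, required_ml):
    """Volumes in megalitres for the season (hypothetical units)."""
    return delivered_ml / required_ml

print(riwu(delivered_ml=180.0, required_ml=200.0))  # 0.9 -> under-watering
```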
Open Access Article
A Hierarchical Approach for Measuring the Consistency of Water Areas between Multiple Representations of Tile Maps with Different Scales
ISPRS Int. J. Geo-Inf. 2017, 6(8), 240; doi:10.3390/ijgi6080240
Abstract
In geographic information systems, the reliability of querying, analysing, or reasoning results depends on data quality. One central criterion of data quality is consistency, and identifying inconsistencies is crucial for maintaining the integrity of spatial data from multiple sources or at multiple resolutions. Traditional methods of consistency assessment use vector data as the primary experimental data. In this manuscript, we describe the use of a newer type of raster data, tile maps, to assess the consistency of information from multiscale representations of the water bodies that make up drainage systems. We describe a hierarchical methodology for determining the spatial consistency of tile-map datasets that display water areas in a raster format. Three characteristic indices, the degree of global feature consistency, the degree of local feature consistency, and the degree of overlap, are proposed to measure the consistency of multiscale representations of water areas. The perceptual hash algorithm and the scale-invariant feature transform (SIFT) descriptor are applied to extract and measure the global and local features of water areas. By combining calculations of these three indices, the degree of consistency of multiscale representations of water areas can be divided into five grades: exactly consistent, highly consistent, moderately consistent, less consistent, and inconsistent. For evaluation purposes, the proposed method is applied to several test areas from the Tiandi map of China. In addition, we identify key technologies related to extracting water areas from a tile map. The accuracy of the consistency assessment method is evaluated, and our experimental results confirm that the proposed methodology is efficient and accurate.
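The perceptual-hash idea used for the global feature comparison can be illustrated with the simplest variant, an average hash. This is a generic sketch on a tiny invented grid, not the specific hash the paper uses: each cell is thresholded against the grid mean, and two representations are compared by the Hamming distance of the resulting bit strings.

```python
# Average hash: threshold each cell against the image mean, then compare
# two hashes by counting differing bits (Hamming distance). Small distances
# suggest the two raster water masks share the same coarse structure.

def average_hash(grid):
    cells = [v for row in grid for v in row]
    mean = sum(cells) / len(cells)
    return [1 if v > mean else 0 for v in cells]

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

# Two hypothetical 3x3 "water presence" grids at different scales.
a = [[9, 8, 1], [7, 2, 1], [6, 1, 0]]
b = [[9, 9, 1], [8, 2, 0], [7, 1, 0]]
print(hamming(average_hash(a), average_hash(b)))  # 0: same coarse structure
```

In the paper's hierarchy, a global comparison like this is complemented by SIFT-based local feature matching and an overlap measure before the five consistency grades are assigned.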
Open Access Article
A Triangular Prism Spatial Interpolation Method for Mapping Geological Property Fields
ISPRS Int. J. Geo-Inf. 2017, 6(8), 241; doi:10.3390/ijgi6080241 -
Abstract
The spatial interpolation of property fields in 3D, such as the temperature, salinity, and organic content of ocean water, is an active area of research in the applied geosciences. Conventional interpolation methods have not adequately addressed anisotropy in these data. Thus, in our research we considered two interpolation methods based on a triangular prism volume element, as a triangular prism structure best represents directivity, to express the anisotropy inherent in geological property fields. A linear triangular prism interpolation is proposed for layered strata that achieves complete continuity based on the volume coordinates of the triangular prism. A triangular prism quadric interpolation (a unit function of a triangular prism spline with 15 nodes) is designed for a smooth transition between adjacent triangular prisms with approximate continuity, expressing the continuity of the entire model. We designed a specific model which accounts for the different spatial correlations in three dimensions. We evaluated the accuracy of our proposed linear and triangular prism quadric interpolation methods against traditional inverse distance weighting (IDW) and kriging interpolation approaches in comparative experiments. The results show that, in 3D geological modeling, the linear and quadric triangular prism interpolations more accurately represent the changes in the property values of the layered strata than the IDW and kriging interpolation methods. Furthermore, the triangular prism quadric interpolation algorithm with 15 nodes outperforms the other methods. This study of triangular prism interpolation algorithms has implications for the expression of data fields with 3D properties.
Moreover, our novel approach will contribute to spatial attribute prediction and representation and is applicable to all 3D geographic information; for example, in studies of atmospheric circulation, ocean circulation, water temperature, salinity, and three-dimensional pollutant diffusion.
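The linear variant described above can be sketched compactly: barycentric (volume) coordinates locate a point in the triangular cross-section, and the bottom- and top-face values are blended linearly along the prism axis. The geometry and node values below are hypothetical, and the 15-node quadric spline of the paper is not reproduced here.

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of 2D point p in triangle (a, b, c)."""
    (x, y), (x1, y1), (x2, y2), (x3, y3) = p, a, b, c
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    l1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
    l2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    return l1, l2, 1.0 - l1 - l2

def prism_linear_interp(p, t, tri, bottom_vals, top_vals):
    """Linear interpolation inside a triangular prism.
    p: 2D point in the triangular cross-section tri;
    t: normalized height in [0, 1] from the bottom face to the top face;
    bottom_vals / top_vals: property values at the three prism nodes."""
    l1, l2, l3 = barycentric(p, *tri)
    bottom = l1 * bottom_vals[0] + l2 * bottom_vals[1] + l3 * bottom_vals[2]
    top = l1 * top_vals[0] + l2 * top_vals[1] + l3 * top_vals[2]
    return (1.0 - t) * bottom + t * top

tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
# Hypothetical temperatures at the bottom and top triangle nodes.
value = prism_linear_interp((0.25, 0.25), 0.5, tri,
                            bottom_vals=(20, 22, 24), top_vals=(10, 12, 14))
print(value)  # 16.5, midway between the two face interpolants
```

Because the blend is linear in both the barycentric and the vertical coordinates, values match exactly on shared faces of adjacent prisms, which is the continuity property the abstract claims for the linear scheme.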
Open Access Article
Centrality as a Method for the Evaluation of Semantic Resources for Disaster Risk Reduction
ISPRS Int. J. Geo-Inf. 2017, 6(8), 237; doi:10.3390/ijgi6080237 -
Abstract
Clear and straightforward communication is a key aspect of all human activities related to crisis management. Since crisis management activities involve professionals from various disciplines using different terminology, clear and straightforward communication is difficult to achieve. Semantics as a broad science can help to overcome communication difficulties. This research focuses on the evaluation of available semantic resources, including ontologies, thesauri, and controlled vocabularies, for disaster risk reduction as part of crisis management. The main idea of the study is that the most appropriate source of broadly understandable terminology is a semantic resource that is accepted by, or at least connected to, the majority of other resources. What matters is not only the number of interconnected resources but also the specific position of the resource in the complex network of Linked Data resources. Although such evaluation is usually based on user experience, objective measures of semantic centrality, drawn from the centrality methods used mainly in graph theory, can be applied. This article describes the calculation of four types of centrality measures (Outdegree, Indegree, Closeness, and Betweenness) applied to 160 geographic concepts published as Linked Data and related to disaster risk reduction. Centralities were calculated for graph structures containing particular semantic resources as nodes and identity links as edges. The results show that (with some discussed exceptions) the datasets with high values of centrality serve as important information resources and also include more of the 160 preselected geographic concepts. Therefore, they could be considered the most suitable resources of terminology to make communication in the domain easier. The main research goal is to automate the evaluation of semantic resources and to apply a well-known theoretical method (centrality) to the semantic issues of Linked Data.
It is necessary to mention the limits of this study: the number of tested concepts, and the fact that centrality represents just one view of the evaluation of semantic resources.
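On a graph of semantic resources linked by identity links, the simpler centralities named above reduce to short computations. The sketch below uses a toy directed graph with hypothetical resource names and edges (not the article's actual Linked Data network) to show Indegree, Outdegree, and a reachability-based Closeness.

```python
from collections import deque

# Hypothetical identity links between semantic resources (directed edges).
edges = [("GeoNames", "DBpedia"), ("Wikidata", "DBpedia"),
         ("DBpedia", "Wikidata"), ("EuroVoc", "Wikidata")]
nodes = sorted({n for e in edges for n in e})

outdeg = {n: sum(1 for s, _ in edges if s == n) for n in nodes}
indeg = {n: sum(1 for _, t in edges if t == n) for n in nodes}

def closeness(node):
    """Closeness over the directed graph, restricted to reachable nodes:
    (number reachable) / (sum of shortest-path distances), via BFS."""
    dist, queue = {node: 0}, deque([node])
    while queue:
        cur = queue.popleft()
        for s, t in edges:
            if s == cur and t not in dist:
                dist[t] = dist[cur] + 1
                queue.append(t)
    reachable = len(dist) - 1
    return reachable / sum(dist.values()) if reachable else 0.0

print(indeg["DBpedia"], outdeg["DBpedia"], closeness("GeoNames"))
```

A resource that many others link to (high Indegree) is, in the article's terms, a widely accepted terminology hub; Betweenness would additionally require counting shortest paths passing through each node.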
Open Access Article
Spatial Characteristics of Twitter Users—Toward the Understanding of Geosocial Media Production
ISPRS Int. J. Geo-Inf. 2017, 6(8), 236; doi:10.3390/ijgi6080236 -
Abstract
Social media is a rich source of spatial data, but it also has many flaws and well-known limitations, especially in regard to representation and representativeness, since very little is known about the demographics of the user population. At the same time, the use of locational services is, in fact, dependent on those characteristics. We address this gap in knowledge by exploring divides between Twitter users, based on the spatial and temporal distribution of the content they produce. We chose five cities and data from 2015 to represent different socio-spatial contexts. Users were classified according to spatial and non-spatial measures: home range estimation, standard distance, the nearest neighbor index, and a proposed localness index. There are distinct groups of geosocial media producers, which suggests that such datasets cannot be treated as uniform representations. We found a positive correlation between spatial behavior and posting activity. It is suggested that there are universal patterns of behavior that are conditioned by software services, an example of Foucauldian "technologies of self". They can also represent the dominance of the most prolific users over the whole data stream. Results are discussed in the context of the importance and role of user location in social media.
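Of the spatial measures listed, standard distance is the most self-contained: the root-mean-square distance of a user's posts from their mean centre. A minimal sketch, assuming projected (planar) coordinates in metres; the point set is hypothetical, not the study's data.

```python
from math import sqrt

def standard_distance(points):
    """Standard distance of a point set: RMS distance from the mean centre,
    a compactness measure for one user's geotagged posts (projected coords)."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    return sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / n)

# Hypothetical projected coordinates (metres) of one user's tweets.
posts = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
print(standard_distance(posts))  # ~70.71 m: all posts lie 50*sqrt(2) m from the centre
```

Comparing this value across users is one way such a study can separate locally concentrated posters from widely roaming ones; the nearest neighbor and localness indices add clustering and home-area information that a single dispersion number cannot capture.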