ISPRS Int. J. Geo-Inf. 2016, 5(5), 56; doi:10.3390/ijgi5050056 - published 28 April 2016
Abstract: Since its inception, Twitter has played a major role in real-world events, especially in the aftermath of disasters and catastrophic incidents, and has increasingly become the first point of contact for users wishing to provide or seek information about such situations. The use of Twitter in emergency response and disaster management opens up avenues of research concerning different aspects of Twitter data quality, usefulness and credibility. One challenge that has attracted substantial attention in the Twitter research community is the location inference of Twitter data. Given that less than 2% of tweets are geotagged, finding location inference methods that go beyond the geotagging capability is undoubtedly a priority research area. This is especially true for emergency response, where the spatial aspects of information play an important role. This paper introduces a multi-elemental location inference method that sets geotagging aside and predicts the location of tweets by exploiting the other inherently attached data elements. Textual content, users' profile location and place labelling, as the main location-related elements, are taken into account. Location-name classes at three granularity levels are defined and used to look up location references in these location-associated elements. The inferred location at the finest granularity level is assigned to a tweet according to a novel location assignment rule. The location assigned by this inference process is considered the inferred location of a tweet and is compared with the geotagged coordinates, which serve as the ground truth of the study.
The results show that the method successfully infers the location of 87% of the tweets, with a mean distance error of 12.2 km and a median distance error of 4.5 km, a significant improvement over current methods, which predict location with much larger distance errors or at city-level resolution at best.
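The "finest granularity wins" assignment rule described in the abstract can be sketched as follows. This is a hypothetical illustration only: the class names, the granularity ordering, and the candidate structure are assumptions, not the paper's actual implementation.

```python
# Assumed granularity levels, finest first; the abstract mentions three
# granularity levels of location-name classes but does not name them here.
GRANULARITY_ORDER = ["locality", "admin_area", "country"]

def assign_location(candidates):
    """Pick the candidate location with the finest granularity.

    `candidates` maps a location-associated element ("text", "profile",
    "place") to a (granularity, location_name) tuple, or None if no
    location reference was found in that element.
    """
    best = None
    for source, hit in candidates.items():
        if hit is None:
            continue  # this element contained no location reference
        granularity, name = hit
        rank = GRANULARITY_ORDER.index(granularity)
        if best is None or rank < best[0]:
            best = (rank, name, source)  # finer granularity wins
    if best is None:
        return None  # no location could be inferred for this tweet
    return {"location": best[1],
            "granularity": GRANULARITY_ORDER[best[0]],
            "source": best[2]}

# Example: the place label yields a city while the profile yields only a
# country, so the city-level candidate is assigned.
result = assign_location({
    "text": None,
    "profile": ("country", "Australia"),
    "place": ("locality", "Brisbane"),
})
```

The returned record keeps the winning element's name, which would let an evaluation step compare inferred locations against geotagged coordinates per source.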
ISPRS Int. J. Geo-Inf. 2016, 5(5), 57; doi:10.3390/ijgi5050057 - published 28 April 2016
Abstract: The Gorganrood watershed (GW) is experiencing considerable environmental change in the form of natural hazards and erosion, as well as deforestation, cultivation and development activities. As a result, different types of Land Cover/Land Use (LCLU) change are taking place intensively in the area. This study investigates the LCLU conditions upstream of the watershed for the years 1972, 1986, 2000 and 2014, using Landsat MSS, TM, ETM+ and OLI/TIRS images. LCLU maps for 1972, 1986 and 2000 were produced using pixel-based classification methods. For the 2014 LCLU map, Geographic Object-Based Image Analysis (GEOBIA) was used in combination with the data-mining capabilities of the Gini and J48 machine-learning algorithms. The accuracy of the maps was assessed using the overall accuracy, quantity disagreement and allocation disagreement indexes. The overall accuracy ranged from 89% to 95%, quantity disagreement from 2.1% to 6.6%, and allocation disagreement from 2.1% for 2014 to 2.7% for 2000. The results indicate that a significant amount of change has occurred in the region and that this has, in consequence, affected ecosystem services and human activity. This knowledge of the LCLU status in the area will help managers and decision makers develop plans and programs aimed at effectively managing the watershed into the future.
ISPRS Int. J. Geo-Inf. 2016, 5(5), 55; doi:10.3390/ijgi5050055 - published 27 April 2016
Abstract: Citizens are increasingly becoming an important source of geographic information, sometimes entering domains that had until recently been the exclusive realm of authoritative agencies. This activity has a very diverse character as it can, amongst other things, be active or passive, involve spatial or aspatial data and the data provided can be variable in terms of key attributes such as format, description and quality. Unsurprisingly, therefore, there are a variety of terms used to describe data arising from citizens. In this article, the expressions used to describe citizen sensing of geographic information are reviewed and their use over time explored, prior to categorizing them and highlighting key issues in the current state of the subject. The latter involved a review of ~100 Internet sites with particular focus on their thematic topic, the nature of the data and issues such as incentives for contributors. This review suggests that most sites involve active rather than passive contribution, with citizens typically motivated by the desire to aid a worthy cause, often receiving little training. As such, this article provides a snapshot of the role of citizens in crowdsourcing geographic information and a guide to the current status of this rapidly emerging and evolving subject.
ISPRS Int. J. Geo-Inf. 2016, 5(5), 54; doi:10.3390/ijgi5050054 - published 25 April 2016
Abstract: Big geospatial data are archived and made available through online web discovery and access. However, finding the right data for scientific research and application development is still a challenge. This paper aims to improve data discovery by mining user knowledge from log files. Specifically, it focuses on user web session reconstruction as a critical step in extracting usage patterns. Reconstructing user sessions from raw web logs has always been difficult, as a session identifier tends to be missing in most data portals. To address this problem, we propose two session identification methods: a time-clustering-based method and a time-referrer-based method. We also present the workflow of session reconstruction and discuss how to select appropriate thresholds for the relevant steps in the workflow. The proposed session identification methods and workflow are shown to extract data access patterns suitable for further analyses of user behavior and for improving data discovery through more relevant data ranking, suggestion, and navigation.
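The general idea behind time-based session identification can be sketched in a few lines: requests from the same user separated by more than a timeout threshold are assumed to start a new session. The 30-minute threshold and the flat timestamp input are common heuristics assumed here for illustration; the paper's time-clustering and time-referrer methods are more elaborate than this minimal sketch.

```python
TIMEOUT = 30 * 60  # assumed 30-minute inactivity gap, in seconds

def split_sessions(timestamps, timeout=TIMEOUT):
    """Group one user's sorted request timestamps (seconds) into sessions.

    A gap larger than `timeout` between consecutive requests closes the
    current session and opens a new one.
    """
    sessions = []
    current = []
    for t in timestamps:
        if current and t - current[-1] > timeout:
            sessions.append(current)  # inactivity gap: close this session
            current = []
        current.append(t)
    if current:
        sessions.append(current)
    return sessions

# Three requests within five minutes, then one two hours later:
# the long gap splits the log into two sessions.
sessions = split_sessions([0, 120, 300, 300 + 2 * 3600])
```

In practice the logs would first be grouped by IP address or user agent before applying a rule like this, since raw portal logs interleave many users.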
ISPRS Int. J. Geo-Inf. 2016, 5(4), 53; doi:10.3390/ijgi5040053 - published 22 April 2016
Abstract: The physical storage model is one of the key technologies for the vehicle navigation maps used in a navigation system. However, the performance of most traditional storage models is limited in dynamic navigation because of the static storage format they use. In this paper, we propose a new physical storage model, the China Navigation Data Format (CNDF), which facilitates accessing and updating navigation data. The CNDF model uses the reach-based hierarchy method to build a hierarchical road network, which enhances the efficiency of data compression. It also adopts the Linear Link Coding method, in which the start position is combined with the end position to form the identification code for multi-level links, so that each link can trace its upper-level links consistently without recording an array of identifications. To test the performance of a navigation system on an embedded navigation device, the 1:10,000 navigation map of East China (including Beijing, Tianjin, Shandong, Hebei, and Jiangsu) generated with the CNDF model was combined with real-time traffic information for Beijing. The results showed that refreshing the navigation map took less than 1 second each time, and the accuracy of the hierarchical shortest-path algorithm was 99.9%. Our work implies that the CNDF model is efficient in vehicle navigation applications.
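The core idea of deriving a link identifier from its endpoints, rather than storing a separate list of parent-link IDs, can be illustrated with a toy encoding. Everything here is an assumption for illustration: the abstract does not specify bit widths or the exact composition rule used by CNDF's Linear Link Coding.

```python
def link_id(start, end):
    """Combine start and end node codes into a single link identifier.

    Assumed layout: the start code occupies the high 32 bits and the end
    code the low 32 bits, so the ID is computable from geometry alone and
    identical across hierarchy levels that share the same endpoints.
    """
    return (start << 32) | end

def decode(lid):
    """Recover the (start, end) node codes from a link identifier."""
    return lid >> 32, lid & 0xFFFFFFFF

# A link between node 1001 and node 2002 gets a deterministic ID; any
# level of the hierarchy can recompute it without a stored cross-reference.
lid = link_id(1001, 2002)
```

Because the ID is a pure function of the endpoints, a fine-level link can locate its coarser-level counterpart by recomputing the code, which is the kind of cross-level tracing the abstract attributes to Linear Link Coding.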
ISPRS Int. J. Geo-Inf. 2016, 5(4), 51; doi:10.3390/ijgi5040051 - published 14 April 2016
Abstract: The widely used pull-based method for high-frequency sensor data acquisition from Sensor Observation Services (SOS) is not efficient in real-time applications; therefore, further attention must be paid to real-time mechanisms in the provision process if sensor webs are to achieve their full potential. To address this problem, we create a data provision problem model and compare the recursive Kalman Filter (KF) algorithm and our two proposed self-adaptive linear algorithms, Harvestor Additive Increase and Multiplicative Decrease (H-AIMD) and Harvestor Multiplicative Increase and Additive Decrease (H-MIAD), with the commonly used Static Policy, which requests data at a fixed time interval. We also develop a comprehensive performance evaluation method that considers real-time capacity and resource waste to compare the performance of the four data provision algorithms. Experiments with real sensor data show that the Static Policy needs accurate a priori parameters, that the Kalman Filter is most suitable for the data provision of sensors with long-term stable time intervals, and that H-AIMD is the steadiest, with better efficiency and fewer delayed data, albeit with higher resource waste than the others, for data streams with large fluctuations in time intervals. The proposed model and algorithms are useful as a basic reference for real-time applications based on pull-based stream data acquisition.
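The general shape of additive-increase/multiplicative-decrease adaptation of a pull interval, the idea the H-AIMD name points to, can be sketched as below. The step size, decrease factor, and bounds are placeholder assumptions, not the authors' settings, and the real algorithm operates within the paper's data provision model rather than this bare update rule.

```python
def next_interval(interval, got_new_data,
                  add_step=1.0, mult_factor=0.5,
                  min_interval=1.0, max_interval=600.0):
    """Adapt the polling interval (seconds) after each pull request.

    If the request returned no new observation, the poll came too soon:
    grow the interval additively to reduce wasted requests. If new data
    arrived, shrink it multiplicatively so the next poll comes sooner
    and delivery delay stays low.
    """
    if got_new_data:
        interval *= mult_factor   # multiplicative decrease
    else:
        interval += add_step      # additive increase
    # Clamp to sane bounds so the interval never collapses or diverges.
    return max(min_interval, min(max_interval, interval))

# An empty poll lengthens the interval; a successful one shortens it.
i_after_miss = next_interval(10.0, got_new_data=False)
i_after_hit = next_interval(10.0, got_new_data=True)
```

Swapping which branch is additive and which is multiplicative gives the MIAD variant; the trade-off the abstract reports (steadier behavior and fewer delayed data versus higher resource waste) follows from how aggressively each rule shortens the interval after a hit.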