Special Issue "Smart Cartography for Big Data Solutions"

A special issue of ISPRS International Journal of Geo-Information (ISSN 2220-9964).

Deadline for manuscript submissions: closed (10 June 2019).

Special Issue Editors

Prof. Dr. Temenoujka Bandrova
Guest Editor
Head of Laboratory on Cartography, University of Architecture, Civil Engineering and Geodesy, 1 Chr. Smirnenski Blvd., 1046 Sofia, Bulgaria
Interests: 3D cartography; school atlases; mapping for EWCM; web-cartography
Prof. Dr. Milan Konecny
Guest Editor
Masaryk University, Brno, Czech Republic and Shenzhen University, Shenzhen, P.R. China
Interests: smart cartography; big data; early warning and crisis management; disaster risk reduction; smart cities
Prof. Dr. Hui Lin

Guest Editor
The Chinese University of Hong Kong
Interests: virtual geographic environments

Special Issue Information

Dear Colleagues,

The increasing amount of data encourages the creation of new methodologies for data processing, as well as the development of digital technologies. Both are progressively opening new possibilities in the evolution of cartographic analysis, synthesis, and visualization, and in their applications. This trend also creates new challenges for cartography in the gathering, storage, analysis, and visualization of spatial information and data. Cartography is one of the few visualization disciplines that has always used and correctly analyzed huge amounts of data, representing them at different levels of precision according to the needs of potential users. The bridge between Big Data (BD) and society cannot be built with existing technologies and computers alone; professionals must take a more active role in the process of transforming BD into actionable information for users.

Making data understandable for their users is a key aspect of cartographic science. That is why BD is actually an excellent opportunity for cartographers to represent even more information in a convenient manner for different user groups: from young children to professionals and older people. The EU and several large companies also highlight the impact of BD on society. Among the most important global and EU geodata and geoinformation projects under implementation are INSPIRE and UN-GGIM, which will soon need an enhanced visualization phase to improve all kinds of services for inhabitants. However, the tasks of visualization are highly complex. A multidisciplinary research vision, drawing on GI specialists from different fields, will be needed to elaborate and increase the value of cartographic methods in general, and of visualization in particular, for society. The realization of ambitious BD projects also requires a different manner of thinking than purely technical and GI approaches. A successful solution would be to establish teams of subject-matter experts from areas that have not worked together until now: professionals from disciplines such as geography (physical, human, and economic), cartography, and geoinformatics, jointly designing complex solutions to different challenges. BD provides them with an opportunity to combine their knowledge and to develop complementary approaches that facilitate everyday decision making, problem solving, and improvements to life. These steps fulfill the old dream of cartographers: that their methods and approaches be considered generally valid (like those of mathematics, statistics, or philosophy) and be used by all scientists working with spatial information.

BD creates challenges that will change humanity and the social sciences research landscape. In a BD environment, traditional research is transformed into computationally based research methods. One of the key issues is how to provide users with easily understandable information and knowledge based on BD. Many companies already have several petabytes of data stored, and this amount is growing by tens of percent. Cartography already deals with both classified and unclassified data, and, because of this, many issues, such as multidimensionality, generalization, scale, and format transformation, await new smart solutions. Now we have to admit it: Big Data exists, and something more than the traditional approach is expected. A new approach to problems will be created. Data must be further explained, new messages for current and past problems must be taken into account, and user involvement must be brought to another level. New methodologies have to be adapted to users by involving them in the methodology development process; examples of this are already in place. We face an urgent need for virtual GIS and ubiquitous mapping, including context-aware and adaptive mapping, and a remarkable inclusion of VGI. All of these issues are now the object of a modern smart cartography approach.

There is still an ongoing discussion about the meaning of the word “smart”. We certainly need to develop and integrate new methodologies and technologies, but “smart” probably begins to apply from the moment we create new and higher-quality solutions to problems. This is also one of the intentions of this Special Issue.

Prof. Dr. Temenoujka Bandrova
Prof. Dr. Milan Konecny
Prof. Dr. Hui Lin
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as they are accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. ISPRS International Journal of Geo-Information is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Big data challenges for geo-scientists
  • Data-driven, conceptual, and theoretical science
  • Geoinformation for smart cities
  • Multi-dimensional mapping and visualization
  • Web, virtual, and augmented cartographic modeling
  • Geospatial analysis and data mining
  • Remote sensing technologies
  • Volunteered geographic information
  • Virtual geographic environments
  • Cartographic communication for the big data era

Published Papers (7 papers)


Research


Open Access Article
User Evaluation of Map-Based Visual Analytic Tools
ISPRS Int. J. Geo-Inf. 2019, 8(8), 363; https://doi.org/10.3390/ijgi8080363 - 20 Aug 2019
Cited by 1
Abstract
Big data have become a big challenge for cartographers, as the majority of big data may be localized. The use of visual analytics tools, including interactive maps, stimulates interdisciplinary actors to explore new ideas and decision-making methods. This paper deals with the evaluation of three map-based visual analytic tools by means of the eye-tracking method. The conceptual part of the paper begins with an analysis of the state of the art and ends with the design of proof-of-concept experiments. The verification part consists of the design, composition, and realization of the eye-tracking experiment, in which the three map-based visual analytic tools were tested in terms of user-friendliness. A set of recommendations on GUI (graphical user interface) design and interactive functionality for map makers is formulated on the basis of the errors and shortcomings discovered in the assessed stimuli. The results of the verification were used as inputs for improving the three tested tools and might serve as best practice for map-based visual analytic tools in general, as well as for improving the policy-making cycle as elaborated by the European project PoliVisu (Policy Development based on Advanced Geospatial Data Analytics and Visualization).
(This article belongs to the Special Issue Smart Cartography for Big Data Solutions)

Open Access Article
Regionalization Analysis and Mapping for the Source and Sink of Tourist Flows
ISPRS Int. J. Geo-Inf. 2019, 8(7), 314; https://doi.org/10.3390/ijgi8070314 - 23 Jul 2019
Cited by 1 | Correction
Abstract
At present, population mobility for the purpose of tourism has become a popular phenomenon. As it becomes easier to capture big data on tourists' digital footprints, it is possible to analyze the respective regional features and driving forces for both tourism source and destination regions at a macro level. Based on data on tourist flows to Nanjing during five short national holidays in China, this study first calculated the travel rate of the tourist source regions (315 cities) and the geographical concentration index of the visited attractions (51 scenic spots). Then, a spatial autocorrelation index was used to analyze the global autocorrelation of the travel rates of the source regions and of the geographical concentration index of the destinations across the five holidays. Finally, a heuristic unsupervised machine-learning method was used to analyze and map tourist sources and visited attractions, adopting the travel rate and the geographical concentration index as the regionalized variables. The results indicate that both source and sink regions expressed distinctive regional differentiation patterns in the corresponding variables. The method provides a practical tool for the regionalization of big data on tourist flows, and it can also be applied to other origin-destination (OD) studies.
(This article belongs to the Special Issue Smart Cartography for Big Data Solutions)
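As a toy illustration of the global spatial autocorrelation measure that the abstract applies to travel rates, the following sketch computes Moran's I for a hypothetical chain of four regions with binary rook-adjacency weights. The values, weight matrix, and the variable name `travel_rate` are invented for illustration; they are not the study's data or pipeline.

```python
def morans_i(values, weights):
    """Global Moran's I for per-region values and a dense weight matrix."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]          # deviations from the mean
    s0 = sum(sum(row) for row in weights)   # total weight
    num = sum(weights[i][j] * z[i] * z[j]
              for i in range(n) for j in range(n))
    den = sum(zi * zi for zi in z)
    return (n / s0) * (num / den)

# Four regions in a line; adjacent regions are neighbors (binary weights).
W = [
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
]
travel_rate = [1.0, 1.0, 0.0, 0.0]  # spatially clustered pattern
```

For this clustered pattern the index is positive (here exactly 1/3); an alternating pattern such as `[1, 0, 1, 0]` yields a negative value, signalling dispersion.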

Open Access Article
Research on the Construction Method of the Service-Oriented Web-SWMM System
ISPRS Int. J. Geo-Inf. 2019, 8(6), 268; https://doi.org/10.3390/ijgi8060268 - 07 Jun 2019
Cited by 2
Abstract
On a global scale, with the acceleration of urbanization and the continuous expansion of cities, the problem of urban flooding has become increasingly prominent. An increasing number of experts and scholars have begun to focus on this phenomenon and build corresponding models to solve the problem. The storm water management model 5 (SWMM5) is a dynamic rainfall-runoff simulation model developed by the US Environmental Protection Agency (EPA); this model simulates urban flooding and drainage well and is widely favored by researchers. However, the use of SWMM5 is relatively cumbersome and limited by the operational platform, and these factors hinder the further promotion and sharing of SWMM5. Based on the OpenGMS platform, this study first encapsulates, deploys, and publishes SWMM5 and further builds the Web-SWMM system for the model. With Web-SWMM, the user can conveniently use network data resources online and call SWMM5 to carry out calculations, avoiding the difficulties caused by the localized use of SWMM5 and enabling the sharing and reuse of SWMM5.
(This article belongs to the Special Issue Smart Cartography for Big Data Solutions)

Open Access Article
A Hybrid of Differential Evolution and Genetic Algorithm for the Multiple Geographical Feature Label Placement Problem
ISPRS Int. J. Geo-Inf. 2019, 8(5), 237; https://doi.org/10.3390/ijgi8050237 - 21 May 2019
Abstract
Label placement is a difficult problem in automated map production. Many methods have been proposed to automatically place labels for various types of maps. While these methods automatically and effectively generate labels for point, line, and area features, less attention has been paid to the problem of jointly labeling all the different types of geographical features. In this paper, we refer to the labeling of all the graphic features as the multiple geographical feature label placement (MGFLP) problem. In the MGFLP problem, overlapping and occlusion among labels and the corresponding features produce poorly arranged labels and result in a low-quality map. To solve the problem, a hybrid algorithm combining discrete differential evolution and the genetic algorithm (DDEGA) is proposed to search for an optimized placement. The quality of the proposed solution was evaluated using a weighted metric based on a number of cartographic rules. Experiments were carried out to validate the performance of the proposed method in a set of cartographic tasks. The resulting label placement demonstrates the feasibility and effectiveness of our method.
(This article belongs to the Special Issue Smart Cartography for Big Data Solutions)
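The paper's DDEGA hybrid is beyond a short sketch, but the underlying idea, evolving an assignment of one candidate position per label so that overlaps are minimized, can be illustrated with a plain genetic algorithm on a tiny hypothetical instance. All coordinates, offsets, and parameters below are invented for illustration and are not the paper's method or data.

```python
import itertools
import random

# Hypothetical instance: three point features, each with four candidate
# label positions (offsets of the label's lower-left corner from the point).
POINTS = [(0, 0), (10, 0), (5, 8)]
LABEL_W, LABEL_H = 8, 4
OFFSETS = [(2, 2), (-10, 2), (2, -6), (-10, -6)]

def label_box(point, choice):
    px, py = point
    dx, dy = OFFSETS[choice]
    return (px + dx, py + dy, px + dx + LABEL_W, py + dy + LABEL_H)

def overlaps(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def conflicts(assignment):
    """Number of overlapping label pairs; the quantity to minimize."""
    boxes = [label_box(p, c) for p, c in zip(POINTS, assignment)]
    return sum(overlaps(a, b) for a, b in itertools.combinations(boxes, 2))

def genetic_labeling(pop_size=30, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(4) for _ in POINTS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=conflicts)               # truncation selection
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(POINTS))
            child = a[:cut] + b[cut:]         # one-point crossover
            if rng.random() < 0.3:            # mutation
                child[rng.randrange(len(POINTS))] = rng.randrange(4)
            children.append(child)
        pop = survivors + children
    return min(pop, key=conflicts)
```

On this small instance a conflict-free assignment exists, and the search reliably finds one; the real MGFLP problem adds line and area features and a weighted cartographic quality metric in place of the simple overlap count.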

Open Access Article
GIS Mapping of Driving Behavior Based on Naturalistic Driving Data
ISPRS Int. J. Geo-Inf. 2019, 8(5), 226; https://doi.org/10.3390/ijgi8050226 - 09 May 2019
Cited by 8
Abstract
Naturalistic driving can generate huge datasets with great potential for research. However, analyzing the data collected in naturalistic driving trials is quite complex and difficult, especially considering that these studies are commonly conducted by research groups with somewhat limited resources. It is quite common for these studies to implement strategies for thinning and/or reducing the data volumes initially collected. Thus, and unfortunately, the great potential of these datasets is significantly constrained to specific situations, events, and contexts. For this reason, implementing appropriate strategies for the visualization of these data, at any scale, is becoming increasingly necessary. Mapping naturalistic driving data with Geographic Information Systems (GIS) allows for a deeper understanding of driving behavior, achieving a smarter and broader perspective on the whole dataset. GIS mapping allows many of the drawbacks of traditional methodologies for the analysis of naturalistic driving data to be overcome. In this article, we analyze the main assets of GIS mapping of such data, which are dominated by the powerful graphic interface and the great operational capacity of GIS software.
(This article belongs to the Special Issue Smart Cartography for Big Data Solutions)

Open Access Article
Cartographic Line Generalization Based on Radius of Curvature Analysis
ISPRS Int. J. Geo-Inf. 2018, 7(12), 477; https://doi.org/10.3390/ijgi7120477 - 12 Dec 2018
Abstract
Cartographic generalization is one of the important processes of transforming the content of both analogue and digital maps. The process of reducing details on the map has to be conducted in a planned way whenever the map scale is to be reduced. As far as digital maps are concerned, numerous algorithms are used for the generalization of vector line elements. They are used when the scale of the map (on screen or printed) is changed, or in the process of smoothing vector lines (e.g., contours). The most popular method of reducing the number of vertices of a vector line is the Douglas-Peucker algorithm. An important feature of most algorithms is that they do not take into account the cartographic properties of the transformed map element. Having analysed the existing methods of generalization, the authors developed a proprietary algorithm that is based on the analysis of the curvature of the vector line, fulfils the condition of objective generalization for elements of digital maps, and may be used to transform open and closed vector lines. The paper discusses the operation of this algorithm, along with a graphic presentation of the generalization results for vector lines and an analysis of their accuracy. Treating the set of verification radii of a vector line as a statistical series, the authors propose applying statistical indices of the position of these series, connected with the shape of the vector line, as the threshold parameters of generalization. The developed algorithm allows the generalization parameters to be linked directly to the scale of the topographic map obtained after generalization. The results were compared to the results of vertex reduction with the Douglas-Peucker algorithm, and demonstrated that the proposed algorithm not only reduced the number of vertices, but also smoothed the shape of physiographic lines when applied to them. The authors demonstrated that the errors of smoothing and of vertex position did not exceed the acceptable values for the relevant scales of topographic maps. The developed algorithm allows the surface of the generalized areas to be adjusted to their initial value more precisely. Its advantage consists in the possibility of applying statistical indices that take the shape of lines into account to define the generalization parameters.
(This article belongs to the Special Issue Smart Cartography for Big Data Solutions)
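The paper's own algorithm and statistical thresholds are not reproduced here, but the core quantity it analyses, the radius of curvature along a vector line, can be sketched as the circumradius of each triple of consecutive vertices. The `generalize` rule below (drop near-straight interior vertices whose local radius exceeds a threshold, keep the endpoints) is a hypothetical simplification for illustration only.

```python
import math

def circumradius(p1, p2, p3):
    """Radius of the circle through three points; infinite if collinear."""
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    cross = (p2[0] - p1[0]) * (p3[1] - p1[1]) \
          - (p2[1] - p1[1]) * (p3[0] - p1[0])
    area = abs(cross) / 2.0
    if area == 0:
        return math.inf          # collinear: zero curvature
    return (a * b * c) / (4.0 * area)

def generalize(line, r_threshold):
    """Keep endpoints and high-curvature vertices (small circumradius)."""
    kept = [line[0]]
    for i in range(1, len(line) - 1):
        if circumradius(line[i - 1], line[i], line[i + 1]) <= r_threshold:
            kept.append(line[i])
    kept.append(line[-1])
    return kept
```

A vertex lying on a straight segment has infinite radius and is always removed, while sharp bends (small radius) survive; linking `r_threshold` to the target map scale is the step the paper addresses with its statistical indices.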

Review


Open Access Review
Performance Testing on Marker Clustering and Heatmap Visualization Techniques: A Comparative Study on JavaScript Mapping Libraries
ISPRS Int. J. Geo-Inf. 2019, 8(8), 348; https://doi.org/10.3390/ijgi8080348 - 01 Aug 2019
Cited by 1
Abstract
We are now generating exponentially more data from more sources than a few years ago. Big data, an already familiar term, has generally been defined as a massive volume of structured, semi-structured, and/or unstructured data that may not be effectively managed and processed using traditional databases and software techniques. Visualizing a large amount of data easily and quickly via an Internet platform can be problematic. From this perspective, the main aim of the paper is to test the point data visualization capabilities of selected JavaScript mapping libraries in order to measure their performance and ability to cope with large amounts of data. Nine datasets containing 10,000 to 3,000,000 points were generated from the Nature Conservation Database. Five libraries for marker clustering and two libraries for heatmap visualization were analyzed. Loading time and the ability to visualize large datasets were compared for each dataset and each library. The best-evaluated library was Mapbox GL JS (Graphics Library JavaScript), with the highest overall performance. Some of the tested libraries were not able to handle the desired amount of data. In general, fewer than 100,000 points was indicated as the threshold for implementation without a noticeable slowdown in performance; beyond that, library choice can become a limiting factor for point data visualization in today's dynamic web environment.
(This article belongs to the Special Issue Smart Cartography for Big Data Solutions)
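The marker clustering that such libraries perform is typically grid- or distance-based. As a simplified stand-in for what they do internally (not the paper's benchmark code, and with invented data), a minimal grid-based variant buckets points into square cells and replaces each cell's points by one cluster marker with a count:

```python
from collections import defaultdict

def grid_cluster(points, cell_size):
    """Bucket (x, y) points into square cells; one cluster per cell,
    placed at the centroid of its members with a member count."""
    cells = defaultdict(list)
    for x, y in points:
        cells[(int(x // cell_size), int(y // cell_size))].append((x, y))
    clusters = []
    for members in cells.values():
        cx = sum(p[0] for p in members) / len(members)
        cy = sum(p[1] for p in members) / len(members)
        clusters.append({"x": cx, "y": cy, "count": len(members)})
    return clusters
```

Because only one marker per occupied cell is rendered, the number of drawn symbols is bounded by the visible grid rather than by the dataset size, which is why clustering keeps large point sets responsive.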
