Special Issue "Big Data Challenges in Smart Cities"

A special issue of Data (ISSN 2306-5729).

Deadline for manuscript submissions: closed (30 November 2018).

Special Issue Editor

Guest Editor
Prof. Dr. Robert Laurini

Knowledge Systems Institute, Chicago, IL, USA; Institut National des Sciences Appliquées de Lyon, University of Lyon, France
Interests: Smart Cities

Special Issue Information

Dear Colleagues,

Every day, local authorities collect enormous volumes of data, and they urgently need to know whether those data can be useful in decision-making. These so-called big data come from a variety of sources: real-time sensors for air pollution, traffic management, and energy management; video surveillance; administrative forms; 2D or 3D GIS data; GPS tracks; aerial photos; videos from drones; and crowdsourcing for volunteered geographic information (VGI) and public participation.

For local administrators and elected officials in smart cities, the optimal use of their big data is very important, since ICT must be not merely their main resource, but the very core of their smart governance.

Various challenges are emerging: How to structure big data? How to combine them efficiently? How to query them? How to extract knowledge? How to extract salient features and determine patterns and trends? How to combine them with deep learning? How to visualize them? How to integrate them into urban dashboards? How to preserve privacy? What are the best strategies for storing them? Surely many other challenges will appear.

In this Special Issue, we are especially interested in original papers addressing these challenges, describing novel experiences, or enriching big data theories with geographic aspects.

Prof. Dr. Robert Laurini
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Data is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • big data
  • smart cities
  • smart governance
  • urban knowledge extraction
  • geographic knowledge

Published Papers (6 papers)


Research

Jump to: Review, Other

Open Access Article
Innovating Metrics for Smarter, Responsive Cities
Received: 31 January 2019 / Revised: 31 January 2019 / Accepted: 2 February 2019 / Published: 6 February 2019
Abstract
This paper explores the emerging and evolving landscape for metrics in smart cities in relation to big data challenges. Based on a review of the research literature, the problem of “synthetic quantitative indicators” along with concerns for “measuring urban realities” and “making metrics meaningful” are identified. In response, the purpose of this paper is to advance the need for innovating metrics for smarter, more interactive and responsive cities in addressing and mitigating algorithmic-related challenges on the one hand, and concerns associated with involving people more meaningfully on the other hand. As such, the constructs of awareness, learning, openness, and engagement are employed in this study. Using an exploratory case study approach, the research design for this work includes the use of multiple methods of data collection including survey and interviews. Employing a combination of content analysis for qualitative data and descriptive statistics for quantitative data, the main findings of this work support the need for rethinking and innovating metrics. As such, the main conclusion of this paper highlights the potential for developing new pathways and spaces for involving people more directly, knowingly, and meaningfully in addressing big and small data challenges for the innovating of urban metrics. Full article
(This article belongs to the Special Issue Big Data Challenges in Smart Cities)

Open Access Article
Data Governance and Sovereignty in Urban Data Spaces Based on Standardized ICT Reference Architectures
Received: 30 November 2018 / Revised: 15 January 2019 / Accepted: 15 January 2019 / Published: 18 January 2019
Abstract
European cities and communities (and beyond) require a structured overview and a set of tools to achieve a sustainable transformation towards smarter cities/municipalities, leveraging the enormous potential of the emerging data-driven economy. This paper presents the results of a recent study that was conducted with a number of German municipalities/cities. Based on the recommendations emerging from the study, which are briefly presented, the authors propose the concept of an Urban Data Space (UDS), which facilitates an ecosystem for data exchange and added-value creation by utilizing the various types of data within a smart city/municipality. Looking at an Urban Data Space from within a German context and considering the current situation and developments in German municipalities, this paper proposes a reasonable classification of urban data that allows various data types to be related to legal aspects and supports solid considerations regarding technical implementation designs and decisions. Furthermore, the Urban Data Space is described and analyzed in detail, relevant stakeholders are identified, and corresponding technical artifacts are introduced. The authors propose to set up Urban Data Spaces based on emerging standards from the area of ICT reference architectures for Smart Cities, such as DIN SPEC 91357 "Open Urban Platform" and EIP SCC. In the course of this, the paper walks the reader through the construction of a UDS based on the above-mentioned architectures and outlines the goals, recommendations, and potentials that an Urban Data Space can reveal to a municipality/city. Finally, we aim to derive the proposed concepts in such a way that they have the potential to be part of the required set of tools for the sustainable transformation of German and European cities towards smarter urban environments, based on utilizing the hidden potential of digitalization and efficient interoperable data exchange.
Full article

Open Access Article
An Effective and Efficient Adaptive Probability Data Dissemination Protocol in VANET
Received: 27 November 2018 / Revised: 15 December 2018 / Accepted: 18 December 2018 / Published: 21 December 2018
Abstract
Mobile network topology changes dynamically over time because of the high velocity of vehicles. Therefore, data dissemination schemes in VANET environments have become a subject of debate for many researchers. A main purpose of a VANET is to support passenger safety applications through the dissemination of critical emergency messages. The design of a message dissemination protocol should take effective data dissemination into consideration, providing a high packet delivery ratio and low end-to-end delay while using minimal network resources. In this paper, an effective and efficient adaptive probability data dissemination protocol (EEAPD) is proposed. EEAPD comprises a delay scheme and a probabilistic approach. The redundancy ratio (r) metric is used to capture the correlation between road segments and vehicle density in rebroadcast probability decisions. The uniqueness of the EEAPD protocol comes from taking the number of road segments into account when deciding which nodes are suitable for rebroadcasting the emergency message. The last road segment in the transmission range is considered because of the probability of it having a small vehicle density. Simulation results show that the proposed protocol provides a higher packet delivery ratio and a lower packet drop ratio, making better use of network resources while keeping end-to-end delay low. The protocol is designed for V2V communication only and follows a beaconless strategy. The simulations in this study were conducted using Ns-3.26 and the traffic simulator SUMO. Full article
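The abstract above does not spell out EEAPD's exact formula, but the general idea of scaling rebroadcast probability inversely with a redundancy ratio can be sketched in a few lines. All names and the linear mapping below are our own illustrative assumptions, not the published protocol:

```python
import random

def rebroadcast_probability(duplicates_heard: int, neighbours: int) -> float:
    """Map a redundancy ratio r (duplicate receptions per neighbour) to a
    rebroadcast probability: the more copies of a message a node has already
    heard, the less useful another rebroadcast is."""
    if neighbours == 0:
        return 1.0  # isolated node: always rebroadcast to bridge the gap
    r = duplicates_heard / neighbours  # redundancy ratio
    return max(0.1, 1.0 - r)  # keep a small floor so the message survives

def should_rebroadcast(duplicates_heard: int, neighbours: int,
                       rng=random.random) -> bool:
    """Probabilistic rebroadcast decision (rng injectable for testing)."""
    return rng() < rebroadcast_probability(duplicates_heard, neighbours)
```

A delay scheme such as EEAPD's would typically be layered on top, staggering when each candidate node evaluates this decision.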

Open Access Article
Congestion Adaptive Traffic Light Control and Notification Architecture Using Google Maps APIs
Received: 19 September 2018 / Revised: 11 November 2018 / Accepted: 12 December 2018 / Published: 14 December 2018
Cited by 1
Abstract
Traffic jams can be avoided by controlling traffic signals according to quickly building congestion with steep gradients on short temporal and small spatial scales. With the rising standards of computational technology, single-board computers, software packages, platforms, and APIs (Application Program Interfaces), it has become relatively easy for developers to create systems for controlling signals and informative systems. Hence, for enhancing the power of Intelligent Transport Systems in automotive telematics, in this study, we used crowdsourced traffic congestion data from Google to adjust traffic light cycle times with a system that is adaptable to congestion. One aim of the system proposed here is to inform drivers about the status of the upcoming traffic light on their route. Since crowdsourced data are used, the system does not entail the high infrastructure cost associated with sensing networks. A full system module-level analysis is presented for implementation. The system proposed is fail-safe against temporal communication failure. Along with a case study for examining congestion levels, generic information processing for the cycle time decision and status delivery system was tested and confirmed to be viable and quick for a restricted prototype model. The information required was delivered correctly over sustained trials, with an average time delay of 1.5 s and a maximum of 3 s. Full article
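As a rough illustration of the congestion-adaptive idea (not the authors' implementation; the level scale, timings, and function name are assumptions), the green-phase duration could scale with a crowdsourced congestion level such as the colour codes a map service reports:

```python
def green_time(congestion_level: int, base: float = 30.0,
               step: float = 10.0, max_green: float = 90.0) -> float:
    """Lengthen the green phase as crowdsourced congestion rises.

    congestion_level: e.g. 0 = free flow .. 4 = heavy jam, as a
    colour-coded traffic layer might report for the approach road.
    The result is capped so cross traffic is never starved.
    """
    return min(base + step * congestion_level, max_green)
```

A controller loop would poll the congestion source periodically, recompute `green_time` for each approach, and fall back to the fixed `base` cycle on communication failure, matching the fail-safe behaviour the abstract describes.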

Review

Jump to: Research, Other

Open Access Review
Deep Learning in Data-Driven Pavement Image Analysis and Automated Distress Detection: A Review
Received: 15 June 2018 / Revised: 5 July 2018 / Accepted: 18 July 2018 / Published: 24 July 2018
Cited by 4
Abstract
Deep learning, more specifically deep convolutional neural networks, is fast becoming a popular choice for computer vision-based automated pavement distress detection. While pavement image analysis has been extensively researched over the past three decades or so, recent ground-breaking achievements of deep learning algorithms in the areas of machine translation, speech recognition, and computer vision have sparked interest in the application of deep learning to automated detection of distresses in pavement images. This paper provides a narrative review of recently published studies in this field, highlighting the current achievements and challenges. A comparison of the deep learning software frameworks, network architecture, hyper-parameters employed by each study, and crack detection performance is provided, which is expected to provide a good foundation for driving further research on this important topic in the context of smart pavement or asset management systems. The review concludes with potential avenues for future research, especially in the application of deep learning to not only detect, but also characterize the type, extent, and severity of distresses from 2D and 3D pavement images. Full article
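For readers new to the topic, the convolution-plus-activation step at the core of the networks this review surveys can be illustrated with a toy, dependency-free sketch (ours, not taken from any reviewed study): a 1x3 kernel that responds to a dark vertical line in a grayscale patch, a crude stand-in for a crack.

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (cross-correlation) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            s = sum(image[y + j][x + i] * kernel[j][i]
                    for j in range(kh) for i in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(fmap):
    """Rectified linear activation, applied element-wise."""
    return [[max(0.0, v) for v in row] for row in fmap]

# Bright pavement (1) with a dark vertical crack (0) down the middle.
patch = [[1, 0, 1],
         [1, 0, 1],
         [1, 0, 1]]
# A kernel that fires on dark-line-between-bright-neighbours patterns.
crack_kernel = [[1, -2, 1]]
response = relu(conv2d(patch, crack_kernel))
```

A real distress detector stacks many such learned kernels over dozens of layers; the reviewed studies differ mainly in how those layers are arranged and trained.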

Other

Jump to: Research, Review

Open Access Data Descriptor
The Historical Small Smart City Protocol (HISMACITY): Toward an Intelligent Tool Using Geo Big Data for the Sustainable Management of Minor Historical Assets
Received: 30 November 2018 / Revised: 7 February 2019 / Accepted: 7 February 2019 / Published: 13 February 2019
Abstract
This research reports the ongoing design of the HISMACITY (Historical Small Smart City) Protocol, a planning tool with a certification system. The tool is designed for small municipalities in Europe. Through the award-winning certification system, the Protocol supports the fulfillment of best practices. Such practices can enhance town attractiveness. It also counteracts excessive land use that results from urban growth, and reduces demographic decline in internal areas of each country. The research methodology is grounded on building a dynamic dataset using geo big data, local data, and mobile data via information communications technology (ICT), and real-time data through sensors. The tool aims to build algorithms to calculate indicators that measure quality standards of integrated interventions. The aim is to reach specific goals within defined priority areas of the Historical Small Smart City Protocol. Being highly adaptive, the framework follows urban responsive design principles based on weighted suitability models that can be calibrated by changing the input data and the weights of the linear combination formula. The results highlight varying framework data, including the tool’s development procedures and practicality. Full article
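The weighted linear combination behind a suitability model of the kind the abstract mentions can be sketched minimally as follows; the indicator names and weights are hypothetical, since the protocol's actual indicators and calibration are not given here:

```python
def suitability(indicators: dict, weights: dict) -> float:
    """Weighted linear combination of normalised indicators (each in [0, 1]).

    Weights are renormalised by their sum, so the score also stays in
    [0, 1]; recalibrating the model means changing inputs or weights.
    """
    total = sum(weights.values())
    return sum(indicators[k] * w for k, w in weights.items()) / total

# Hypothetical indicators for a small historical town.
score = suitability(
    indicators={"heritage_condition": 1.0, "accessibility": 0.5},
    weights={"heritage_condition": 1.0, "accessibility": 1.0},
)
```

Changing the weights shifts which priority area dominates the certification score, which is what makes such a framework calibratable per municipality.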

Data EISSN 2306-5729, published by MDPI AG, Basel, Switzerland