Special Issue "Robotic Sensing for Smart Cities"

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensor Networks".

Deadline for manuscript submissions: 31 December 2020.

Special Issue Editors

Prof. Dr. Hyun Myung
Guest Editor
Korea Advanced Institute of Science and Technology, Daejeon, Korea
Interests: autonomous robot navigation; SLAM (simultaneous localization and mapping); structural inspection robots; machine learning; AI
Dr. Yang Wang
Co-Guest Editor
Associate Professor, School of Civil and Environmental Engineering & School of Electrical and Computer Engineering (Adjunct), Georgia Institute of Technology, 790 Atlantic Dr NW, Atlanta, GA 30332-0355, USA
Interests: structural health monitoring and damage detection; decentralized structural control; wireless and mobile sensors; structural dynamics

Special Issue Information

Dear Colleague,

For several decades, various sensors and sensing systems have been developed for smart cities and civil infrastructure systems. This Special Issue focuses on state-of-the-art robotics and automation technologies for smart cities, including construction automation, robotics, instrumentation, monitoring, inspection, control, and rehabilitation. It also covers construction informatics supporting the sensing, analysis, and design activities needed to construct and operate a smart and sustainable built environment.

Examples include robotic systems for smart cities equipped with various sensing technologies, such as vision-based sensors, laser sensors, wireless sensors, and multi-sensor fusion. Service robots and robotized devices that increase the usability and safety of the built environment are also good examples.

A Special Issue in these areas will be published in the journal Sensors to disseminate recent advances in robotics and automation technologies for smart cities. Papers should contain theoretical and/or experimental results and will be subject to formal review procedures.

The particular topics of interest include but are not limited to:

  • Robotic sensing systems for smart cities or civil infrastructure;
  • Bio-inspired sensing for smart cities or civil infrastructure;
  • IT and robotics for construction automation or construction management;
  • Structural health monitoring or inspection using robotics technologies;
  • Mobile sensor networks for smart cities or civil infrastructure;
  • Damage repair and emergency handling control for smart cities or civil infrastructure;
  • Service robots and robotized devices for the built environment.

Prof. Dr. Hyun Myung
Dr. Yang Wang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2000 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Robotic sensing
  • Smart cities
  • Civil infrastructure
  • Structural health monitoring
  • Inspection robots
  • Mobile sensor network
  • Service robots

Published Papers (5 papers)


Research

Open Access Article
Drivers’ Visual Perception Quantification Using 3D Mobile Sensor Data for Road Safety
Sensors 2020, 20(10), 2763; https://doi.org/10.3390/s20102763 - 12 May 2020
Abstract
To prevent driver accidents in cities, local governments have established policies to limit city speeds and create child protection zones near schools. However, if the same policy is applied throughout a city, it can be difficult to maintain smooth traffic flow. A driver generally obtains visual information while driving, and this information is directly related to traffic safety. In this study, we propose a novel geometric visual model to measure drivers’ visual perception and analyze the corresponding information using the line-of-sight method. Three-dimensional point cloud data are used to analyze on-site three-dimensional elements in a city, such as roadside trees and overpasses, which are normally neglected in urban spatial analyses. To investigate drivers’ visual perception of roads, we developed an analytic model of three types of visual perception. Using the proposed method, this study creates a risk-level map according to the driver’s degree of visual perception in Pangyo, South Korea. With the point cloud data from Pangyo, it is possible to analyze actual urban forms such as roadside trees, building shapes, and overpasses that are normally excluded from spatial analyses that use a reconstructed virtual space.
(This article belongs to the Special Issue Robotic Sensing for Smart Cities)
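The line-of-sight idea in this abstract can be illustrated with a minimal sketch (not the authors' implementation; all names, voxel sizes, and step lengths here are hypothetical): obstacle points from the 3D point cloud, such as roadside trees or overpasses, are voxelized, and a driver-to-target sight ray is considered blocked if any sample along it falls in an occupied voxel.

```python
import numpy as np

def line_of_sight(eye, target, obstacle_points, voxel=1.0, step=0.5):
    """Check whether the segment eye->target is free of obstacle points.

    Obstacle points (e.g. from a 3D point cloud) are voxelized; the ray
    is sampled at fixed steps and blocked if any sample falls in an
    occupied voxel.
    """
    occupied = {tuple(np.floor(p / voxel).astype(int)) for p in obstacle_points}
    eye, target = np.asarray(eye, float), np.asarray(target, float)
    dist = np.linalg.norm(target - eye)
    direction = (target - eye) / dist
    for t in np.arange(step, dist, step):
        cell = tuple(np.floor((eye + t * direction) / voxel).astype(int))
        if cell in occupied:
            return False  # view blocked by an occupied voxel
    return True

# A tree-canopy point directly between driver and pedestrian blocks the view.
tree = np.array([[5.0, 0.0, 1.5]])
print(line_of_sight([0, 0, 1.5], [10, 0, 1.5], tree))  # blocked -> False
print(line_of_sight([0, 0, 1.5], [10, 5, 1.5], tree))  # clear -> True
```

Aggregating such checks over many target points around a driver position would yield a per-location visibility score, from which a risk-level map could be drawn.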

Open Access Article
Deep-Learning-Based Indoor Human Following of Mobile Robot Using Color Feature
Sensors 2020, 20(9), 2699; https://doi.org/10.3390/s20092699 - 09 May 2020
Cited by 2
Abstract
Human following is one of the fundamental functions in human–robot interaction for mobile robots. This paper shows a novel framework with state-machine control in which the robot tracks the target person in occlusion and illumination changes, as well as navigates with obstacle avoidance while following the target to the destination. People are detected and tracked using a deep learning algorithm, called Single Shot MultiBox Detector, and the target person is identified by extracting the color feature using the hue-saturation-value histogram. The robot follows the target safely to the destination using a simultaneous localization and mapping algorithm with the LIDAR sensor for obstacle avoidance. We performed intensive experiments on our human following approach in an indoor environment with multiple people and moderate illumination changes. Experimental results indicated that the robot followed the target well to the destination, showing the effectiveness and practicability of our proposed system in the given environment.
(This article belongs to the Special Issue Robotic Sensing for Smart Cities)
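The color-feature step described above can be sketched roughly as follows (an illustrative approximation, not the paper's code; function names and bin counts are assumptions): a hue-saturation histogram is computed for each detected person crop, and the crop whose histogram best matches the enrolled target, e.g. by histogram intersection, is taken as the target person.

```python
import colorsys
import numpy as np

def hs_histogram(patch, bins=8):
    """Normalized hue-saturation histogram of an RGB patch (H x W x 3, 0-255)."""
    rgb = patch.reshape(-1, 3) / 255.0
    # Per-pixel RGB -> HSV; keep only hue and saturation (drop value,
    # which is the channel most affected by illumination changes).
    hs = np.array([colorsys.rgb_to_hsv(*px)[:2] for px in rgb])
    hist, _, _ = np.histogram2d(hs[:, 0], hs[:, 1], bins=bins,
                                range=[[0, 1], [0, 1]])
    return hist.ravel() / hist.sum()

def similarity(h1, h2):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    return float(np.minimum(h1, h2).sum())

# The detector yields one crop per person; the target is the best match.
red_shirt = np.zeros((16, 16, 3), np.uint8); red_shirt[..., 0] = 200
blue_shirt = np.zeros((16, 16, 3), np.uint8); blue_shirt[..., 2] = 200
target = hs_histogram(red_shirt)
print(similarity(target, hs_histogram(red_shirt)))   # 1.0
print(similarity(target, hs_histogram(blue_shirt)))  # 0.0
```

Dropping the value channel is what gives this kind of feature its modest robustness to the illumination changes the abstract mentions.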

Open Access Article
Mobile Robot Self-Localization with 2D Push-Broom LIDAR in a 2D Map
Sensors 2020, 20(9), 2500; https://doi.org/10.3390/s20092500 - 28 Apr 2020
Cited by 1
Abstract
This paper proposes mobile robot self-localization based on an onboard 2D push-broom (i.e., tilted-down) LIDAR using a reference 2D map previously obtained with a 2D horizontal LIDAR. The hypothesis of this paper is that a 2D reference map created with a 2D horizontal LIDAR mounted on a mobile robot or another mobile device can be used by another mobile robot to determine its own location using the same 2D LIDAR tilted down. The motivation for tilting down a 2D LIDAR is the direct detection of holes or small objects on the ground that remain undetected by a fixed horizontal 2D LIDAR. The experimental evaluation of this hypothesis has demonstrated that self-localization with a 2D push-broom LIDAR is possible by detecting and deleting the ground and ceiling points from the scan data and projecting the remaining scan points onto the horizontal plane of the 2D reference map before applying a 2D self-localization algorithm. Therefore, an onboard 2D push-broom LIDAR offers self-localization and accurate ground supervision without requiring an additional motorized device to change the tilt of the LIDAR in order to combine these two characteristics in a mobile robot.
(This article belongs to the Special Issue Robotic Sensing for Smart Cities)
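The geometric steps the abstract describes, lifting the tilted scan into 3D, discarding ground and ceiling returns, and projecting the rest onto the map plane, can be sketched as follows (a minimal illustration under assumed conventions, not the authors' implementation; thresholds and frame definitions are hypothetical):

```python
import numpy as np

def project_scan(scan_pts, tilt_deg, sensor_height, z_floor=0.05, z_ceiling=2.5):
    """Project a tilted-down 2D LIDAR scan into the horizontal map frame.

    scan_pts: N x 2 points (x forward, y left) in the sensor's scan plane.
    The sensor is pitched down by tilt_deg and mounted at sensor_height;
    points are lifted to 3D, ground and ceiling returns are removed, and
    the remaining points are projected onto the horizontal plane of the
    2D reference map.
    """
    a = np.radians(tilt_deg)
    x, y = scan_pts[:, 0], scan_pts[:, 1]
    # Rotate the scan plane down by the tilt and add the mounting height.
    wx, wy, wz = x * np.cos(a), y, sensor_height - x * np.sin(a)
    keep = (wz > z_floor) & (wz < z_ceiling)  # drop ground/ceiling hits
    return np.column_stack([wx[keep], wy[keep]])

scan = np.array([[2.924, 0.0],   # ray that reaches the ground -> dropped
                 [1.0, 0.5]])    # obstacle above the ground   -> kept
print(project_scan(scan, 20.0, 1.0))
```

The surviving (x, y) points are in the same horizontal frame as the 2D reference map, so any standard 2D scan-matching self-localization algorithm can consume them directly.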

Open Access Article
Human Interaction Smart Subsystem—Extending Speech-Based Human-Robot Interaction Systems with an Implementation of External Smart Sensors
Sensors 2020, 20(8), 2376; https://doi.org/10.3390/s20082376 - 22 Apr 2020
Cited by 1
Abstract
This paper presents a detailed concept of a Human-Robot Interaction system architecture. One of the main differences between the proposed architecture and existing ones is the methodology of acquiring information about the robot’s interlocutor. In order to obtain as much information as possible before the actual interaction takes place, a custom Internet-of-Things-based sensor subsystem connected to a Smart Infrastructure was designed and implemented to support interlocutor identification and the acquisition of initial interaction parameters. The Artificial Intelligence interaction framework of the developed robotic system (including a humanoid Pepper with its sensors and actuators, and additional local, remote, and cloud computing services) is extended with custom external subsystems for additional knowledge acquisition: device-based human identification, visual identification, and audio-based interlocutor localization. These subsystems are introduced and evaluated in detail, demonstrating the benefits of integrating them into the robotic interaction system. A more detailed analysis of one of the external subsystems, the Bluetooth Human Identification Smart Subsystem, is also included. The idea, use case, and prototype integration with elements of Smart Infrastructure systems were implemented and evaluated in a small front office of the Weegree company as a test-bed application area.
(This article belongs to the Special Issue Robotic Sensing for Smart Cities)
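The device-based identification idea can be illustrated with a deliberately simple sketch (hypothetical names and data throughout; the paper's actual subsystem is not shown here): enrolled devices are keyed by Bluetooth MAC address, and a scan result is mapped to interaction parameters before the conversation starts.

```python
# Hypothetical enrollment registry: known device MAC -> interlocutor profile.
KNOWN_DEVICES = {
    "AA:BB:CC:DD:EE:01": {"name": "Alice", "language": "en", "returning": True},
    "AA:BB:CC:DD:EE:02": {"name": "Bob", "language": "pl", "returning": False},
}

def identify(discovered_macs):
    """Return profiles of enrolled interlocutors among discovered devices,
    giving the robot initial interaction parameters (e.g. preferred
    language) before speech interaction begins."""
    return [KNOWN_DEVICES[m] for m in discovered_macs if m in KNOWN_DEVICES]

# One enrolled device and one unknown device are in Bluetooth range.
print(identify(["AA:BB:CC:DD:EE:02", "11:22:33:44:55:66"]))
```

In a real deployment the discovered MAC list would come from a Bluetooth scan performed by the Smart Infrastructure, and unknown devices would simply fall through to the default interaction flow.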

Open Access Article
Accessible Real-Time Surveillance Radar System for Object Detection
Sensors 2020, 20(8), 2215; https://doi.org/10.3390/s20082215 - 14 Apr 2020
Cited by 1
Abstract
As unmanned ground and aerial vehicles become more accessible and their range of applications widens, including uses for threatening purposes that can cause cascading damage, surveillance systems for public places are increasingly considered essential to respond to such threats. We propose a radar system that is less expensive, lighter, safer, and smaller than military-grade radar systems while retaining reasonable capability for monitoring public places. The paper details the iterative process of system design and improvement, with experiments, to realize a system usable for surveillance. The experiments demonstrate the practical use of the system and its configuration for a better understanding of its operation. Cyber-physical systems for outdoor environments can benefit from the system as a sensor for detecting objects as well as for monitoring.
(This article belongs to the Special Issue Robotic Sensing for Smart Cities)
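The abstract does not specify the detection algorithm, so as a generic illustration only (not the authors' method), a standard way such radar systems flag echoes against a varying noise floor is cell-averaging CFAR (constant false alarm rate) detection:

```python
import numpy as np

def ca_cfar(power, guard=2, train=8, scale=4.0):
    """Cell-averaging CFAR: a range cell is declared a detection when its
    power exceeds `scale` times the mean of the surrounding training
    cells, so the threshold adapts to the local noise level."""
    hits = []
    for i in range(train + guard, len(power) - train - guard):
        # Training cells on both sides, skipping the guard cells
        # adjacent to the cell under test.
        window = np.r_[power[i - train - guard : i - guard],
                       power[i + guard + 1 : i + guard + train + 1]]
        if power[i] > scale * window.mean():
            hits.append(i)
    return hits

profile = np.ones(128)   # flat noise floor (illustrative)
profile[60] = 40.0       # strong echo from one object
print(ca_cfar(profile))  # [60]
```

Because the threshold is relative to the local training-cell average, the false-alarm rate stays roughly constant even when the noise floor drifts across range, which is why CFAR variants are the usual starting point for inexpensive surveillance radars.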
