
Intelligent Internet of Things, Sensor, and AR/VR Technology for Smart Cities

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensor Networks".

Deadline for manuscript submissions: closed (31 August 2022) | Viewed by 7344

Special Issue Editors

School of Computer Science and Engineering, Korea University of Technology and Education, Chungnam, Korea
Interests: reinforcement learning; sensor and actuator networks; mobile sensor networks; intelligent IoT and CPS systems; smart cities
Department of Computer Science, University of Texas at Dallas, Richardson, TX, USA
Interests: human–computer interaction; haptics; virtual reality; wearable computing
Protocol Engineering Laboratory, Department of Computer and Information Security, Sejong University, Seoul 05006, Republic of Korea
Interests: authentication; privacy; mobility management; protocol analysis
PRECISE Center, School of Engineering and Applied Science, University of Pennsylvania, Philadelphia, PA, USA
Interests: software-defined networks; network function virtualization; quality of service; mobility support in heterogeneous networks

Special Issue Information

Dear Colleagues,

Today, there are 37 megacities with more than 10 million inhabitants and some 220 large cities with more than 1 million. The traffic congestion, energy consumption, waste, and pollution that accompany this growth carry severe economic and social costs for the citizens of those cities, making their efficient and effective management and control essential. The recent convergence of Internet of Things (IoT)/sensor devices, their networks, and the augmented reality/virtual reality (AR/VR) technologies built on them is creating new demands and opportunities for tackling these problems in smart cities. In particular, the growing number of heterogeneous IoT, sensor, and actuator devices, together with the demanding wired/wireless communications of their integrated systems, poses challenges spanning both traditional research problems and new, innovative research issues for smart cities. Moreover, human and non-human systems based on recent AR/VR technology interact continuously, enabling a range of non-trivial smart interactions in smart cities.

This Special Issue aims to gather recent advanced technologies and applications that address intelligent IoT/sensor devices' low-latency networking, context-aware interaction, energy efficiency, resource management, security, human–robot interaction, assistive technology and robots, application development, and the integration of multiple systems that support smart interaction in smart cities. High-quality, original research papers covering key issues and topics of smart cities will be considered for publication. Accordingly, this Special Issue will bring together researchers from industry and academia to highlight current and future directions in these areas.

Topics for this Special Issue include but are not limited to:

* Human-centric context management for interaction in smart cities

* IoT/sensor management for advanced (mobile) ubiquitous applications for smart cities

* IoT/sensor applications and middleware support for smart cities

* Smart low latency and real-time IoT/sensor communication

* IoT/sensor energy efficiency

* Distributed sensing, actuation, control, and coordination

* Network resilience, fault-tolerance, and reliability

* Privacy, security, and trust management of IoT/sensor devices

* IoT-based blockchain technology and applications for smart cities

* Intelligent resource management for smart cities

* Functional computation and data aggregation

* Distributed algorithms and intelligent network optimization for smart cities

* Measurements from experimental systems for smart cities

* Prototypes, testbeds, and real-world deployments of intelligent systems for smart cities

* Smart AR/VR interaction for smart cities

* Immersive haptic interaction for smart cities

* Tactile internet for smart cities

* AR/VR devices and technologies integrated with IoT/sensors for smart cities

* Reasoning and learning techniques for human-centric interaction in smart cities

Prof. Dr. Youn-Hee Han
Dr. Jin Ryong Kim
Dr. Hyon-Young Choi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)

25 pages, 5199 KiB  
Article
Recognizing and Counting Freehand Exercises Using Ubiquitous Cellular Signals
by Guanlong Teng, Yue Xu, Feng Hong, Jianbo Qi, Ruobing Jiang, Chao Liu and Zhongwen Guo
Sensors 2021, 21(13), 4581; https://doi.org/10.3390/s21134581 - 04 Jul 2021
Viewed by 2048
Abstract
Freehand exercises help improve physical fitness without any requirements for devices or places. Existing fitness assistant systems are typically restricted to wearable devices or to exercising at specific positions, compromising the ubiquitous availability of freehand exercises. In this paper, we develop MobiFit, a contactless freehand exercise assistant using just one cellular signal receiver placed on the ground. MobiFit passively monitors the ubiquitous cellular signals sent by the base station, which frees users from space constraints and deployment overheads and provides accurate repetition counting, exercise type recognition, and workout quality assessment without any attachments to the human body. The design of MobiFit faces new challenges from uncertainty not only in the cellular signal payloads but also in signal propagation, because the sender (the base station) is far away and beyond MobiFit's control. To tackle these challenges, we conducted experimental studies to observe the received cellular signal sequence during freehand exercises. Based on the observations, we constructed an analytic model of the received signals. Guided by the insights derived from this model, MobiFit segments out every repetition and rest interval from an exercise session through spectrogram analysis and extracts low-frequency features from each repetition for type recognition. Extensive experiments were conducted in both indoor and outdoor environments, collecting 22,960 exercise repetitions performed by ten volunteers over six months. The results confirm that MobiFit achieves a high counting accuracy of 98.6%, a high recognition accuracy of 94.1%, and a low repetition duration estimation error within 0.3 s. In addition, the experiments show that MobiFit works both indoors and outdoors and supports multiple users exercising together.
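The core idea of counting repetitions from a low-frequency motion signature can be illustrated with a minimal sketch. This is not MobiFit's actual pipeline; it assumes a hypothetical received-signal-strength trace in which each repetition produces one dominant low-frequency peak, and simply smooths the trace and counts peaks:

```python
import numpy as np
from scipy.signal import find_peaks

def count_repetitions(trace, fs, min_period=1.0):
    """Count exercise repetitions in a signal-strength trace by
    detecting peaks in its low-frequency envelope."""
    # Moving-average smoothing keeps only the slow body-motion component.
    win = int(fs * 0.5)
    envelope = np.convolve(trace, np.ones(win) / win, mode="same")
    # One peak per repetition; a minimum spacing and prominence keep
    # noise wiggles from being counted as extra repetitions.
    peaks, _ = find_peaks(envelope,
                          distance=int(fs * min_period),
                          prominence=0.5 * envelope.std())
    return len(peaks)

# Synthetic trace: 10 repetitions at 0.4 Hz, sampled at 50 Hz, plus noise.
fs = 50
t = np.arange(0, 25, 1 / fs)
rng = np.random.default_rng(0)
trace = np.sin(2 * np.pi * 0.4 * t) + 0.2 * rng.standard_normal(t.size)
print(count_repetitions(trace, fs))  # expect 10
```

The real system additionally separates rest intervals and classifies exercise types from per-repetition spectral features, which this sketch omits.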

12 pages, 3269 KiB  
Article
Compound Context-Aware Bayesian Inference Scheme for Smart IoT Environment
by Ihsan Ullah, Ju-Bong Kim and Youn-Hee Han
Sensors 2022, 22(8), 3022; https://doi.org/10.3390/s22083022 - 14 Apr 2022
Cited by 4 | Viewed by 1279
Abstract
The objective of smart cities is to improve the quality of life for citizens by using Information and Communication Technology (ICT). A smart IoT environment consists of multiple sensor devices that continuously produce large amounts of data. In an IoT system, accurate inference from multi-sensor data is imperative for making correct decisions. Sensor data are often imprecise, resulting in low-quality inference results and wrong decisions. Likewise, single-context data are insufficient for making an accurate decision. In this paper, a novel compound context-aware scheme based on Bayesian inference is proposed to achieve accurate fusion and inference from sensory data. In the proposed scheme, multi-sensor data are fused according to the relations among sensor contexts, i.e., whether or not they depend on each other. Extensive computer simulations show that the proposed technique significantly improves inference accuracy compared with two other representative Bayesian inference techniques.
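The basic fusion step behind such schemes can be sketched with Bayes' rule over a discrete context. The example below is illustrative, not the paper's scheme: it assumes two hypothetical sensors (motion and CO2) that are conditionally independent given the room state, with made-up likelihoods, and fuses their readings into a posterior:

```python
# Hypothetical prior over the context state "is the room occupied?".
PRIOR = {"occupied": 0.3, "empty": 0.7}

# P(reading | state) for each sensor, assumed conditionally independent
# given the state (the naive-Bayes case; dependent sensors would need a
# joint likelihood table instead).
LIKELIHOOD = {
    "motion": {"occupied": {"yes": 0.9, "no": 0.1},
               "empty":    {"yes": 0.05, "no": 0.95}},
    "co2":    {"occupied": {"high": 0.8, "low": 0.2},
               "empty":    {"high": 0.1, "low": 0.9}},
}

def fuse(readings):
    """Posterior over the context state after fusing all sensor readings."""
    post = dict(PRIOR)
    for sensor, value in readings.items():
        for state in post:
            post[state] *= LIKELIHOOD[sensor][state][value]
    total = sum(post.values())           # normalize to a distribution
    return {state: p / total for state, p in post.items()}

posterior = fuse({"motion": "yes", "co2": "high"})
print(posterior)  # posterior mass concentrates on "occupied"
```

Two weak, noisy readings that individually leave doubt combine into a confident posterior; handling sensors whose errors are correlated (the "compound context" case) requires modeling their dependence rather than multiplying per-sensor likelihoods.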

13 pages, 13987 KiB  
Technical Note
Three-Dimensional Engine-Based Geometric Model Optimization Algorithm for BIM Visualization with Augmented Reality
by Pa Pa Win Aung, Woonggyu Choi, Almo Senja Kulinan, Gichun Cha and Seunghee Park
Sensors 2022, 22(19), 7622; https://doi.org/10.3390/s22197622 - 08 Oct 2022
Cited by 4 | Viewed by 2040
Abstract
Building information modeling (BIM), a common technology contributing to information processing, is extensively applied in construction. BIM integration with augmented reality (AR) is flourishing in the construction industry, as it provides an effective solution across the lifecycle of a project. However, when transferring BIM data to AR, large and complicated models require large storage spaces, increase model transfer time and the data processing workload during rendering, and reduce visualization efficiency on AR devices. Geometric optimization of the model using mesh reconstruction is a potential solution that can reduce the required storage while maintaining the shape of the components. In this study, a 3D engine-based mesh reconstruction algorithm is proposed that pre-processes BIM shape data and implements an AR-based full-size model, which is likely to increase the efficiency of decision making and project processing for construction management. As shown in the experimental validation, the proposed algorithm significantly reduces the number of vertices and triangles and the storage required for geometric models while maintaining the overall shape. Moreover, the elements and components of the optimized model have the same visual quality as the original model; thus, high performance can be expected for BIM visualization on AR devices.
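The vertex/triangle reduction the paper measures can be illustrated with the simplest classical simplification scheme, vertex clustering. This sketch is not the paper's engine-based algorithm; it assumes a plain indexed triangle mesh and snaps vertices to a uniform grid, merging each cell's vertices into their centroid:

```python
import numpy as np

def cluster_simplify(vertices, triangles, cell):
    """Simplify a triangle mesh by vertex clustering: all vertices that
    fall in the same grid cell of size `cell` collapse to their centroid."""
    keys = np.floor(np.asarray(vertices) / cell).astype(int)
    _, cluster_id, counts = np.unique(keys, axis=0,
                                      return_inverse=True,
                                      return_counts=True)
    cluster_id = cluster_id.ravel()
    # New vertex = centroid of its cluster.
    new_vertices = np.zeros((counts.size, 3))
    np.add.at(new_vertices, cluster_id, vertices)
    new_vertices /= counts[:, None]
    # Remap triangles; drop those collapsed to fewer than 3 distinct vertices.
    remapped = cluster_id[np.asarray(triangles)]
    keep = [tri for tri in remapped if len(set(tri)) == 3]
    return new_vertices, np.array(keep)

# Demo: a flat 1x1 square tessellated into a 10x10 vertex grid.
n = 10
xs, ys = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
verts = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(n * n)])
tris = []
for i in range(n - 1):
    for j in range(n - 1):
        a = i * n + j
        tris.append([a, a + 1, a + n])
        tris.append([a + 1, a + n + 1, a + n])
v2, t2 = cluster_simplify(verts, np.array(tris), cell=0.5)
print(len(verts), "->", len(v2), "vertices")
```

Vertex clustering is fast but can destroy fine detail; production pipelines typically use error-driven methods (e.g., quadric edge collapse) that preserve component shape better, which is closer to the quality goal the paper targets.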
