Wireless Edge Computing: Enabling Technologies for the Next Generation of Cloud Computing

A special issue of Information (ISSN 2078-2489). This special issue belongs to the section "Information and Communications Technology".

Deadline for manuscript submissions: closed (31 May 2023) | Viewed by 19860

Special Issue Editors


Dr. Diogo Mattos
Guest Editor
Department of Telecommunications Engineering, Universidade Federal Fluminense (UFF), Niterói 24210-240, Brazil
Interests: cloud computing; knowledge extraction; natural language processing; wireless networking; network security

Dr. Dianne Medeiros
Co-Guest Editor
Department of Telecommunications Engineering, Universidade Federal Fluminense (UFF), Niterói 24210-240, Brazil
Interests: cloud computing; knowledge extraction; natural language processing; wireless networking; network security

Special Issue Information

Dear Colleagues,

The development of ever-faster wireless data networks and the evolution of cloud computing technologies are enablers of the Internet of Things, Industry 4.0, and Smart Cities. High-speed wireless access to resources in the cloud drives significant changes in industry and in social relationships. Edge computing provides computing and storage resources close to end-users, usually within or at the edge of telecommunication operators’ networks, favoring the execution of latency-critical applications that require high throughput. However, the next generation of wireless edge computing technologies faces challenges in providing high bandwidth and low latency, extending the reach of access networks, supporting a high connection density at access points, and keeping energy consumption adequate.

This Special Issue seeks contributions reporting on recent advancements concerning wireless network technologies, mobile edge computing, knowledge extraction from the network, and unmanned aerial vehicle (UAV) networks. This includes novel technologies for deploying wireless access networks for edge computing applications, as well as discussions of the deployment of novel applications and frameworks. Moreover, the Special Issue also considers knowledge-based applications for deploying and managing next-generation wireless networks and cloud computing environments. Topics of interest include, but are not limited to:

  • Trends and challenges for wireless networks and edge computing;
  • Wireless networks for cloud and edge computing;
  • Data analysis in cloud and edge computing;
  • Resource management and task scheduling of cloud and edge computing;
  • Data allocation optimization in cloud and edge computing;
  • Channel-aware computation offloading for wireless edge computing;
  • Edge computing for content-centric wireless networking;
  • Edge computing for mobility management in wireless networks;
  • Network functions virtualization (NFV) and network “softwarization”;
  • Security and privacy-preserving techniques for cloud and edge computing;
  • Performance evaluation of edge computing platforms.

Dr. Diogo Mattos
Guest Editor

Dr. Dianne Medeiros
Co-Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Information is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Cloud computing
  • Wireless networking
  • Mobile edge computing

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (6 papers)


Research

25 pages, 3129 KiB  
Article
Edge and Fog Computing Business Value Streams through IoT Solutions: A Literature Review for Strategic Implementation
by Nikolaos-Alexandros Perifanis and Fotis Kitsios
Information 2022, 13(9), 427; https://doi.org/10.3390/info13090427 - 11 Sep 2022
Cited by 4 | Viewed by 4606
Abstract
Edge–fog computing and IoT have the ability to revolutionize businesses across all sectors and functions, from customer engagement to manufacturing, which is what makes them such fascinating, emerging technologies. Following the research methodology of Webster and Watson (2020), 124 peer-reviewed articles were reviewed. According to the literature, these technologies reduce latency, costs, bandwidth usage, and disruption while improving response time, compliance, security, and autonomy. The results of this review reveal the open issues and topics that call for further research so that edge–fog computing, together with IoT capabilities, can unveil new business value streams for organizations. Only by adopting and implementing precisely these revolutionary new solutions will organizations succeed in the digital transformation of the modern era. Although they are cutting-edge solutions for business operations and knowledge creation, there are still practical implementation issues to be dealt with, and the lack of experience in the strategic integration of the variable architectures hinders efforts to generate business value.

27 pages, 2313 KiB  
Article
RAMi: A New Real-Time Internet of Medical Things Architecture for Elderly Patient Monitoring
by Olivier Debauche, Jean Bertin Nkamla Penka, Saïd Mahmoudi, Xavier Lessage, Moad Hani, Pierre Manneback, Uriel Kanku Lufuluabu, Nicolas Bert, Dounia Messaoudi and Adriano Guttadauria
Information 2022, 13(9), 423; https://doi.org/10.3390/info13090423 - 7 Sep 2022
Cited by 22 | Viewed by 4828
Abstract
The aging of the world’s population, the willingness of the elderly to remain independent, and the recent COVID-19 pandemic have demonstrated the urgent need for home-based diagnostic and patient monitoring systems to reduce the financial and organizational burdens that impact healthcare organizations and professionals. The Internet of Medical Things (IoMT) comprises all medical devices and applications that connect to health information systems through online computer networks. The IoMT is one of the IoT domains where real-time data processing and reliability are crucial. In this paper, we propose RAMi, a Real-Time Architecture for the Monitoring of elderly patients through the Internet of Medical Things. This new architecture includes a Things layer, where data are retrieved from sensors or smartphones; a Fog layer built on a smart gateway; Mobile Edge Computing (MEC); a cloud component; blockchain; and Artificial Intelligence (AI) to address the specific problems of the IoMT. Data are processed at the Fog level, at the MEC, or in the cloud depending on the workload, resource requirements, and level of confidentiality. A local blockchain orchestrates workloads between Fog, MEC, and cloud, while a global blockchain secures exchanges and data sharing by means of smart contracts. Our architecture makes it possible to follow elderly persons and patients during and after their hospitalization. In addition, it allows federated learning to be used to train AI algorithms while respecting privacy and data confidentiality. AI is also used to detect intrusion patterns.
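To make the tiered processing concrete, the sketch below (Python) shows how a gateway might route an IoMT task to the Fog, MEC, or cloud tier according to workload, resource requirements, and confidentiality, as the abstract describes; the policy, thresholds, and names (Task, select_tier) are illustrative assumptions, not the RAMi implementation.

    # Minimal sketch (not the RAMi implementation): choosing a processing tier
    # for an IoMT task based on workload, resource needs, and confidentiality,
    # mirroring the Fog/MEC/cloud split described in the abstract.
    from dataclasses import dataclass

    @dataclass
    class Task:
        cpu_demand: float        # normalized CPU requirement (0..1), assumed scale
        confidential: bool       # True if data must stay close to the patient
        deadline_ms: float       # latency budget

    def select_tier(task: Task, fog_load: float, mec_load: float) -> str:
        """Return 'fog', 'mec', or 'cloud' for a task (illustrative policy only)."""
        if task.confidential or task.deadline_ms < 50:
            # Keep sensitive or very latency-critical work at the fog gateway if it fits.
            return "fog" if fog_load + task.cpu_demand <= 1.0 else "mec"
        if task.deadline_ms < 200 and mec_load + task.cpu_demand <= 1.0:
            return "mec"
        # Heavy, non-sensitive, delay-tolerant work goes to the cloud.
        return "cloud"

    print(select_tier(Task(cpu_demand=0.3, confidential=True, deadline_ms=30.0), 0.5, 0.2))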

15 pages, 503 KiB  
Article
A Private Strategy for Workload Forecasting on Large-Scale Wireless Networks
by Pedro Silveira Pisa, Bernardo Costa, Jéssica Alcântara Gonçalves, Dianne Scherly Varela de Medeiros and Diogo Menezes Ferrazani Mattos
Information 2021, 12(12), 488; https://doi.org/10.3390/info12120488 - 23 Nov 2021
Cited by 1 | Viewed by 1625
Abstract
The growing convergence of various services characterizes wireless access networks. Therefore, there is high demand for provisioning the spectrum to serve simultaneous users demanding high throughput rates. Load prediction at each access point is mandatory for allocating resources and assisting sophisticated network designs. However, the load at each access point varies according to the number of connected devices and traffic characteristics. In this paper, we propose a load estimation strategy based on a Markov chain to predict the number of devices connected to each access point on the wireless network, and we apply an unsupervised machine learning model to identify traffic profiles. The main goals are to determine traffic patterns and overload projections in the wireless network, efficiently scale the network, and provide a knowledge base for security tools. We evaluate the proposal on a large-scale university network with 670 access points spread over a wide area. The collected data are de-identified, and data processing occurs in the cloud. The evaluation results show that the proposal predicts the number of connected devices with 90% accuracy and discriminates five different user-traffic profiles in the load of the wireless network.
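As an illustration of the kind of prediction involved, the following minimal Python sketch fits a first-order Markov chain over binned device counts and forecasts the most likely next load level of an access point; the binning, Laplace smoothing, and function names are assumptions made for illustration, not the authors' implementation.

    # Minimal sketch (assumed details, not the paper's code): first-order
    # Markov-chain prediction of the number of devices at an access point.
    # Device counts are binned into discrete load states; the transition matrix
    # is estimated from history and the most likely next state is the forecast.
    import numpy as np

    def fit_transition_matrix(states, n_states):
        """Estimate P[i, j] = Pr(next = j | current = i) from a state sequence."""
        counts = np.ones((n_states, n_states))           # Laplace smoothing
        for cur, nxt in zip(states[:-1], states[1:]):
            counts[cur, nxt] += 1
        return counts / counts.sum(axis=1, keepdims=True)

    def predict_next(P, current_state):
        """Most likely next load state given the current one."""
        return int(np.argmax(P[current_state]))

    # Toy example: device counts binned into 4 load levels (0 = idle .. 3 = overloaded).
    history = [0, 1, 1, 2, 2, 3, 2, 1, 1, 0, 1, 2]
    P = fit_transition_matrix(history, n_states=4)
    print(predict_next(P, current_state=2))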

14 pages, 556 KiB  
Article
Towards Edge Computing Using Early-Exit Convolutional Neural Networks
by Roberto G. Pacheco, Kaylani Bochie, Mateus S. Gilbert, Rodrigo S. Couto and Miguel Elias M. Campista
Information 2021, 12(10), 431; https://doi.org/10.3390/info12100431 - 19 Oct 2021
Cited by 12 | Viewed by 2654
Abstract
In computer vision applications, mobile devices can transfer the inference of Convolutional Neural Networks (CNNs) to the cloud due to their computational restrictions. Nevertheless, besides increasing the network load toward the cloud, this approach can make applications that require low latency unfeasible. A possible solution is to use CNNs with early exits at the network edge. These CNNs can pre-classify part of the samples in the intermediate layers based on a confidence criterion. Hence, the device sends to the cloud only the samples that have not been satisfactorily classified. This work evaluates the performance of these CNNs at the computational edge, considering an object detection application. For this, we employ a MobileNetV2 with early exits. The experiments show that early classification can reduce the data load and the inference time without degrading application performance.
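The confidence criterion can be illustrated with a short sketch: the edge device keeps a sample if the early exit's softmax confidence reaches a threshold and offloads it to the cloud otherwise. The threshold value and function names below are assumptions for illustration, not the paper's code.

    # Minimal sketch (hypothetical, not the authors' code): the edge-side decision
    # used with an early-exit CNN. If the early exit's softmax confidence reaches
    # a threshold, the device classifies locally; otherwise the sample is sent to
    # the cloud, where the full model finishes the inference.
    import numpy as np

    CONFIDENCE_THRESHOLD = 0.8   # assumed value; tuned per application

    def softmax(logits):
        e = np.exp(logits - logits.max())
        return e / e.sum()

    def classify_at_edge(early_exit_logits, offload_to_cloud):
        """Return (label, where) using the early-exit confidence criterion."""
        probs = softmax(early_exit_logits)
        if probs.max() >= CONFIDENCE_THRESHOLD:
            return int(np.argmax(probs)), "edge"      # confident: stop early
        return offload_to_cloud(), "cloud"            # uncertain: send to the cloud

    # Toy usage: a confident early exit is resolved on the device itself.
    label, where = classify_at_edge(np.array([0.2, 4.0, 0.1]), offload_to_cloud=lambda: 1)
    print(label, where)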

22 pages, 7578 KiB  
Article
Missing Data Imputation in Internet of Things Gateways
by Cinthya M. França, Rodrigo S. Couto and Pedro B. Velloso
Information 2021, 12(10), 425; https://doi.org/10.3390/info12100425 - 17 Oct 2021
Cited by 11 | Viewed by 2939
Abstract
In an Internet of Things (IoT) environment, sensors collect and send data to application servers through IoT gateways. However, these data may have missing values due to networking problems or sensor malfunction, which reduces applications’ reliability. This work proposes a mechanism to predict and impute missing data in IoT gateways to achieve greater autonomy at the network edge. These gateways typically have limited computing resources; therefore, the missing data imputation methods must be simple and provide good results. Thus, this work presents two regression models based on neural networks to impute missing data in IoT gateways. In addition to the prediction quality, we analyzed both the execution time and the amount of memory used. We validated our models using six years of weather data from Rio de Janeiro, varying the missing data percentages. The results show that the neural network regression models outperform the other imputation methods analyzed, which are based on averages and on repeating previous values, for all missing data percentages. In addition, the neural network models have a short execution time and need less than 140 KiB of memory, which allows them to run on IoT gateways.
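For illustration, the sketch below trains a small neural-network regressor (scikit-learn's MLPRegressor) on synthetic weather-like data and uses it to impute a missing reading; the feature choice, network size, and data are assumptions and do not reproduce the paper's models or dataset.

    # Minimal sketch (illustrative only): imputing a missing sensor reading with a
    # small neural-network regressor, in the spirit of the gateway-side models the
    # paper describes. The model predicts temperature from the other weather
    # variables of the same record (an assumed setup, with synthetic data).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    # Synthetic "weather" data: humidity (%) and pressure (hPa) -> temperature (°C).
    X = rng.uniform([20.0, 1000.0], [90.0, 1030.0], size=(500, 2))
    y = 30.0 - 0.1 * X[:, 0] + 0.05 * (X[:, 1] - 1000.0) + rng.normal(0, 0.5, 500)

    # A small hidden layer keeps memory and execution time low, as gateways require.
    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    model.fit(X, y)

    # Impute a record whose temperature reading is missing.
    print(model.predict([[55.0, 1012.0]]))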

9 pages, 379 KiB  
Article
Decentralized Offloading Strategies Based on Reinforcement Learning for Multi-Access Edge Computing
by Chunyang Hu, Jingchen Li, Haobin Shi, Bin Ning and Qiong Gu
Information 2021, 12(9), 343; https://doi.org/10.3390/info12090343 - 25 Aug 2021
Cited by 2 | Viewed by 1887
Abstract
Researchers have used reinforcement learning techniques to learn offloading strategies for multi-access edge computing systems. However, large-scale systems are unsuitable for conventional reinforcement learning due to their huge state spaces and numbers of offloading actions. For this reason, this work introduces a centralized-training, decentralized-execution mechanism, designing a decentralized reinforcement learning model for multi-access edge computing systems. Considering a cloud server and several edge servers, we separate training and execution in the reinforcement learning model. Execution happens on the edge devices of the system, and the edge servers need no communication with each other. Conversely, the training process occurs on the cloud device, which results in lower transmission latency. The developed method uses a deep deterministic policy gradient algorithm to optimize the offloading strategies. The simulated experiments show that our method can learn the offloading strategy for each edge device efficiently.
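A highly simplified sketch of the centralized-training, decentralized-execution split follows: each edge server runs only a tiny local actor that maps its own state to a continuous offloading ratio, while the DDPG updates (omitted here) would run in the cloud and push fresh actor weights back. All names, state shapes, and the toy policy are assumptions, not the paper's implementation.

    # Highly simplified sketch (not the authors' code) of decentralized execution
    # with centralized training. Each edge server keeps a small actor mapping its
    # local state (e.g., queue length, channel gain) to an offloading ratio; the
    # cloud would hold the critic and run DDPG updates, which are omitted here.
    import numpy as np

    class EdgeActor:
        """Decentralized execution: a tiny deterministic policy run on an edge server."""
        def __init__(self, state_dim, seed=0):
            rng = np.random.default_rng(seed)
            self.w = rng.normal(0, 0.1, state_dim)
            self.b = 0.0

        def act(self, state):
            # Squash to (0, 1): fraction of the local workload offloaded elsewhere.
            return 1.0 / (1.0 + np.exp(-(self.w @ state + self.b)))

        def load_weights(self, w, b):
            # Called when the cloud trainer broadcasts updated actor parameters.
            self.w, self.b = w, b

    # Execution phase: each edge server decides independently, with no peer communication.
    actors = [EdgeActor(state_dim=2, seed=i) for i in range(3)]
    local_states = [np.array([0.7, 0.3]), np.array([0.2, 0.9]), np.array([0.5, 0.5])]
    print([round(a.act(s), 3) for a, s in zip(actors, local_states)])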
