Deep Learning for Next-Generation Wireless Networks

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Microwave and Wireless Communications".

Deadline for manuscript submissions: closed (31 July 2023) | Viewed by 9125

Special Issue Editors

Dr. Bouziane Brik
Guest Editor
DRIVE Laboratory, Bourgogne University, Nevers, France
Interests: B5G networks; network slicing; deep learning; federated deep learning

Dr. Junaid Ahmed Khan
Guest Editor
Department of Electrical and Computer Engineering, Western Washington University, Bellingham, WA 98225, USA
Interests: connected and autonomous vehicles; Internet of Things; information-centric networks

Special Issue Information

Dear Colleagues,

Deep learning (DL), encompassing deep supervised learning, deep unsupervised learning, and deep reinforcement learning, has become a key enabler of future wireless networks, including 5G and beyond (B5G) networks. DL techniques can not only exploit the massive quantities of data generated by these networks, but can also efficiently and effectively solve a broad range of problems (e.g., decision making, optimization, and prediction in B5G) without requiring explicit mathematical formulations. This capability helps accelerate the design and deployment of network functionality, especially in situations where mathematical models are either too complex to define or fail to properly formulate the problems at hand. However, applying DL techniques in wireless networks raises a series of challenges, such as data communication, model design, model training, model deployment, and security protection, which must be carefully addressed. Furthermore, newly emerging B5G scenarios such as massive machine-type communications (mMTC), extreme ultra-reliable low-latency communications (EURLLC), the Industrial Internet of Things (IIoT), unmanned aerial vehicles (UAVs), and joint communication and sensing (JCS) have brought new challenges and research opportunities to the design and optimization of B5G functional modules, and there remains a great deal of room for research exploring strategies for developing DL-based solutions that meet the stringent delay, reliability, and accuracy requirements of these scenarios.
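To make the "without explicit mathematical formulations" point concrete, below is a minimal sketch, illustrative only and not drawn from any solicited work, of a small deep model learning a B5G-style prediction task (next-slot per-cell traffic load) directly from logged data, with no analytic traffic model. All names, dimensions, and the synthetic data are assumptions.

```python
# Minimal sketch: a neural network learns to predict next-slot traffic load
# per cell from recent history, with no explicit traffic model.
import torch
import torch.nn as nn

class TrafficPredictor(nn.Module):
    def __init__(self, history_len=16, num_cells=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(history_len * num_cells, 64),
            nn.ReLU(),
            nn.Linear(64, num_cells),  # predicted load for each cell
        )

    def forward(self, x):
        return self.net(x.flatten(1))

# Train on synthetic data standing in for measurements a B5G network would log.
model = TrafficPredictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for step in range(200):
    history = torch.rand(32, 16, 8)                              # past load windows
    target = 0.9 * history[:, -1, :] + 0.1 * torch.rand(32, 8)   # next-slot load
    loss = loss_fn(model(history), target)
    opt.zero_grad(); loss.backward(); opt.step()
```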

For this Special Issue, we solicit original, high-quality papers in the field of DL-driven B5G networks. Potential topics include, but are not limited to, the following:

▪ DL solutions for medium access control layer functionality, including resource allocation, user association, mobility management, etc.

▪ DL for core network management in B5Gs, including network slicing, virtualization, service deployment/migration, etc.

▪ DL for emerging scenarios in B5Gs, including EURLLC, mMTC, UAV, IIoT, JCS, etc.

▪ DL for fog/edge/cloud computing in B5Gs.

▪ AI-powered energy-efficient network orchestration.

▪ Security and privacy for DL-driven wireless communications.

▪ Testbed, experimental evaluations, and real-world applications of DL techniques in wireless communications.

Dr. Bouziane Brik
Dr. Junaid Ahmed Khan
Prof. Dr. Guangjie Han
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • deep learning
  • 5G networks and beyond
  • IIoT
  • UAVs

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research

21 pages, 4627 KiB  
Article
A Novel Hybrid Artificial Bee Colony-Based Deep Convolutional Neural Network to Improve the Detection Performance of Backscatter Communication Systems
by Sina Aghakhani, Ata Larijani, Fatemeh Sadeghi, Diego Martín and Ali Ahmadi Shahrakht
Electronics 2023, 12(10), 2263; https://doi.org/10.3390/electronics12102263 - 16 May 2023
Cited by 22 | Viewed by 1751
Abstract
Backscatter communication (BC) is a promising technology for low-power and low-data-rate applications, though its signal detection performance is limited because the backscattered signal is usually much weaker than the original signal. When detection performance is poor, the backscatter device (BD) may not be able to accurately detect and interpret the incoming signal, leading to errors and degraded communication quality. This can result in data loss, slow data transfer rates, and reduced reliability of the communication link. This paper proposes a novel approach to improving the detection performance of backscatter communication systems using evolutionary deep learning. In particular, we focus on training deep convolutional neural networks (DCNNs) to improve the detection performance of BC. We first develop a novel hybrid algorithm based on artificial bee colony (ABC), biogeography-based optimization (BBO), and particle swarm optimization (PSO) to optimize the architecture of the DCNN, followed by training on a large set of benchmark datasets. To develop the hybrid ABC, the migration operator of BBO is used to improve exploitation, and moving towards the global best of PSO is proposed to improve the exploration of the ABC. We then take advantage of the proposed deep architecture to improve the bit-error-rate (BER) performance of the studied BC system. The simulation results demonstrate that the proposed algorithm performs best in training the benchmark datasets, and that the proposed approach significantly improves the detection performance of backscattered signals compared to existing works.
(This article belongs to the Special Issue Deep Learning for Next-Generation Wireless Networks)
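As a rough illustration of the hybrid update this abstract describes, the sketch below combines a standard ABC employed-bee move with a BBO-style migration step and a PSO-style pull toward the global best, on a toy sphere objective standing in for DCNN validation error. Every constant, the donor-selection rule, and the objective are simplifying assumptions, not the authors' algorithm.

```python
# Simplified one-population hybrid ABC/BBO/PSO loop on a toy objective.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Toy stand-in for "build a DCNN from x and return its validation error".
    return float(np.sum(x ** 2))

pop = rng.uniform(-5, 5, size=(20, 10))        # 20 food sources, 10 design vars
fitness = np.array([objective(p) for p in pop])

for _ in range(100):
    gbest = pop[np.argmin(fitness)].copy()
    for i in range(len(pop)):
        cand = pop[i].copy()
        k = rng.integers(len(pop))             # random partner (ABC)
        d = rng.integers(pop.shape[1])         # random dimension
        cand[d] += rng.uniform(-1, 1) * (pop[i, d] - pop[k, d])   # ABC move
        # BBO-style migration: immigrate one dimension from a solution
        # that is at least as fit as the current one.
        if rng.random() < 0.3:
            better = np.flatnonzero(fitness <= fitness[i])        # never empty
            cand[d] = pop[rng.choice(better), d]
        # PSO-style step toward the global best.
        cand += 0.1 * rng.uniform(0, 1, cand.shape) * (gbest - cand)
        f = objective(cand)
        if f < fitness[i]:                     # greedy selection (ABC)
            pop[i], fitness[i] = cand, f

print("best toy objective value:", fitness.min())
```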

19 pages, 1255 KiB  
Article
Machine Learning-Based Solutions for Handover Decisions in Non-Terrestrial Networks
by Mwamba Kasongo Dahouda, Sihwa Jin and Inwhee Joe
Electronics 2023, 12(8), 1759; https://doi.org/10.3390/electronics12081759 - 7 Apr 2023
Cited by 9 | Viewed by 3560
Abstract
A non-terrestrial network (NTN) uses radio frequency (RF) resources mounted on satellites and includes satellite-based communication networks, high-altitude platform systems (HAPS), and air-to-ground networks. Fifth-generation (5G) systems and NTNs may be crucial in providing 5G services anytime and anywhere in the future. Based on the outcome of the Rel-16 study, the 3rd Generation Partnership Project (3GPP) decided to start a work item on NTN in 5G New Radio (NR) Rel-17, focusing on the mobility management procedures necessitated by the movement of NTN platforms, especially low Earth orbit (LEO) satellites. Handover enhancements were discussed to tackle the frequent handovers caused by fast satellite movement. The major handover problem in LEO satellite systems is the signaling storm created by handing over all user equipment (UE) in a cell to a new cell: when a UE crosses the boundary between neighboring cells of a satellite, an intra-satellite (cell) handover occurs, so every user in a cell can expect to change cells every few seconds. In addition, UE location is not easy to determine because cells/beams are moving. In this study, we propose machine learning-based solutions for handover decisions in non-terrestrial networks, for cell or intra-satellite handovers, to reduce signaling storms by executing handover requests for clustered users. First, a dataset was generated by a simulator that models communication between users and satellites. Second, we preprocessed the data and used feature creation to derive a distance feature via the Haversine formula, and then applied clustering and classification algorithms. The experimental results show that the distance between a user and its cell center is an important parameter for handover decisions in NTNs, and that the random forest outperforms all other models with an accuracy of 99% and an F1-score of 0.9961.
(This article belongs to the Special Issue Deep Learning for Next-Generation Wireless Networks)
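The feature-creation step this abstract highlights is easy to sketch: compute the Haversine distance between a UE and its serving cell center and feed it, with other features, to a random-forest handover classifier. The column layout, the RSRP-like second feature, and the synthetic data below are assumptions, not the authors' dataset.

```python
# Haversine distance feature + random-forest handover classifier (sketch).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

rng = np.random.default_rng(0)
n = 2000
ue_lat, ue_lon = rng.uniform(-5, 5, n), rng.uniform(-5, 5, n)
cell_lat, cell_lon = rng.uniform(-5, 5, n), rng.uniform(-5, 5, n)
dist = haversine_km(ue_lat, ue_lon, cell_lat, cell_lon)

X = np.column_stack([dist, rng.normal(-90, 10, n)])   # [distance, RSRP-like]
y = (dist > np.median(dist)).astype(int)              # toy handover label

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```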

13 pages, 1096 KiB  
Article
Cost-Aware Bandits for Efficient Channel Selection in Hybrid Band Networks
by Sherief Hashima, Kohei Hatano, Mostafa M. Fouda, Zubair M. Fadlullah and Ehab Mahmoud Mohamed
Electronics 2022, 11(11), 1782; https://doi.org/10.3390/electronics11111782 - 3 Jun 2022
Cited by 5 | Viewed by 1806
Abstract
Recently, hybrid band communications have received much attention as a way to fulfil the exponentially growing user demands of next-generation communication networks. Still, determining the best band to communicate over is a challenging issue, especially under the dynamic channel conditions of multi-band wireless systems. In this paper, we develop a practical online-learning-based solution for best band/channel selection in hybrid radio frequency and visible light communication (RF/VLC) wireless systems. The best band selection problem is formulated as a multi-armed bandit (MAB) with a cost subsidy, in which the learner (transmitter) endeavors not only to increase its total reward (throughput) but also to reduce its cost (energy consumption). Consequently, we propose two hybrid band selection (HBS) algorithms, named cost-subsidy upper confidence bound (CSUCB-HBS) and cost-subsidy Thompson sampling (CSTS-HBS), to efficiently handle this problem and select the best band with high throughput and low energy consumption. Extensive simulations confirm that CSTS-/CSUCB-HBS outperform the naive TS/UCB and heuristic HBS approaches in terms of energy consumption, energy efficiency, throughput, and convergence speed.
(This article belongs to the Special Issue Deep Learning for Next-Generation Wireless Networks)
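A compressed sketch of the cost-subsidy idea behind a UCB-style band selector: among bands whose optimistic throughput estimate stays within a subsidy factor alpha of the best pessimistic estimate, play the one with the lowest observed energy cost. The band set, throughput/cost values, and the exact feasibility rule below are illustrative assumptions rather than the paper's specification.

```python
# Cost-subsidized UCB band selection on a toy RF/VLC setup (sketch).
import numpy as np

rng = np.random.default_rng(0)
true_tput = np.array([0.9, 0.8, 0.5])   # e.g., RF low band, RF mmWave, VLC
true_cost = np.array([0.7, 0.5, 0.2])   # relative energy consumption
alpha, K, T = 0.1, 3, 5000

pulls = np.zeros(K)
tput_sum = np.zeros(K)
cost_sum = np.zeros(K)

for t in range(1, T + 1):
    if t <= K:
        band = t - 1                     # play each band once to initialize
    else:
        bonus = np.sqrt(2 * np.log(t) / pulls)
        ucb = tput_sum / pulls + bonus   # optimistic throughput estimate
        lcb = tput_sum / pulls - bonus   # pessimistic throughput estimate
        feasible = ucb >= (1 - alpha) * lcb.max()      # "subsidized" band set
        cost_est = np.where(feasible, cost_sum / pulls, np.inf)
        band = int(np.argmin(cost_est))  # cheapest feasible band
    pulls[band] += 1
    tput_sum[band] += rng.binomial(1, true_tput[band])   # Bernoulli throughput
    cost_sum[band] += true_cost[band] + rng.normal(0, 0.05)

print("selection fractions:", pulls / T)
```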
