Machine Learning in the Era of Computing and Network Integration

A special issue of Photonics (ISSN 2304-6732). This special issue belongs to the section "Optical Communication and Network".

Deadline for manuscript submissions: closed (30 November 2021) | Viewed by 6590

Special Issue Editors


Dr. Shuangyi Yan
Guest Editor
Smart Internet Lab, University of Bristol, Bristol BS8 1TL, UK
Interests: machine learning; dynamic optical networks; 5G and beyond; multi-dimensional networks; software-defined networks

Prof. Dr. Yongli Zhao
Guest Editor
State Key Laboratory of Information Photonics and Optical Communications, Beijing University of Posts and Telecommunications, Beijing 100876, China
Interests: cloud and edge networking; NFV; optical network security; F5G

Dr. Xiaoliang Chen
Guest Editor
University of California, Davis, Davis, CA 95616, USA
Interests: optical networks; software-defined networking; network resilience; optimization; machine learning for networking

Special Issue Information

Dear Colleagues,

Developing concepts in networking, such as software-defined networking (SDN), virtual network embedding, network function virtualization (NFV), and end-to-end orchestration, have significantly increased the complexity of network management, to a level far beyond what current network management tools can offer. These challenges are most apparent in 5G and 5G bearer networks. At the same time, computing capabilities have become critical in 5G architectures to enable low-latency communications and edge computing. Beyond network applications, network functions, such as the RU, DU, and CU in O-RAN-based 5G-RAN solutions, will run on available computing resources rather than on dedicated hardware. The deep integration of networking and computation has therefore become a common goal in supporting emerging internet applications and functions. The latest advancements in machine learning (ML) provide a plethora of tools to drive innovation in network management and operations. We believe that increasingly advanced ML technologies, together with the available computation resources, will be deployed in networks to revolutionize network management and operations. This Special Issue focuses on ML-enabled networking and computation, and aims to explore novel networking and computation resource provisioning, dynamic network operations, and self-managed networks to pave the way toward autonomous networks.

Dr. Shuangyi Yan
Prof. Dr. Yongli Zhao
Dr. Xiaoliang Chen
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Photonics is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • ML-aided network performance monitoring, telemetry services
  • Novel data plane technologies, network infrastructure designs
  • Network survivability, availability-aware service provisioning
  • Resource virtualization in cloud and network integration
  • Service orchestration and control architecture
  • SDN controller embedded with AI engine
  • Cloud computing and edge computing cooperation
  • AI-assisted network operation methods
  • AI-based failure diagnosis algorithms
  • Autonomous routing across multi-domain networks
  • Interoperability and optimization between computing resources and transport resources
  • Security in cloud and network integration

Published Papers (2 papers)


Research


15 pages, 3386 KiB  
Article
TDTS: Three-Dimensional Traffic Scheduling in Optical Fronthaul Networks with Conv-LSTM
by Bowen Bao, Zhen Xu, Chao Li, Zhengjie Sun, Sheng Liu and Yunbo Li
Photonics 2021, 8(10), 451; https://doi.org/10.3390/photonics8100451 - 18 Oct 2021
Cited by 2 | Viewed by 1612
Abstract
Given the increasingly intensive deployment of emerging Internet of Things applications with beyond-fifth-generation communication, the access network becomes bandwidth-hungry as it supports more kinds of services, demanding higher resource utilization in the optical fronthaul network. To enhance resource utilization, this study proposes a novel three-dimensional traffic scheduling (TDTS) scheme for the optical fronthaul network. Specifically, large volumes of mixed traffic with differing requirements are first divided according to three dimensions of each traffic request, i.e., arrival time, tolerable transmission delay, and bandwidth requirement, forming eight traffic model types. Historical traffic data with their division results are then fed into a convolutional long short-term memory (Conv-LSTM) model for traffic prediction, yielding a clear traffic pattern. Next, the traffic-processing order is determined by a priority evaluation factor that comprehensively measures link traffic status and network characteristics. Finally, following this priority, the proposed TDTS scheme assigns resources to traffic requests according to the results of traffic division, prediction, and processing order, using shortest-path routing and first-fit spectrum-allocation policies. Simulation results demonstrate that, given accurate traffic prediction, the proposed TDTS scheme outperforms conventional resource-allocation schemes in terms of blocking probability and resource utilization.
(This article belongs to the Special Issue Machine Learning in the Era of Computing and Network Integration)
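The final allocation step named in the abstract (shortest-path routing combined with first-fit spectrum allocation) can be sketched in a few lines. The topology, slot count, and helper names below are illustrative assumptions, not the authors' implementation:

```python
from collections import deque

# Illustrative sketch: shortest-path routing + first-fit spectrum
# allocation. Topology and slot count are assumptions for illustration.

NUM_SLOTS = 8  # spectrum slots per link

# Simple four-node topology as an adjacency list.
TOPOLOGY = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C"],
}

# Per-link occupancy: occupied[link][slot] is True if the slot is in use.
occupied = {}
for u, nbrs in TOPOLOGY.items():
    for v in nbrs:
        occupied[frozenset((u, v))] = [False] * NUM_SLOTS

def shortest_path(src, dst):
    """Breadth-first search: a shortest path in hops, or None."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in TOPOLOGY[node]:
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    return None

def first_fit(path, demand):
    """Lowest-indexed block of `demand` contiguous slots that is free on
    every link of `path`; allocates it and returns the start index, or
    returns None (the request would be blocked)."""
    links = [frozenset(pair) for pair in zip(path, path[1:])]
    for start in range(NUM_SLOTS - demand + 1):
        block = range(start, start + demand)
        if all(not occupied[link][s] for link in links for s in block):
            for link in links:
                for s in block:
                    occupied[link][s] = True
            return start
    return None

path = shortest_path("A", "D")    # a 2-hop route, e.g. ['A', 'B', 'D']
slot = first_fit(path, demand=2)  # lowest free contiguous block -> 0
```

A second 2-slot request on the same path would land at slot index 2, the next contiguous block free on all links, which is the spectrum-contiguity and continuity behavior the first-fit policy enforces.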

Review


9 pages, 849 KiB  
Review
Modeling EDFA Gain: Approaches and Challenges
by Yichen Liu, Xiaomin Liu, Lei Liu, Yihao Zhang, Meng Cai, Lilin Yi, Weisheng Hu and Qunbi Zhuge
Photonics 2021, 8(10), 417; https://doi.org/10.3390/photonics8100417 - 29 Sep 2021
Cited by 9 | Viewed by 3838
Abstract
With the rapid development of virtual/augmented reality and cloud services, the capacity demand on optical communication systems is ever-increasing. To further increase system capacity, current research focuses on efficient and reliable system management, in which transmission performance must be accurately estimated. The wavelength-dependent gain of erbium-doped fiber amplifiers (EDFAs) has a great impact on transmission performance, and a precise EDFA gain model is therefore required. In this paper, we first summarize the underlying principles and structures of EDFAs and introduce their gain performance and the challenges in modeling it. We then review EDFA gain-modeling methods, categorizing them into analytical and machine learning (ML)-based approaches and discussing their feasibility and performance. In addition, we discuss the remaining problems in applying these models in a system and possible directions for future investigation.
(This article belongs to the Special Issue Machine Learning in the Era of Computing and Network Integration)
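As a toy illustration of the data-driven modeling direction this review surveys, the sketch below fits a one-feature linear model of per-channel gain versus total input power. The synthetic data and linear form are assumptions for illustration only, far simpler than the analytical and ML-based models the paper discusses:

```python
# Toy sketch of data-driven EDFA gain modeling: a one-feature linear
# regression predicting channel gain (dB) from total input power (dBm).
# The data below are synthetic assumptions, not measurements.

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic "measurements": gain compresses as input power rises
# (20 dB gain at -30 dBm input, losing 0.5 dB per dB of added power).
powers = [-30.0, -25.0, -20.0, -15.0, -10.0]       # dBm, total input
gains = [20.0 - 0.5 * (p + 30.0) for p in powers]  # dB

slope, intercept = fit_line(powers, gains)

def predict(p):
    return slope * p + intercept
```

Real EDFA gain depends on the full channel-loading spectrum, not a single scalar, which is why the surveyed work moves from analytical models to neural networks with per-channel inputs; the sketch only shows the supervised fit-then-predict pattern those methods share.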
