Article

Research Trends in the Use of Machine Learning Applied in Mobile Networks: A Bibliometric Approach and Research Agenda

by Vanessa García-Pineda 1, Alejandro Valencia-Arias 2,*, Juan Camilo Patiño-Vanegas 3, Juan José Flores Cueto 4, Diana Arango-Botero 3, Angel Marcelo Rojas Coronel 5 and Paula Andrea Rodríguez-Correa 6

1 Facultad de Ingeniería, Corporación Universitaria Americana, Medellin 055428, Colombia
2 Escuela de Ingeniería Industrial, Universidad Señor de Sipán, Chiclayo 14001, Peru
3 Facultad de Ciencias Económicas y Administrativas, Instituto Tecnológico Metropolitano, Medellin 050034, Colombia
4 Unidad de Virtualización Académica, Universidad de San Martin de Porres, Santa Anita 15011, Peru
5 Escuela de Ingeniería Mecánica, Universidad Señor de Sipán, Chiclayo 14001, Peru
6 Centro de Investigaciones, Institución Universitaria Escolme, Medellin 050012, Colombia
* Author to whom correspondence should be addressed.
Informatics 2023, 10(3), 73; https://doi.org/10.3390/informatics10030073
Submission received: 3 February 2023 / Revised: 19 May 2023 / Accepted: 7 June 2023 / Published: 9 September 2023
(This article belongs to the Section Machine Learning)

Abstract: This article examines research trends in the use of machine learning in the development of mobile networks. The methodological approach starts from an analysis of 260 academic documents selected from the Scopus and Web of Science databases and is based on the parameters of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. Quantity, quality and structure indicators are calculated in order to contextualize the documents’ thematic evolution. The results reveal that, among publications by country, the United States and China stand out; both compete for fifth generation (5G) network coverage and manufacture devices for mobile networks. Most of the research on the subject focuses on optimizing resources and traffic to guarantee the best management and availability of a network, given the high demand for resources and the greater amount of traffic generated by the many Internet of Things (IoT) devices being developed for the market. It is concluded that thematic trends focus on algorithms that recognize and learn from the data in the network and on trained models that draw from the available data to improve the experience of connecting to mobile networks.

1. Introduction

Network availability has become a central concern for individuals and organizations alike [1]. Mobile networks allow people to be constantly connected, regardless of where they are, especially outdoors. Despite current progress in hardware and software development, there are still shortcomings in the availability of mobile connectivity services [2]. Cellular telephone operators have not yet completed network expansion, and the antennas used in some places do not achieve enough range for total signal coverage [3]. In addition, future communication networks must address the scarcity of spectrum in order to adapt to the great growth in heterogeneous wireless devices. In this sense, efforts are being made to manage spectrum coexistence, improve knowledge of the spectrum, reinforce authentication schemes to improve spectrum monitoring and management and enable secure communications, among other goals [4].
Currently, software-defined networks (SDNs), network function virtualization (NFV) and cloud computing are receiving significant attention in fifth generation (5G) networks [5]. With the advancement of different optimization techniques, methods and tools (such as artificial intelligence, machine learning and big data), it is possible to improve the availability, quality and coverage of mobile networks, thereby facilitating better service provision to users of mobile networks [2]. Future smart wireless networks require an adaptive learning approach towards a shared learning model to enable collaboration between data generated by network elements and virtualized functions [6].
Additionally, 5G communication networks aim to provide a paradigm shift of the wireless spectrum at higher frequencies to meet the demands of large traffic volumes, extreme transmission speed, low traffic latency and massive connectivity; this paradigm shift places human society in a new service model that is prepared for the demands of the Internet of Things (IoT) and provides a better model of data processing through the use of edge servers to improve the protection of data privacy [7].
Given current advances and society’s great demand for connectivity, driven by factors such as the IoT, smart cities and the growth in wearable devices, networks must be better operated and managed through algorithms that learn from available data and measurements to optimize network performance [3]. This requirement can be met by implementing technologies such as AI, machine learning and big data; however, applying these approaches to planning, designing, managing and operating networks is still in its early stages, because existing network architectures are not adapted to the networks enabled by these technologies [8].
Based on the above, the main objective of this article is to examine machine-learning-based research trends in the development of mobile networks by performing a bibliometric analysis of the Scopus and Web of Science databases. A main conclusion is that the main trends in the use of Industry 4.0 technologies, such as machine learning, involve designing and developing algorithms that can learn from the data provided by network traffic and that allow current and next-generation networks to self-manage, so that they respond to the needs and demands of users and of the different types of devices connected to the network.
In addition, with the purpose of guiding the development of the research and achieving the stated objective, this study also outlines the following research questions:
RQ1: In which years was there the greatest interest in mobile networks and machine learning?
RQ2: What are the main research references in the scientific literature?
RQ3: How has the literature on mobile networks and machine learning conceptually evolved?
RQ4: What are the main growing and emerging themes derived from the scientific production on mobile networks and machine learning?
RQ5: What elements should a researcher include in their future work on mobile networks and machine learning?
In this regard, this research is composed of the introductory section above, which explains in detail the conceptualization and importance of the topic, as well as the objective and the respective research questions. Then, the Materials and Methods section will detail the necessary procedures to achieve the results that support the investigation. Subsequently, the bibliometric results of the research are developed, as well as a thematic discussion and research trends. The final section outlines the conclusions reached in this article.
For the reader’s ease, the definitions of the abbreviations used in the text are presented in Table 1.

2. Materials and Methods

In accordance with the objective of the research, a methodological design is proposed based on the minimum parameters established by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement for literature reviews. This design enables a bibliometric analysis that allows, in accordance with [9], the mapping of the body of scientific literature currently available in the databases. This is carried out to identify, on the one hand, current research trends around the proposed topic and, on the other hand, the future directions that can be discerned, through both statistical analysis and positional keyword analysis of the main metadata of scientific activity. Regarding the execution of the methodological design, the studies by [10,11] are followed; the items or detailed parameters established by the PRISMA statement are related and specified in the following subsections.

2.1. Eligibility Criteria

The eligibility criteria, as evidenced in [10], specify and detail the characteristics that the studies analyzed in the literature review process must meet. Therefore, for the methodological design used in this study, the following inclusion and exclusion criteria were established.

2.1.1. Inclusion Criteria

As stipulated in the inclusion criteria for this literature review, the research to be analyzed must contain, in its titles and keywords (the primary bibliographic metadata), the different combinations of terms by which mobile connection networks are known, combined with the subject of machine learning. This inclusive combination is expressed with the Boolean operator “AND”.
Likewise, to further delimit the scope of the review, inclusion criteria are established regarding the type of publication, the publication status and metadata registration, as well as the rigor of each study; specifically, only journal articles or book chapters that contain complete metadata are included, allowing a holistic analysis of the information.

2.1.2. Exclusion Criteria

In a complementary way, there are exclusion criteria that, in accordance with the reference items of the PRISMA statement, are applied in different phases. The first is known as screening, whereby all bibliographic records that do not correspond to the formats stipulated by the inclusion criteria are excluded; conference proceedings are generally excluded at this stage. Likewise, research that contains incomplete metadata is eliminated, as are all other papers that do not contain defined evaluation criteria, so that the results can guarantee scientific rigor. A second exclusion phase, called eligibility, then eliminates papers that, although not excluded in the first phase, do not involve relevant elements of analysis based on the thematic approach of the research.

2.2. Information Sources

After the eligibility criteria of the literature review are defined, in accordance with the protocol established by the PRISMA guidelines, the sources of information for the methodological design are established. Since all literature reviews are based on secondary sources of information and the scope of this research is a review of scientific literature, the two main current academic and scientific databases, Web of Science and Scopus, are chosen as sources of information. They are complete, robust, detailed and rigorous suppliers of metadata on publications and scientific activity from institutions, and their interfaces improve the performance of the review processes [12].
However, as demonstrated in [10], establishing the sources of information also involves delimiting the time period used in the search for scientific information. The present review covers the period from the first articles available in each database up to the present; therefore, although no fixed interval is imposed, the evolution of the international body of literary and scientific research on the use of machine learning for connecting to mobile networks can be traced.

2.3. Search Strategy

To improve the rigor, detail and replicability of the methodological design, the PRISMA guidelines suggest specifying the search strategy executed in each of the information sources selected for the research. The strategy responds to two interconnected elements: the interface of the information source and the inclusion criteria described above. Since the relevant databases are international, their interfaces require metadata searches in English. The following specialized search equations were therefore designed; they include all the inclusion criteria described as well as the search syntax inherent to each database:
Web of Science Search:
((TI = ({Mobile Telecommunication System} OR {Mobile Network} OR {5G Mobile Communication System} OR {Cellular Network} OR {5G mobile} OR {6G mobile}) AND TI = ({Machine learning}))) OR ((AK = ({Mobile Telecommunication System} OR {Mobile Network} OR {5G Mobile Communication System} OR {Cellular Network} OR {5G mobile} OR {6G mobile}) AND AK = ({Machine learning})))
Scopus Search:
((TITLE ({Mobile Telecommunication System} OR {Mobile Network} OR {5G Mobile Communication System} OR {Cellular Network} OR {5G mobile} OR {6G mobile}) AND TITLE ({Machine learning}))) OR ((KEY ({Mobile Telecommunication System} OR {Mobile Network} OR {5G Mobile Communication System} OR {Cellular Network} OR {5G mobile} OR {6G mobile}) AND KEY ({Machine learning})))

2.4. Study Record

After the specific search strategy was applied within each of the databases selected as sources of information, 1871 documents published during the time interval of 2006 to 2022 were obtained. Of these documents, 1468 were from the Scopus database, and the remaining 403 were from the Web of Science database. In accordance with the PRISMA guidelines, [10,11] suggest providing a detailed outline of how these documents were used; this is described below.

2.4.1. Data Management

The bibliographic data obtained from the selected information sources were managed with Microsoft Excel® and the free software VOSviewer (version 1.6.19), in which all the metadata obtained were stored and processed. This treatment involved homogenizing the information: the records derived from both databases were standardized into the same format, starting with the elimination of duplicate records and the consolidation of a unified database.
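As an illustration, the deduplication step just described can be sketched as follows. The field names and example records are hypothetical; the study itself performed this step with Microsoft Excel and VOSviewer, not with a script.

```python
# Sketch of the record-homogenization step: each record is reduced to a
# normalized key (DOI if present, otherwise the lowercased title) so that
# duplicates across the Scopus and Web of Science exports collapse.

def normalize_key(record):
    """Build a deduplication key from a bibliographic record."""
    doi = (record.get("doi") or "").strip().lower()
    if doi:
        return ("doi", doi)
    # Collapse whitespace and case so near-identical titles match.
    title = " ".join((record.get("title") or "").lower().split())
    return ("title", title)

def merge_databases(scopus, wos):
    """Merge two record lists, keeping the first occurrence of each key."""
    merged, seen = [], set()
    for record in scopus + wos:
        key = normalize_key(record)
        if key not in seen:
            seen.add(key)
            merged.append(record)
    return merged

scopus = [{"title": "ML for 5G", "doi": "10.1/abc"},
          {"title": "Edge Computing Survey", "doi": ""}]
wos = [{"title": "ml for 5g", "doi": "10.1/ABC"},        # duplicate by DOI
       {"title": "Edge  Computing Survey", "doi": ""}]   # duplicate by title

unified = merge_databases(scopus, wos)
print(len(unified))  # 2 unique records remain
```

In practice, matching on DOI first and falling back to a normalized title mirrors the manual standardization described above, since the two databases export the same article with slightly different formatting.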

2.4.2. Selection Process

Once the information was stored in the aforementioned tools and the process of eliminating duplicate records was carried out, a unified database was designed, which was provided to two of the authors of this research, who independently reviewed the application of the exclusion criteria established at the beginning of the methodological design. This independent analysis guarantees a reduction in the bias that can be derived from the selective analysis of information [10,11]. Finally, to increase the level of detail and replicability of the methodological design, we provide Figure 1, which summarizes all the steps described by the PRISMA statement for literature reviews.

3. Results

Based on the results obtained, the data from the different publications were analyzed, starting with the number of publications per year. Figure 2 shows how the number of publications began to increase in 2017, maintaining this trend in 2018, before experiencing a significant increase in both 2019 and 2020. In 2019, among the published articles with the most citations, the work of [13], with 664 citations, is the most cited paper on the subject of mobile networks and machine learning. In their work, the authors perform a survey in which they examine deep learning and wireless mobile networks in order to study the degree of acceptance of said technologies. With 315 citations, a proposal of an antenna arrangement modulating frequencies by applying MIMO (multiple inputs, multiple outputs) for mobile networks [14] had the next highest number of citations.
In 2020, the publication with the highest number of citations (272) was a survey on multi-access edge computing in 5G, covering the technology used, the fundamentals and basic concepts and the integration of different technologies [15]. The research with the second-highest number of citations (160) was a study that analyzed the challenges the industry faces regarding the advancement of deep-learning-based technologies for 6G networks [16]. In 2021, an article with 206 citations was published that similarly analyzed the path toward sixth generation (6G) networks in the future [17]. In 2022, one of the most cited articles, with 37 citations, focused on the development of a scheme for reversible data hiding in 5G systems using deep learning techniques [18].
Regarding the main journals, Figure 3 shows that IEEE Access has the most publications, accounting for 38% of the total publications on the subject of machine learning and mobile networks, with 98 publications. It is also positioned as the journal with the highest scientific impact in terms of citations, with more than 2190 citations of its scientific publications. Its most cited publications are [13,19]; the latter, with 650 citations, is a survey analyzing the applications of deep reinforcement learning for mobile communication networks.
In addition, another of the main journals in this field is IEEE Communications Surveys and Tutorials. Although it is not a leader in scientific productivity, with only 11 publications associated with the subject, these are highly relevant to the field, judging from their total number of citations. Among this journal’s main contributions to the scientific body is a study that delves into the existing knowledge concerning the relationship between deep learning and mobile wireless networks, concluding with the best adoption techniques [13].
Regarding the countries contributing most to the subject of machine learning in mobile networks, Figure 4 shows that the United States has the highest number of publications, with 60 publications and more than 2300 citations, making it the main country for scientific publications on the subject. Among the most relevant articles published in the United States are [14,20]; the latter examined the industrial internet using a cyber-physical systems approach. The next most prolific country is the United Kingdom, with 1572 citations across 38 publications, including the two most-cited UK articles, which discuss deep learning in mobile and wireless networks through surveys [13,15].
In third place is China, with 31 publications. One of these is among the most cited publications thus far, with 650 citations; it analyzes various applications of deep reinforcement learning in communication systems and networks [19]. Another work, with 180 citations thus far, examines the use of blockchain and deep reinforcement learning to enhance 5G smart grids [21]. Spain is the next country with the most publications, 21 in total, with more than 360 citations; the most cited of these is a review of the transition from 4G to 5G and the application of machine learning in mobile networks [22].
In addition, the present bibliometric analysis examines the institutions that have led scientific research regarding the use of machine learning in mobile networks, as shown in Figure 5, evaluating scientific productivity as well as academic impact. Here, it can be observed that the University of Oulu stands out, ranking as the institution that publishes the most in this field, as well as being the institution with the third most cited publications.
Important research studies have emerged from this institution, exploring the foundations of three-dimensional wireless cellular networks in unmanned aerial vehicles [23], addressing the management of large volumes of data in the telecommunications context, proposing perspectives of efficiency and performance in mobile networks [24], as well as other studies that have focused on the characteristics of THz wireless systems, which are essential for the development and understanding of wireless systems [25].
Next is the Imperial College London, which, although not prominent in terms of scientific productivity, has had a high impact through the number of citations, making its contributions relevant to the scientific literature on the subject. This institution has delved into aspects such as the use of deep learning in wireless networks [13], as well as the use of deep learning for the management and orchestration of virtual network functions [13].
Likewise, the Beijing University of Posts and Telecommunications stands out among the main institutions publishing scientific literature on this subject [26]. Its significant scientific productivity demonstrates its commitment to knowledge generation, even though it is currently not among the institutions with the highest impact in terms of citations.
This institution has produced different research approaches, ranging from the study of algorithms based on a scalable Gaussian process for wireless traffic prediction [27], to the evolution of non-orthogonal multiple access techniques enabling the discussion of 6G technology [28], to machine-learning-based approaches for the flexible scheduling of transmission time intervals in the coexistence of eMBB and uRLLC, one of the main challenges in managing mobile network resources [29].
Figure 6 presents the main authors publishing on this subject. The most cited researcher is Debbah, M., with 674 citations in total; this author is also the most scientifically productive, with eight publications concerning machine learning and mobile networks. His most cited work, with 239 citations, is “Wireless Network Intelligence at the Edge”, in which the authors explore the main components of edge machine learning and propose different divisions of neural networks for intelligent wireless networks [30]. A co-author of this article, Bennis, M., is one of the other most prolific authors on this subject, with six publications and 650 citations in total. Haddadi, H.; Patras, P.; and Zhang, C. are the next most cited authors, with 664 citations for their work entitled “Deep Learning in Mobile and Wireless Networking: A Survey” [13]. Chan, K.; He, T.; Leung, K.K.; Makaya, C.; and Salonidis, T. also research networking and are the next most cited authors, with 518 citations; their most cited work is “Enabling Massive IoT Towards 6G: A Comprehensive Survey” [31].
Regarding keyword patterns, Figure 7 shows the validity and frequency of the keywords according to their recurrence by year. The most frequent and current terms appear in Quadrant 1 of Figure 7. Three terms are associated with the advancement of 5G and 6G networks, because the accelerated growth of mobile networks has allowed the generation of new versions and modifications for the connectivity of mobile devices in cellular networks, in addition to advances that increase both the supply of and demand for connectivity resources [32]. Research on 5G networks has focused on optimizing the resources of the network, mainly to guarantee availability and quality in the provision of services for IoT devices [33], in addition to the correct addressing for transmitting and receiving a signal. These studies on the improvement of 5G networks used the terms “artificial intelligence”, “Internet of Things” and “deep learning”, which are also the next three most common terms in this quadrant.
In artificial intelligence, the different techniques of this technology focus on improving the management of mobile networks by defining characteristics for 5G network software with efficient, agile and, above all, autonomous and cognitive management [34]. For example, one application of artificial intelligence (AI) and machine learning (ML) is the construction of self-operating communication systems [35]. The application of this technology has generated important network paradigms, such as heterogeneous networks (HetNets) [36], ultra-dense networks and radio access networks (RAN), and technologies such as terahertz (THz) wireless links and reconfigurable intelligent surfaces (RIS) [37], applied to spectrum management [38], user association, routing optimization, channel estimation, equalization and energy-efficient network management and security [39]. Another application of AI in mobile networks is the design of hierarchical incentive mechanisms for federated machine learning, mainly implemented with an IoT approach designed for mobile crowd-detection applications in relation to the reception and connectivity that IoT devices require [40]. Thus, the recent advances achieved through the implementation of AI and ML in different communication systems have opened the door for these two technologies to become fundamental parts of the design of 5G and 6G mobile network systems [41,42], as well as of the integration of MIMO into signal processing, whose capabilities are improved by facilitating the additional functions required by the demand in mobile networks [43].
Deep learning (DL), in turn, is a technique for implementing ML through the use of multilayer artificial neural networks (ANNs) [44]. This technique has enabled research in which DL captures network traffic data and thus allows the intelligent control of network traffic [45,46], applying techniques such as variations of neural network algorithms or LSTM (long short-term memory) networks [47]. Deep learning applications in mobile networks are broad, ranging from monitoring and quality improvement to management and design. One recently studied application is the use of semi-supervised machine learning to detect anomalies in cellular networks [48]. AI-based computer vision has also been studied using DL in 6G wireless networks, where DL algorithms solve different image and object recognition problems [49]. Thus, the use of DL in mobile networks has been considered a possible solution for high data volumes, traffic management and troubleshooting in mobile networks [13].
The term IoT, meanwhile, refers to new proposals that allow connection and can respond to the needs of IoT devices [30]. The other two terms, wireless communication and optimization, refer, respectively, to the type of communication involved and to the machine learning code built to improve the availability of the network, given that most of this technology and most IoT devices demand high performance and low latency [50]. IoT denotes the possibility of connecting different objects, such as equipment with sensors and cards for processing, storage and network capability, which allows them to be connected to the network [51].
The second quadrant contains the less frequent but more current terms, indicating that they emerged in recent years. These include Beyond 5G (B5G), which designates networks that go beyond the characteristics of 5G networks; that is, networks that involve more than just communication and data transmission [52]. With 224 citations, one of the most cited works with Beyond 5G among its keywords is [53], which presents a new 3D cellular architecture specifically for unmanned aerial vehicles (UAVs), or drones. This architecture is based on a truncated octahedron cell structure and the spatial distribution of the equipment, minimizing latency through frequency reuse and thus improving the efficiency of drones connected through wireless mobile networks. The authors of [23] analyze the technologies and methodologies proposed in the literature on integrating non-orthogonal multiple access schemes in the power domain [54]; this integration connects multiple users in the same resource block by multiplexing in frequency or time, facilitated by technologies such as machine learning, and is expected to respond to the demands of B5G networks [55].
Another term found in this quadrant is edge computing, which concerns designing a network so that central network resources can be transferred to the perimeter in order to obtain greater computing capacity, allowing better management of IoT devices that may require greater capacity and availability of resources [23]. With this type of computing, latency can be reduced by allowing certain tasks to be downloaded and executed either locally or remotely, depending on the availability of the network or its resources [56]. Regarding the emerging LSTM systems, there has also been work on optimizing the short-term memory of devices to improve the quality of service [57]. The next term is federated learning (FL), a machine learning technique that trains models collaboratively in a decentralized way in order to build models that maintain data privacy [58]. FL algorithms allow devices to learn a model from the data provided by an edge server [51]. The devices connected to this server can be located in the same geographical area or distributed across different areas, and the FL model takes the local data provided and thus provides feedback to the devices in the different areas [59].
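The decentralized training loop described above can be sketched with a minimal FedAvg-style example. The linear model, learning rate and device datasets are illustrative assumptions, not the implementations of the cited papers; the point is that only model parameters, never raw data, leave each device.

```python
# Minimal federated-averaging sketch: each device fits y = w*x on its own
# private data, and an edge server only averages the resulting weights.

def local_update(weights, data, lr=0.01, epochs=5):
    """One device: full-batch gradient steps for y = w*x on private data."""
    w = weights
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(global_w, device_datasets):
    """Edge server: average the locally updated weights (FedAvg step)."""
    updates = [local_update(global_w, data) for data in device_datasets]
    return sum(updates) / len(updates)

# Two devices whose private data both follow y = 3x; the data never mix.
devices = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
w = 0.0
for _ in range(20):
    w = federated_round(w, devices)
print(round(w, 2))  # 3.0 -- the shared model recovers the common slope
```

Here the averaged weight converges to the slope common to both devices, illustrating how the edge server builds a shared model while each device’s data stay local.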
Continuing with quadrant three, which contains the less frequent and less current terms, there are terms such as “neural networks”. Neural networks are inspired by biology and allow machines to learn from observed data [57]. They have a great capacity to learn and solve different problems in signal processing and wireless communication [59]. They consist of an input layer, one or more hidden layers and an output layer, where each layer can contain one or more neurons [57]. Each neuron has an activation function and several links connecting it with neurons in other layers [59]. The network searches for the set of weights that minimizes the error between the hypothesis function and the labels of the given dataset through forward and backward propagation. This technique can be used to construct algorithms for the efficient management of networks, mainly regarding traffic [60]. For example, one technique used in deep learning algorithms is the recurrent neural network (RNN), applied to the dynamic distribution of users with respect to traffic, spatially distributing user connections based on data obtained from call logs [61]. Another example is the feedforward neural network (FFNN), which can be implemented using indicators for managing 5G network traffic, such as query success rate, propagation delay, overall discarded packets, power consumption, bandwidth usage, latency rate and network performance [62]. This quadrant also contains the term “energy efficiency”, which refers to the efficient energy use of cellular networks, for example, turning off some carrier bands when network traffic is minimal [63].
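The forward- and backward-propagation loop described above can be sketched with a toy one-hidden-layer network. The architecture, target function and hyperparameters are purely illustrative (they are not from the cited works); the sketch only shows how the weight search reduces the error between the network output and the labels.

```python
# Toy network: input -> 2 tanh hidden neurons -> linear output, trained by
# gradient descent to fit y = 2x. Forward pass computes activations; the
# backward pass propagates the output error to both weight layers.
import math, random

random.seed(0)
W1 = [random.uniform(-1, 1) for _ in range(2)]  # input -> hidden weights
W2 = [random.uniform(-1, 1) for _ in range(2)]  # hidden -> output weights

def forward(x):
    h = [math.tanh(w * x) for w in W1]          # hidden activations
    y = sum(w2 * a for w2, a in zip(W2, h))     # linear output neuron
    return h, y

def train_step(x, target, lr=0.05):
    h, y = forward(x)
    err = y - target                            # dLoss/dy for L = err^2/2
    for i in range(2):                          # backward propagation
        g1 = err * W2[i] * (1 - h[i] ** 2) * x  # chain rule through tanh
        W2[i] -= lr * err * h[i]
        W1[i] -= lr * g1

data = [(x / 10, 2 * x / 10) for x in range(-5, 6)]
loss_before = sum((forward(x)[1] - t) ** 2 for x, t in data)
for _ in range(500):
    for x, t in data:
        train_step(x, t)
loss_after = sum((forward(x)[1] - t) ** 2 for x, t in data)
print(loss_after < loss_before)  # True: the error shrinks as weights fit
```

The same error-minimization principle, scaled up to many layers and neurons, underlies the RNN and FFNN traffic-management applications cited above.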
Another term is “mmWave”, which refers to millimeter wave communications, one of the main characteristics of 5G networks; it concerns positioning the beams so as to achieve better gain, thus balancing the high propagation loss and compensating for the penetration loss of the millimeter wave bands while minimizing transmission time and energy consumption [64]. The term “deep reinforcement learning” is also found in this quadrant; combined with dynamic reservation techniques, deep reinforcement learning allows, in the case of mobile networks, the autonomous management of network resources, oriented according to their requirements and based, for example, on the type of application that demands the availability of resources [65]. The next term is “big data”, a technology that has been transformative in recent developments and research on the latest generations of mobile networks, given that the technologies and techniques that have recently been implemented, developed and studied depend on the data that are shared and travel through the network, i.e., all the traffic that traverses it [66]. Big data has also been used to capture network traffic data for the marketing purposes of the operators of different cellular networks [67]. The data processed through big data lead to the next term, “resource allocation”, which is about seeking the best storage management. For example, proactive caching is concerned with storing data from mobile network traffic, from which only specific data can be selected; this provides information on connectivity and user resource demand, thus optimizing storage space [68].
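Popularity-based proactive caching, as described above, can be sketched in a few lines: count how often each content item appears in past traffic and pre-store only the most requested ones. The request trace and cache capacity below are hypothetical:

```python
from collections import Counter

def proactive_cache(traffic_log, capacity):
    """Select the most-requested content items from past network traffic
    so they can be stored ahead of demand (popularity-based caching)."""
    popularity = Counter(traffic_log)
    return {item for item, _ in popularity.most_common(capacity)}

# Hypothetical request trace collected from mobile network traffic
log = ["video_a", "video_a", "song_b", "video_a", "map_c", "song_b", "doc_d"]
cache = proactive_cache(log, capacity=2)
print(sorted(cache))  # ['song_b', 'video_a']

# Hit rate if the same demand pattern repeats
hits = sum(1 for req in log if req in cache)
print(f"hit rate: {hits / len(log):.2f}")  # 0.71
```

Only two of four distinct items are stored, yet they cover most of the observed demand, which is the storage-optimization effect the text describes.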
In the last quadrant, which comprises the most frequent but least current terms, the term “reinforcement learning” can be found, which is derived from one of the deep learning techniques already explained, as well as network slicing. Reinforcement learning is usually used in communication networks for authenticating and validating devices in the network. These learning algorithms are parametric, nonparametric, supervised or unsupervised, or involve reinforcement learning to facilitate intelligent authentication and make the connection of devices in the network more reliable and cost-effective [68]. Additionally, reinforcement learning has been applied to the intelligent allocation of resources in networks [69]. Combined with network slicing, these algorithms can automatically learn patterns in users’ connection requests so that frequently requested routes are detected and users can be intelligently distributed along the connection routes, thus achieving greater efficiency in allocating network resources [70].
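A minimal sketch of how a reinforcement learning agent can discover the better of two connection routes from reward feedback alone; the two-route environment and its success probabilities are invented for illustration, and the single-state (bandit-style) update is a simplification of the full RL setting used in the cited work:

```python
import random

random.seed(0)

# Toy setting: two connection routes; route 1 succeeds more often,
# and the agent must learn this from reward feedback alone.
SUCCESS_PROB = [0.3, 0.9]   # hypothetical per-route success rates
q = [0.0, 0.0]              # estimated value of each route (one state)
alpha, epsilon = 0.1, 0.1

for _ in range(2000):
    # epsilon-greedy: mostly exploit the best-known route, sometimes explore
    if random.random() < epsilon:
        a = random.randrange(2)
    else:
        a = max(range(2), key=lambda i: q[i])
    reward = 1.0 if random.random() < SUCCESS_PROB[a] else 0.0
    q[a] += alpha * (reward - q[a])  # incremental value update

print([round(v, 1) for v in q])  # q[1] > q[0]: route 1 is preferred
```

After training, new connection requests would be steered toward the route with the higher learned value, which is the pattern-learning behavior described above.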
However, the main application explored in research on reinforcement learning is assisted caching for heterogeneous mobile networks [71]. The term “network slicing”, which lies between quadrants three and four, refers to network segmentation to address different needs in administrating and orchestrating network services and applications [72]. Based on the above, network slicing can be understood as a technology that allows the wide demand for services in mobile networks to be met over SDNs running on a shared network infrastructure [73]. In this way, one of the main approaches of network slicing is identifying the connected applications in order to perform a specific segmentation of the network according to each application [74]. This approach is supported by reinforcement learning.
As can be observed in the relationship between the key terms and their recurrence in research on mobile networks and machine learning techniques, most studies focus on the optimization of resources and traffic to guarantee the best management and availability of the network, due to the high demand for resources and the greater amount of traffic generated by the many IoT devices being developed for the market. Figure 8 shows how the aforementioned terms interact; that is, a network of the key terms used in the different studies has been developed. The main term and central nexus of the network is “fifth generation mobile communications”, or 5G networks; however, 6G networks are already being specifically discussed, even though the transition from 4G (LTE) networks is still underway and work continues on the automation of 5G networks [75]. Thus, machine learning algorithms have been implemented to reduce latency in the network, and these algorithms for 5G mainly use different deep learning techniques for network optimization, as seen in the networks formed [76].
The above network of keywords is concerned with improving the quality of service of mobile networks through segmentation via network slicing, using deep reinforcement learning (DRL) algorithms for the allocation and division of space in the network according to the consumption of resources or the needs of the users [69]. Then, there is a network consisting mainly of the terms “deep reinforcement learning”, “resource allocation” and “massive MIMO”. These terms are connected due to recent research focused on using deep reinforcement learning algorithms to improve the allocation of resources in the network and thus guarantee a better spatial distribution of users; this line of study is supported by one of the techniques used to modulate the signal and thus appropriately assign its characteristics according to need [77]. Additionally, as revealed in the quadrants of the previous figure, the use of artificial intelligence in the latest-generation networks is aimed mainly at guaranteeing the adequate availability of resources in the face of the massive number of IoT devices that demand considerable resources [78]. Finally, there are the terms “anomalies”, “computer architecture” and “wireless communications”, which correspond to research proposing a framework of solutions for radio communication access [79].
As for the most used keywords per year, as Figure 9 reveals, starting with the year 2006, the most used keyword was “UMTS”, which refers to the third generation (3G) of mobile networks, specifically the universal mobile telecommunications system, one of the technologies most used by mobile devices from approximately the year 2000 onward [80]. At that time, studies were carried out using fuzzy learning for the better management of the mobile network [81]. Then, in 2012, the most commonly used keyword was “SPIM”, which stands for spam over instant messaging; the studies carried out focused on how to characterize this type of message and mitigate server overload [82]. In 2014, “web browsing” was the most used keyword, with research efforts focused on modeling the quality of the browsing experience on the cellular network [83].
For 2015, one of the terms used was “proactive caching”, and the related research focused on the use of big data to improve proactive caching and thus the hosting of resources [84]. In 2016, “cellular networks” emerged as the most commonly used key term, with studies focusing mainly on solutions for network traffic management [63] and methods for dynamic network configuration [85]. However, some studies also emerged that were already beginning to discuss 5G networks [86]. Building on the research carried out in 2016, from 2017 to 2022, “5G” emerged as the most used keyword, as scholars began to discuss the virtualization of networks for their more autonomous management [87] and cognitive-radio 5G networks [88]. In 2019, research on 5G began to focus on the use of deep learning algorithms for applications that allowed better quality, mainly a reduction in network latency [89]. In 2020, studies continued to focus on improving deep learning techniques to improve the use of resources, and these studies began to discuss energy efficiency [90]. By 2021, 5G research had begun to discuss the next generation of computing [91] and how mobile networks should be prepared for it, along with networks for the IoT and the great demand for network resources [92]. Finally, for the year 2023, in the studies carried out thus far, the most used keyword has been “password security”, involving authentication systems for network security [93].

4. Discussion

The growing interest in the subject led to increased publications during 2019 and 2020, and the increase in research on how to improve the availability of resources when many devices are connected to the same network [71] was possibly due to the pandemic caused by the novel coronavirus disease 2019 (COVID-19). During this time, smart devices became an integral part of human life, thus requiring highly available and scalable networks that demand high-speed, real-time responses [3].
Regarding key terms, recent research has focused on generating algorithms for recognizing and learning the data in the network [6] and on trained models that start from the available data in order to improve connections to the mobile network, thus improving quality and latency and allowing increasingly faster responses from the network [13]. Moreover, one of the great advantages of this technology is that it provides a set of concepts for automated network management, not only to improve the quality of service but also to reduce the network management burden on administrators [94].
It is essential to mention that the present literature review extracted information through the export processes provided by the Scopus and Web of Science databases. The information was then stored, managed and organized using Microsoft Excel® version 2304. This approach ensured proper metadata management, as reflected in the results section as well as in the discussion and conclusion sections. It allowed for the recognition of key research trends, reflected in the prominent keywords throughout the chronological course of research on machine learning in mobile network connections, including emerging trends in the field. This, in turn, led to the establishment of the previously proposed research agenda (see Figure 10), which presents a series of research challenges for future work to address the current conceptual gaps and expand knowledge in the field.
Regarding the research questions initially raised in this study, the first corresponds to the year with the highest productivity in the research field. Although much scientific productivity has been recorded through publications in the area of machine learning and mobile networks since 2015, it was in 2019 and 2020 that the greatest productivity occurred. This suggests that as Industry 4.0 took on topics such as artificial intelligence and the IoT, research responding to the high technological demand began to emerge.
The following question refers to the main references in the scientific literature: the journals belonging to the IEEE, mainly IEEE Access, are those with the largest number of articles published in the area of machine learning and mobile networks. This is possibly because one of the main areas of knowledge of this journal is electronic engineering and telecommunications. Regarding the authors, Bennis, M., and Debbah, M. have published the most on the subject, holding not only the largest number of publications but also the most citations.
Regarding the conceptual evolution, it began in 2003, when the universal mobile telecommunications system (UMTS) and cellular networks were still being discussed, and continued until 2017, when research began to focus on the security protocols of mobile networks. Subsequently, in 2018, big data was integrated into the analysis of mobile network traffic. In 2019, the application of techniques that allow the optimization of networks, specifically reinforcement learning, became more relevant. As of 2020, the application of different techniques, tools and mechanisms that ensure the stability, security and quality of networks, along with the emergence of the Internet of Things and the possible increase in traffic in mobile networks, has taken center stage.
The main emerging and growing topics include the Internet of Things, artificial intelligence and the different deep learning techniques applied to the design of mobile networks. Thus, a research agenda arises around the application of techniques such as neural networks and federated learning, mainly to areas such as network security, Beyond 5G and edge computing.
In addition to identifying the main research trends in the field of machine learning and mobile networks, this paper provides key elements for the development of future research by creating keyword clusters and a research agenda based on the key terms used in previous investigations. Although machine learning and artificial intelligence have been widely addressed by different studies, the results of this investigation will allow those interested in the subject to know which topics and applications to focus on, as well as the most relevant contributions that have been made. This work is thus useful as a referential framework for future research and for the development of degree projects by future professionals being trained in wireless communications. Finally, Table 2 shows the 10 articles with the highest number of citations in Scopus. Most of these are review articles discussing the main techniques and applications of machine learning for wireless networks.

5. Conclusions

Regarding the number of publications per year, the increased interest in the subject between 2019 and 2020 was due to the pandemic caused by the novel coronavirus disease 2019 (COVID-19). Regarding the publications by journal, IEEE Access, whose area of knowledge is focused on engineering, is the journal that predominantly publishes papers on the subject. Regarding the main countries, the United States and China, which compete for the coverage of 5G networks and are responsible for the highest level of manufacturing of devices for mobile networks, stand out; thus, it is understandable why the research in this area comes mainly from these countries.
The study of mobile networks has progressed as the demand for network resources has increased in response to the large number of IoT devices required by the new infrastructure of cities, which must be not only intelligent but also environmentally responsible. This is reflected in the changes in the types of modulation that have arisen: it is no longer enough to modulate by frequency and space within the cell-based structure of mobile communications; instead, a communication structure is needed that responds to data traffic and learns from its flow. Therefore, research on the subject has moved from focusing on the behavior of the signal and the type of antennas needed to focusing on programming approaches that allow the best optimization and automation of network resources.
One of the main aspects of recent research is that most studies have focused on the use of different AI techniques to improve resource management and network availability. However, although the use of MIMO for signal processing is discussed, for example, little is said about the direction and range of the antennas that will be available. It would therefore be interesting to focus efforts not only on optimization and automation techniques for resource management but also on better directivity and reach of the signal. This could represent one of the greatest challenges in the area, since there is constant discussion about the gap between software and hardware advancement in telecommunications.
Consequently, it would be interesting to advance proposals for different antenna arrays that allow better directivity and range in high-traffic conditions, rather than concentrating only on the optimization of resources and network availability. Miniaturized antenna designs, called microstrip or patch antennas, have been proposed through arrays made from materials such as FR4 and graphene, which are not only low-power but also, due to their size, efficient for use in IoT devices. This type of antenna would not only be suitable for data transmission in a network but would also enable a sustainable IT infrastructure due to its low design cost and low energy use. Further research is still needed before this type of antenna can be used in next-generation mobile networks; however, scientific papers have already demonstrated the use of different deep learning algorithms for designing this type of antenna in other applications, so it would be possible to improve their directivity and range. New machine learning techniques in patch antenna arrays have not been widely discussed but are a topic that could be covered in future research in the field.
On the other hand, 5G mobile networks and the emergence of 6G networks are based on the optimization of algorithms that will allow not only connectivity but also improvements in quality, service availability and speed, in addition to self-management. These characteristics are the focus of future research due to the increasingly accelerated growth of the IoT and its demand for ever more bandwidth. For future research on identifying trends in machine learning and mobile networks, a systematic literature review could be carried out, establishing criteria for selecting specific articles that, in turn, enable the selection of specific variables, materials and techniques within the subject.

Author Contributions

Conceptualization, V.G.-P. and A.V.-A.; methodology, A.V.-A. and V.G.-P.; software, A.V.-A.; validation, V.G.-P., J.C.P.-V. and D.A.-B.; formal analysis, J.J.F.C.; investigation, P.A.R.-C.; resources, A.M.R.C.; data curation, D.A.-B.; writing—original draft preparation, V.G.-P.; writing—review and editing, A.V.-A.; visualization, J.C.P.-V.; supervision, V.G.-P.; project administration, A.V.-A.; funding acquisition, J.J.F.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Corporación Universitaria Americana (Colombia) and the Universidad Señor de Sipán (Perú). The APC was funded by Universidad Señor de Sipán (Perú).

Informed Consent Statement

Not applicable.

Data Availability Statement

The data may be provided free of charge to interested readers upon request to the corresponding author's email.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Aryal, B.; Abbas, R.; Collings, I.B. SDN enabled DDoS attack detection and mitigation for 5G networks. J. Commun. 2021, 16, 267–275. [Google Scholar] [CrossRef]
  2. Abdulqadder, I.H.; Zhou, S.; Zou, D.; Aziz, I.T.; Akber, S.M.A. Multi-layered intrusion detection and prevention in the SDN/NFV enabled cloud of 5G networks using AI-based defense mechanisms. Comput. Netw. 2020, 179, 107364. [Google Scholar] [CrossRef]
  3. Abusubaih, M. Intelligent wireless networks: Challenges and future research topics. J. Netw. Syst. Manag. 2022, 30, 18. [Google Scholar] [CrossRef]
  4. Jagannath, A.; Jagannath, J. Multi-task learning approach for modulation and wireless signal classification for 5G and beyond: Edge deployment via model compression. Phys. Commun. 2022, 54, 101793. [Google Scholar] [CrossRef]
  5. Thang, V.V.; Pashchenko, F.F. Multistage system-based machine learning techniques for intrusion detection in WiFi network. J. Comput. Netw. Commun. 2019, 2019, 4708201. [Google Scholar] [CrossRef]
  6. Thantharate, A.; Beard, C. ADAPTIVE6G: Adaptive resource management for network slicing architectures in current 5G and future 6G systems. J. Netw. Syst. Manag. 2022, 31, 9. [Google Scholar] [CrossRef]
  7. Zhang, Z.; Xu, X.; Xiao, F. 5GMEC-DP: Differentially private protection of trajectory data based on 5G-based mobile edge computing. Comput. Netw. 2022, 218, 109376. [Google Scholar] [CrossRef]
  8. Zhao, Y.; Li, Y.; Zhang, X.; Geng, G.; Zhang, W.; Sun, Y. A survey of networking applications applying the software defined networking concept based on machine learning. IEEE Access 2019, 7, 95397–95417. [Google Scholar] [CrossRef]
  9. Dharmani, P.; Das, S.; Prashar, S. A bibliometric analysis of creative industries: Current trends and future directions. J. Bus. Res. 2021, 135, 252–267. [Google Scholar] [CrossRef]
  10. Estarli, M.; Aguilar Barrera, E.S.; Martínez-Rodríguez, R.; Baladia, E.; Agüero, S.D.; Camacho, S.; Buhring, K.; Herrero-López, A.; Gil-González, D.M. Ítems de referencia para publicar protocolos de revisiones sistemáticas y metaanálisis: Declaración PRISMA-P 2015. Rev. Esp. Nutr. Hum. Diet. 2016, 20, 148–160. [Google Scholar] [CrossRef]
  11. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Moher, D. Updating guidance for reporting systematic reviews: Development of the PRISMA 2020 statement. J. Clin. Epidemiol. 2021, 134, 103–112. [Google Scholar] [CrossRef] [PubMed]
  12. Pranckutė, R. Web of science (WoS) and scopus: The titans of bibliographic information in today’s academic world. Publications 2021, 9, 12. [Google Scholar] [CrossRef]
  13. Zhang, C.; Patras, P.; Haddadi, H. Deep learning in mobile and wireless networking: A survey. IEEE Commun. Surv. Tutor. 2019, 21, 2224–2287. [Google Scholar] [CrossRef]
  14. Björnson, E.; Sanguinetti, L.; Wymeersch, H.; Hoydis, J.; Marzetta, T.L. Massive MIMO is a reality—What is next? Five promising research directions for antenna arrays. Digit. Signal Process. 2019, 94, 3–20. [Google Scholar] [CrossRef]
  15. Pham, Q.V.; Fang, F.; Ha, V.N.; Piran, M.J.; Le, M.; Le, L.B.; Hwang, W.J.; Ding, Z. A survey of multi-access edge computing in 5G and beyond: Fundamentals, technology integration, and state-of-the-art. IEEE Access 2020, 8, 116974–117017. [Google Scholar] [CrossRef]
  16. Sharma, S.K.; Wang, X. Toward massive machine type communications in ultra-dense cellular IoT networks: Current issues and machine learning-assisted solutions. IEEE Commun. Surv. Tutor. 2020, 22, 426–471. [Google Scholar] [CrossRef]
  17. Jiang, W.; Han, B.; Habibi, M.A.; Schotten, H.D. The road towards 6G: A comprehensive survey. IEEE Open J. Commun. Soc. 2021, 2, 334–366. [Google Scholar] [CrossRef]
  18. Shajin, F.H.; Rajesh, P. FPGA realization of a reversible data hiding scheme for 5G MIMO-OFDM system by chaotic key generation-based paillier cryptography along with LDPC and its side channel estimation using machine learning technique. J. Circuits Syst. Comput. 2022, 31, 2250093. [Google Scholar] [CrossRef]
  19. Luong, N.C.; Hoang, D.T.; Gong, S.; Niyato, D.; Wang, P.; Liang, Y.C.; Kim, D.I. Applications of deep reinforcement learning in communications and networking: A survey. IEEE Commun. Surv. Tutor. 2019, 21, 3133–3174. [Google Scholar] [CrossRef]
  20. Xu, H.; Yu, W.; Griffith, D.; Golmie, N. A survey on industrial internet of things: A cyber-physical systems perspective. IEEE Access 2018, 6, 78238–78259. [Google Scholar] [CrossRef]
  21. Dai, Y.; Xu, D.; Maharjan, S.; Chen, Z.; He, Q.; Zhang, Y. Blockchain and deep reinforcement learning empowered intelligent 5G beyond. IEEE Netw. 2019, 33, 10–17. [Google Scholar] [CrossRef]
  22. Moysen, J.; Giupponi, L. From 4G to 5G: Self-organized network management meets machine learning. Comput. Commun. 2018, 129, 248–268. [Google Scholar] [CrossRef]
  23. Mozaffari, M.; Kasgari, A.T.Z.; Saad, W.; Bennis, M.; Debbah, M. Beyond 5G with UAVs: Foundations of a 3D wireless cellular network. IEEE Trans. Wirel. Commun. 2018, 18, 357–372. [Google Scholar] [CrossRef]
  24. Baştuğ, E.; Bennis, M.; Zeydan, E.; Kader, M.A.; Karatepe, I.A.; Er, A.S.; Debbah, M. Big data meets telcos: A proactive caching perspective. J. Commun. Netw. 2015, 17, 549–557. [Google Scholar] [CrossRef]
  25. Chaccour, C.; Soorki, M.N.; Saad, W.; Bennis, M.; Popovski, P.; Debbah, M. Seven defining features of terahertz (THz) wireless systems: A fellowship of communication and sensing. IEEE Commun. Surv. Tutor. 2022, 24, 967–993. [Google Scholar] [CrossRef]
  26. Xie, J.; Yu, F.R.; Huang, T.; Xie, R.; Liu, J.; Wang, C.; Liu, Y. A survey of machine learning techniques applied to software defined networking (SDN): Research issues and challenges. IEEE Commun. Surv. Tutor. 2018, 21, 393–430. [Google Scholar] [CrossRef]
  27. Xu, Y.; Yin, F.; Xu, W.; Lin, J.; Cui, S. Wireless traffic prediction with scalable Gaussian process: Framework, algorithms, and verification. IEEE J. Sel. Areas Commun. 2019, 37, 1291–1306. [Google Scholar] [CrossRef]
  28. Liu, Y.; Zhang, S.; Mu, X.; Ding, Z.; Schober, R.; Al-Dhahir, N.; Hossain, E.; Shen, X. Evolution of NOMA toward next generation multiple access (NGMA) for 6G. IEEE J. Sel. Areas Commun. 2022, 40, 1037–1071. [Google Scholar] [CrossRef]
  29. Zhang, J.; Xu, X.; Zhang, K.; Zhang, B.; Tao, X.; Zhang, P. Machine learning based flexible transmission time interval scheduling for eMBB and uRLLC coexistence scenario. IEEE Access 2019, 7, 65811–65820. [Google Scholar] [CrossRef]
  30. Park, J.; Samarakoon, S.; Bennis, M.; Debbah, M. Wireless network intelligence at the edge. Proc. IEEE 2019, 107, 2204–2239. [Google Scholar] [CrossRef]
  31. Guo, F.; Yu, F.R.; Zhang, H.; Li, X.; Ji, H.; Leung, V.C.M. Enabling massive IoT Toward 6G: A comprehensive survey. IEEE Internet Things J. 2021, 8, 11891–11915. [Google Scholar] [CrossRef]
  32. Lopez-Perez, D.; De Domenico, A.; Piovesan, N.; Xinli, G.; Bao, H.; Qitao, S.; Debbah, M. A survey on 5G radio access network energy efficiency: Massive MIMO, lean carrier design, sleep modes, and machine learning. IEEE Commun. Surv. Tutor. 2022, 24, 653–697. [Google Scholar] [CrossRef]
  33. Vukobratovic, D.; Jakovetic, D.; Skachek, V.; Bajovic, D.; Sejdinovic, D.; Karabulut Kurt, G.; Hollanti, C.; Fischer, I. CONDENSE: A reconfigurable knowledge acquisition architecture for future 5G IoT. IEEE Access 2016, 4, 3360–3378. [Google Scholar] [CrossRef]
  34. Yahia, I.G.B.; Bendriss, J.; Samba, A.; Dooze, P. CogNitive 5G networks: Comprehensive operator use cases with machine learning for management operations. In Proceedings of the 2017 20th Conference on Innovations in Clouds, Internet and Networks (ICIN), Paris, France, 7–9 March 2017; IEEE: Paris, France, 2017; pp. 252–259. [Google Scholar]
  35. Bi, Q. Ten trends in the cellular industry and an outlook on 6G. IEEE Commun. Mag. 2019, 57, 31–36. [Google Scholar] [CrossRef]
  36. Kazi, B.U.; Wainer, G.A. Next generation wireless cellular networks: Ultra-dense multi-tier and multi-cell cooperation perspective. Wirel. Netw. 2018, 25, 2041–2064. [Google Scholar] [CrossRef]
  37. Faisal, K.M.; Choi, W. Machine learning approaches for reconfigurable intelligent surfaces: A survey. IEEE Access 2022, 10, 27343–27367. [Google Scholar] [CrossRef]
  38. Gavrilovska, L.; Rakovic, V.; Denkovski, D. From cloud RAN to open RAN. Wirel. Pers. Commun. 2020, 113, 1523–1539. [Google Scholar] [CrossRef]
  39. Mahmood, M.R.; Matin, M.A.; Sarigiannidis, P.; Goudos, S.K. A comprehensive review on artificial intelligence/machine learning algorithms for empowering the future IoT toward 6G era. IEEE Access 2022, 10, 87535–87562. [Google Scholar] [CrossRef]
  40. Lim, W.Y.B.; Xiong, Z.; Miao, C.; Niyato, D.; Yang, Q.; Leung, C.; Poor, H.V. Hierarchical incentive mechanism design for federated machine learning in mobile networks. IEEE Internet Things J. 2020, 7, 9575–9588. [Google Scholar] [CrossRef]
  41. Mao, Q.; Hu, F.; Hao, Q. Deep learning for intelligent wireless networks: A comprehensive survey. IEEE Commun. Surv. Tutor. 2018, 20, 2595–2621. [Google Scholar] [CrossRef]
  42. Viswanathan, H.; Mogensen, P.E. Communications in the 6G era. IEEE Access 2020, 8, 57063–57074. [Google Scholar] [CrossRef]
  43. Wild, T.; Braun, V.; Viswanathan, H. Joint design of communication and sensing for beyond 5G and 6G systems. IEEE Access 2021, 9, 30845–30857. [Google Scholar] [CrossRef]
  44. Elijah, O.; Rahim, S.K.A.; New, W.K.; Leow, C.Y.; Cumanan, K.; Geok, T.K. Intelligent massive MIMO systems for beyond 5G networks: An overview and future trends. IEEE Access 2022, 10, 102532–102563. [Google Scholar] [CrossRef]
  45. Kato, N.; Fadlullah, Z.M.; Mao, B.; Tang, F.; Akashi, O.; Inoue, T.; Mizutani, K. The deep learning vision for heterogeneous network traffic control: Proposal, challenges, and future perspective. IEEE Wirel. Commun. 2017, 24, 146–153. [Google Scholar] [CrossRef]
  46. Mao, B.; Fadlullah, Z.M.; Tang, F.; Kato, N.; Akashi, O.; Inoue, T.; Mizutani, K. Routing or computing? The paradigm shift towards intelligent computer network packet transmission based on deep learning. IEEE Trans. Comput. 2017, 66, 1946–1960. [Google Scholar] [CrossRef]
  47. Zhou, Y.; Fadlullah, Z.M.; Mao, B.; Kato, N. A deep-learning-based radio resource assignment technique for 5G ultra dense networks. IEEE Netw. 2018, 32, 28–34. [Google Scholar] [CrossRef]
  48. Lu, Y.; Wang, J.; Liu, M.; Zhang, K.; Gui, G.; Ohtsuki, T.; Adachi, F. Semi-supervised machine learning aided anomaly detection method in cellular networks. IEEE Trans. Veh. Technol. 2020, 69, 8459–8467. [Google Scholar] [CrossRef]
  49. Kamruzzaman, M.M.; Alruwaili, O. AI-based computer vision using deep learning in 6G wireless networks. Comput. Electr. Eng. 2022, 102, 108233. [Google Scholar] [CrossRef]
  50. Bagchi, S.; Abdelzaher, T.F.; Govindan, R.; Shenoy, P.; Atrey, A.; Ghosh, P.; Xu, R. New frontiers in IoT: Networking, systems, reliability, and security challenges. IEEE Internet Things J. 2020, 7, 11330–11346. [Google Scholar] [CrossRef]
  51. Savazzi, S.; Rampa, V.; Kianoush, S.; Bennis, M. An energy and carbon footprint analysis of distributed and federated learning. IEEE Trans. Green Commun. Netw. 2022, 1, 248–264. [Google Scholar] [CrossRef]
  52. Letaief, K.B.; Chen, W.; Shi, Y.; Zhang, J.; Zhang, Y.J.A. The Roadmap to 6G: AI empowered wireless networks. IEEE Commun. Mag. 2019, 57, 84–90. [Google Scholar] [CrossRef]
  53. Maraqa, O.; Rajasekaran, A.S.; Al-Ahmadi, S.; Yanikomeroglu, H.; Sait, S.M. A survey of rate-optimal power domain NOMA with enabling technologies of future wireless networks. IEEE Commun. Surv. Tutor. 2020, 22, 2192–2235. [Google Scholar] [CrossRef]
  54. Vaezi, M.; Schober, R.; Ding, Z.; Poor, H.V. Non-orthogonal multiple access: Common myths and critical questions. IEEE Wirel. Commun. 2019, 26, 174–180. [Google Scholar] [CrossRef]
  55. Chen, Y.; Bayesteh, A.; Wu, Y.; Ren, B.; Kang, S.; Sun, S.; Xiong, Q.; Qian, C.; Yu, B.; Ding, Z.; et al. Toward the standardization of non-orthogonal multiple access for next generation wireless networks. IEEE Commun. Mag. 2018, 56, 19–27. [Google Scholar] [CrossRef]
  56. Mach, P.; Becvar, Z. Mobile edge computing: A survey on architecture and computation offloading. IEEE Commun. Surv. Tutor. 2017, 19, 1628–1656. [Google Scholar] [CrossRef]
  57. Comșa, I.S.; Trestian, R.; Muntean, G.M.; Ghinea, G. 5MART: A 5G SMART scheduling framework for optimizing QoS through reinforcement learning. IEEE Trans. Netw. Serv. Manag. 2020, 17, 1110–1124. [Google Scholar] [CrossRef]
  58. Shang, X.; Huang, Y.; Liu, Z.; Yang, Y. NVM-enhanced machine learning inference in 6G edge computing. IEEE Trans. Netw. Sci. Eng. 2021, 1, Early Access. [Google Scholar] [CrossRef]
  59. Zhao, S.; Yu, H.; Yu, F.X.; Yang, Q.; Xu, Z.; Xiong, L.; Wang, J.; Vepakomma, P.; Tramèr, F.; Suresh, A.T.; et al. Advances and open problems in federated learning. Found. Trends® Mach. Learn. 2021, 14, 1–210. [Google Scholar] [CrossRef]
  60. Haneda, E.M.K.; Nguyen, S.L.H.; Karttunen, A.; Järveläinen, J.; Bamba, A.; D’Errico, R.; Medbo, J.-N.; Undi, F.; Jaeckel, S.; Iqbal, N.; et al. Millimetre-Wave Based Mobile Radio Access Network for Fifth Generation Integrated Communications (mmMAGIC); European Commission: Brussels, Belgium, 2017; pp. 1–13.
  61. Mazin, A.; Elkourdi, M.; Gitlin, R.D. Accelerating beam sweeping in mmwave standalone 5G new radios using recurrent neural networks. In Proceedings of the 2018 IEEE 88th Vehicular Technology Conference (VTC-Fall), Chicago, IL, USA, 27–30 August 2018; IEEE: Chicago, IL, USA, 2018; pp. 1–4. [Google Scholar]
  62. Aqdus, A.; Amin, R.; Ramzan, S.; Alshamrani, S.S.S.; Alshehri, A.; El-kenawy, E.S.M. Detection collision flows in SDN based 5G using machine learning algorithms. Comput. Mater. Contin. 2023, 74, 1413–1435. [Google Scholar] [CrossRef]
  63. Cao, B.; Fan, J.; Yuan, M.; Li, Y. Toward accurate energy-efficient cellular network: Switching off excessive carriers based on traffic profiling. In Proceedings of the 31st Annual ACM Symposium on Applied Computing, Pisa, Italy, 4–8 April 2016; ACM: New York, NY, USA, 2016; pp. 546–551. [Google Scholar]
  64. Kao, W.C.; Zhan, S.Q.; Lee, T.S. AI-aided 3-D beamforming for millimeter wave communications. In Proceedings of the 2018 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS), Okinawa, Japan, 27–30 November 2018; IEEE: Ishigaki, Japan, 2018; pp. 278–283. [Google Scholar]
  65. Sun, G.; Zemuy, G.T.; Xiong, K. Dynamic reservation and deep reinforcement learning based autonomous resource management for wireless virtual networks. In Proceedings of the 2018 IEEE 37th International Performance Computing and Communications Conference (IPCCC), Orlando, FL, USA, 17–19 November 2018; IEEE: Orlando, FL, USA, 2018; pp. 1–4. [Google Scholar]
  66. Kader, M.A.; Bastug, E.; Bennis, M.; Zeydan, E.; Karatepe, A.; Er, A.S.; Debbah, M. Leveraging big data analytics for cache-enabled wireless networks. In Proceedings of the 2015 IEEE Globecom Workshops (GC Wkshps), San Diego, CA, USA, 6–10 December 2015; IEEE: San Diego, CA, USA, 2015; pp. 1–6. [Google Scholar]
  67. Sundsøy, P.; Bjelland, J.; Iqbal, A.M.; Pentland, A.S.; de Montjoye, Y.A. Big data-driven marketing: How machine learning outperforms marketers’ gut-feeling. In Social Computing, Behavioral-Cultural Modeling and Prediction; Kennedy, W.G., Agarwal, N., Yang, S.J., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 367–374. [Google Scholar]
  68. Fang, H.; Wang, X.; Tomasin, S. Machine learning for intelligent authentication in 5G and beyond wireless networks. IEEE Wirel. Commun. 2019, 26, 55–61. [Google Scholar] [CrossRef]
  69. Chen, G.; Zhang, X.; Shen, F.; Zeng, Q. Two tier slicing resource allocation algorithm based on deep reinforcement learning and joint bidding in wireless access networks. Sensors 2022, 22, 3495. [Google Scholar] [CrossRef]
  70. Rodrigues, T.K.; Kato, N. Network slicing with centralized and distributed reinforcement learning for combined satellite/ground networks in a 6G environment. IEEE Wirel. Commun. 2022, 29, 104–110. [Google Scholar] [CrossRef]
  71. Nomikos, N.; Zoupanos, S.; Charalambous, T.; Krikidis, I. A survey on reinforcement learning-aided caching in heterogeneous mobile edge networks. IEEE Access 2022, 10, 4380–4413. [Google Scholar] [CrossRef]
  72. Narmanlioglu, O.; Zeydan, E. Learning in SDN-based multi-tenant cellular networks: A game-theoretic perspective. In Proceedings of the 2017 IFIP/IEEE Symposium on Integrated Network and Service Management (IM), Lisbon, Portugal, 8–12 May 2017; IEEE: Lisbon, Portugal, 2017; pp. 929–934. [Google Scholar]
  73. Le, L.V.; Lin, B.S.P.; Tung, L.P.; Sinh, D. SDN/NFV, machine learning, and big data driven network slicing for 5G. In Proceedings of the 2018 IEEE 5G World Forum (5GWF), Santa Clara, CA, USA, 9–11 July 2018; IEEE: Silicon Valley, CA, USA, 2018; pp. 20–25. [Google Scholar]
  74. Nakao, A.; Du, P. Toward in-network deep machine learning for identifying mobile applications and enabling application specific network slicing. IEICE Trans. Commun. 2018, E101.B, 1536–1543. [Google Scholar] [CrossRef]
  75. Upadhyay, D.; Tiwari, P.; Mohd, N.; Pant, B. A machine learning approach in 5G user prediction. In ICT with Intelligent Applications. Smart Innovation, Systems and Technologies; Choudrie, J., Mahalle, P., Perumal, T., Joshi, A., Eds.; Springer: Singapore, 2023; pp. 643–652. [Google Scholar]
  76. Paropkari, R.A.; Thantharate, A.; Beard, C. Deep-mobility: A deep learning approach for an efficient and reliable 5G handover. In Proceedings of the 2022 International Conference on Wireless Communications Signal Processing and Networking (WiSPNET), Chennai, India, 24–26 March 2022; IEEE: Chennai, India, 2022; pp. 244–250. [Google Scholar]
  77. Hua, Y.; Li, R.; Zhao, Z.; Chen, X.; Zhang, H. GAN-powered deep distributional reinforcement learning for resource management in network slicing. IEEE J. Sel. Areas Commun. 2020, 38, 334–349. [Google Scholar] [CrossRef]
  78. Rahman, A.; Hasan, K.; Kundu, D.; Islam, M.J.; Debnath, T.; Band, S.S.; Kumar, N. On the ICN-IoT with federated learning integration of communication: Concepts, security-privacy issues, applications, and future perspectives. Future Gener. Comput. Syst. 2023, 138, 61–88. [Google Scholar] [CrossRef]
  79. Koudouridis, G.P.; He, Q.; Dán, G. An architecture and performance evaluation framework for artificial intelligence solutions in beyond 5G radio access networks. EURASIP J. Wirel. Commun. Netw. 2022, 2022, 94. [Google Scholar] [CrossRef]
  80. Dubreil, H.; Altman, Z.; Diascorn, V.; Picard, J.; Clerc, M. Particle swarm optimization of fuzzy logic controller for high quality RRM auto-tuning of UMTS networks. In Proceedings of the 2005 IEEE 61st Vehicular Technology Conference, Stockholm, Sweden, 30 May–1 June 2005; IEEE: Stockholm, Sweden, 2005; pp. 1865–1869. [Google Scholar]
  81. Nasri, R.; Altman, Z.; Dubreil, H. Fuzzy-Q-learning-based autonomic management of macro-diversity algorithm in UMTS networks. Ann. Telecommun. 2006, 61, 1119–1135. [Google Scholar] [CrossRef]
  82. Das, S.; Pourzandi, M.; Debbabi, M. On SPIM detection in LTE networks. In Proceedings of the 2012 25th IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), Montreal, QC, Canada, 29 April–2 May 2012; IEEE: Montreal, QC, Canada, 2012; pp. 1–4. [Google Scholar]
  83. Balachandran, A.; Aggarwal, V.; Halepovic, E.; Pang, J.; Seshan, S.; Venkataraman, S.; Yan, H. Modeling web quality-of-experience on cellular networks. In Proceedings of the 20th Annual International Conference on Mobile Computing and Networking, Maui, HI, USA, 7–11 September 2014; ACM: New York, NY, USA, 2014; pp. 213–224. [Google Scholar]
  84. Mason, F.; Nencioni, G.; Zanella, A. A multi-agent reinforcement learning architecture for network slicing orchestration. In Proceedings of the 2021 19th Mediterranean Communication and Computer Networking Conference (MedComNet), Ibiza, Spain, 15–17 June 2021; IEEE: Ibiza, Spain, 2021; pp. 1–8. [Google Scholar]
  85. Tomoskozi, M.; Seeling, P.; Ekler, P.; Fitzek, F.H.P. Efficiency gain for RoHC compressor implementations with dynamic configuration. In Proceedings of the 2016 IEEE 84th Vehicular Technology Conference (VTC-Fall), Montreal, QC, Canada, 18–21 September 2016; IEEE: Montreal, QC, Canada, 2016; pp. 1–5. [Google Scholar]
  86. Mwanje, S.; Decarreau, G.; Mannweiler, C.; Naseer-ul-Islam, M.; Schmelz, L.C. Network management automation in 5G: Challenges and opportunities. In Proceedings of the 2016 IEEE 27th Annual International Symposium on Personal, Indoor, and Mobile Radio Communications (PIMRC), Valencia, Spain, 4–8 September 2016; IEEE: Valencia, Spain, 2016; pp. 1–6. [Google Scholar]
  87. Jiang, W.; Strufe, M.; Schotten, H.D. Autonomic network management for software-defined and virtualized 5G systems. In Proceedings of the European Wireless 2017-23rd European Wireless Conference, Dresden, Germany, 17–19 May 2017; IEEE: Dresden, Germany, 2017; pp. 1–6. [Google Scholar]
  88. Perez, J.S.; Jayaweera, S.K.; Lane, S. Machine learning aided cognitive RAT selection for 5G heterogeneous networks. In Proceedings of the 2017 IEEE International Black Sea Conference on Communications and Networking (BlackSeaCom), Istanbul, Turkey, 5–8 June 2017; IEEE: Istanbul, Turkey, 2017; pp. 1–5. [Google Scholar]
  89. Sun, D.; Willmann, S. Deep learning-based dependability assessment method for industrial wireless network. IFAC-PapersOnLine 2019, 52, 219–224. [Google Scholar] [CrossRef]
  90. Mughees, A.; Tahir, M.; Sheikh, M.A.; Ahad, A. Towards energy efficient 5G networks using machine learning: Taxonomy, research challenges, and future research directions. IEEE Access 2020, 8, 187498–187522. [Google Scholar] [CrossRef]
  91. Singh, M. Integrating artificial intelligence and 5G in the era of next-generation computing. In Proceedings of the 2021 2nd International Conference on Computational Methods in Science & Technology (ICCMST), Mohali, India, 17–18 December 2021; IEEE: Mohali, India, 2021; pp. 24–29. [Google Scholar]
  92. Kumar, R.; Sinwar, D.; Pandey, A.; Tadele, T.; Singh, V.; Raghuwanshi, G. IoT enabled technologies in smart farming and challenges for adoption. In Internet of Things and Analytics for Agriculture, Volume 3. Studies in Big Data; Pattnaik, P.K., Kumar, R., Pal, S., Eds.; Springer: Singapore, 2021; pp. 141–164. [Google Scholar]
  93. Kirsur, S.M.; Dakshayini, M.; Gowri, M. An effective eye-blink-based cyber secure PIN password authentication system. In Computational Intelligence and Data Analytics. Lecture Notes on Data Engineering and Communications Technologies; Buyya, R., Hernandez, S.M., Kovvur, R.M.R., Sarma, T.H., Eds.; Springer: Singapore, 2022; pp. 89–99. [Google Scholar]
  94. Parera, C.; Redondi, A.E.C.; Cesana, M.; Liao, Q.; Malanchini, I. Transfer learning for channel quality prediction. In Proceedings of the 2019 IEEE International Symposium on Measurements & Networking (M&N), Catania, Italy, 8–10 July 2019; IEEE: Catania, Italy, 2019; pp. 1–6. [Google Scholar]
  95. Klaine, P.V.; Imran, M.A.; Onireti, V.; Souza, R.D. A survey of machine learning techniques applied to self-organizing cellular networks. IEEE Commun. Surv. Tutor. 2017, 19, 2392–2431. [Google Scholar] [CrossRef]
  96. Nawaz, S.J.; Sharma, S.K.; Wyne, S.; Patwary, M.N.; Asaduzzaman, M. Quantum machine learning for 6G communication networks: State-of-the-art and vision for the future. IEEE Access 2019, 7, 46317–46350. [Google Scholar] [CrossRef]
Figure 1. PRISMA 2020 flow diagram. Our own depiction.
Figure 2. Number of publications per year concerning mobile networks and machine learning. Own graph.
Figure 3. Main journals on mobile networks and machine learning. Own illustration.
Figure 4. Main countries contributing to research on mobile networks and machine learning. Own graph.
Figure 5. Main institutions publishing on mobile networks and machine learning. Own graph.
Figure 6. Most prolific authors publishing on the subject of mobile networks and machine learning. Own graph.
Figure 7. Validity quadrants in the area of mobile networks and machine learning. Own graph.
Figure 8. Network of keywords on mobile networks and machine learning. Own elaboration.
Figure 9. Keywords by year on the topic of mobile networks and machine learning. Own image.
Figure 10. Research agenda on the topic of mobile networks and machine learning. Own image.
Table 1. Abbreviations.
Abbreviation   Meaning
3G             Third generation
5G             Fifth generation
6G             Sixth generation
AI             Artificial intelligence
ANNs           Artificial neural networks
UAVs           Unmanned aerial vehicles
B5G            Beyond 5G
DRL            Deep reinforcement learning
DL             Deep learning
FFNN           Feedforward neural network
FL             Federated learning
HetNets        Heterogeneous networks
KPIs           Key performance indicators
LSTM           Long short-term memory
IoT            Internet of Things
ML             Machine learning
mmWave         Millimeter wave communications
MIMO           Multiple-input, multiple-output
NFV            Network function virtualization
PRISMA         Preferred Reporting Items for Systematic Reviews and Meta-Analyses
QC             Quantum computing
QML            Quantum machine learning
RAN            Radio access networks
RIS            Reconfigurable intelligent surfaces
RNNs           Recurrent neural networks
SDNs           Software-defined networks
THz            Terahertz
Table 2. Main techniques and applications discussed in the articles with the greatest impact.

1. [13] (825 citations; methodology: literature review)
Main contribution: Based on an exhaustive literature review, the authors present different options for adapting deep learning models to mobile device networks and highlight the open problems, thereby opening up in-depth research in the field of mobile networks and machine learning.
Limitations: The work focuses on deep learning; other machine learning techniques used in wireless networks could be compared.
Technique: Deep learning. Approach: deep learning-driven network-level mobile data analysis; app-level mobile data analysis; user mobility analysis; user localization; wireless sensor networks; network control; network security; signal processing; emerging deep learning-driven mobile network applications. Application: wireless networks.

2. [19] (806 citations; methodology: literature review)
Main contribution: A detailed review of DRL approaches proposed to address emerging problems in communication networks, such as dynamic network access, data rate control, wireless caching, data offloading, network security and connectivity preservation; the authors also present DRL applications for traffic routing, resource sharing and data collection.
Limitations: The work focuses on deep reinforcement learning; other machine learning techniques used in wireless networks could be compared.
Technique: Deep reinforcement learning. Approach: deep deterministic policy gradient Q-learning for continuous action; deep recurrent Q-learning for POMDPs; deep SARSA learning; deep Q-learning for Markov games. Application: communication networks.

3. [14] (362 citations; methodology: literature review)
Main contribution: Five lines of future research related to massive MIMO, digital beamforming and/or antenna arrays: extremely large aperture arrays, holographic massive MIMO, six-dimensional positioning, large-scale MIMO radar and intelligent massive MIMO.
Limitations: Although the authors open windows for future research on antenna arrays and massive MIMO, they neither consider multiple antenna array options nor compare MIMO with other techniques; and although they discuss the next generation of communications, they dedicate only a small space to 6G.
Technique: Machine learning. Approach: reinforcement learning. Application: antenna arrays.

4. [15] (354 citations; methodology: literature review)
Main contribution: An up-to-date review of the integration of multi-access edge computing with new technologies to be deployed in 5G.
Limitations: While the authors provide a complete framework of multi-access edge computing features, they focus on applications, needs and features, leaving less room for the machine learning techniques applied in current research advances.
Technique: Machine learning. Approach: unsupervised, supervised and reinforcement learning. Application: 5G networks.

5. [17] (323 citations; methodology: literature review)
Main contribution: An overview of the sixth generation (6G) system covering usage scenarios, requirements, key performance indicators (KPIs), architecture and enabling technologies, based on the projection of mobile traffic to 2030.
Limitations: While the authors provide a complete framework of 6G features and a strategic path starting from possible applications, use cases and scenarios, little space is left for the possible machine learning techniques and their comparison in 6G.
Technique: Artificial intelligence. Approach: blockchain; digital twins; intelligent edge computing. Application: 6G networks.

6. [20] (315 citations; methodology: literature review)
Main contribution: Present and future challenges and research needs in control systems, networks and computing, as well as for the adoption of machine learning in an Industrial IoT (I-IoT) context.
Limitations: The article focuses on the architectural characteristics needed for IoT, given the high traffic demand expected from connecting these devices; it does not focus on the machine learning techniques that may enable the best management of networks for IoT connectivity.
Technique: Machine learning. Approach: unsupervised, supervised and reinforcement learning. Application: IoT.

7. [95] (299 citations; methodology: literature review)
Main contribution: A general description of the most common machine learning techniques applied to cellular networks, classifying each ML solution according to its usage.
Limitations: Different machine learning techniques are discussed, but only a final reference to deep learning is made, without expanding on the possible applications of deep learning techniques.
Technique: Machine learning. Approach: supervised learning (k-nearest neighbor; neural networks; Bayes' theory; support vector machines; decision trees); unsupervised learning (anomaly detectors; self-organizing maps; k-means); reinforcement learning. Application: cellular networks.

8. [30] (269 citations; methodology: literature review)
Main contribution: The potential of cloud-based machine learning (ML) for the architectural deployment of 5G, with case studies and applications that demonstrate the potential of edge ML in 5G.
Limitations: The study centers on edge ML, providing an overview of future research on edge ML without considering other architectures for wireless networks, although different types of wireless networks are discussed.
Technique: Machine learning. Approach: supervised learning (k-nearest neighbor; neural networks; Bayes' theory; support vector machines; decision trees); unsupervised learning (anomaly detectors; self-organizing maps; k-means); reinforcement learning. Application: wireless networks.

9. [96] (269 citations; methodology: literature review)
Main contribution: A description of the possible enablers of 6G networks from the theoretical domain of machine learning (ML), quantum computing (QC) and quantum ML (QML), together with possible challenges, benefits and applications in beyond-5G networks.
Limitations: The work is focused on the future of communication networks based on quantum computing techniques; less emphasis is placed on recent use cases of machine learning applied to mobile networks.
Technique: Quantum machine learning. Approach: supervised, semi-supervised and unsupervised learning; reinforcement learning; genetic programming; learning requirements and capability; deep neural networks; deep transfer learning; deep unfolding; deep learning for cognitive communications. Application: beyond 5G.

10. [53] (266 citations; methodology: kernel density estimation)
Main contribution: A 3D cellular architecture for drone base station network planning and minimum-latency cell association for user equipment drones, using a tractable method based on truncated octahedral shapes, which allows complete coverage of a given space with a minimum number of drone base stations.
Limitations: The research is based solely on a proposal for drones that can be replicated for unmanned aerial vehicles; it does not consider other mobile equipment.
Technique: Machine learning. Application: 3D wireless cellular networks.
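Several of the high-impact works in Table 2 (notably [19]) rest on reinforcement learning, whose core mechanism is the Q-learning value update. The sketch below is a minimal, illustrative example only: the two-channel "dynamic access" toy problem, its success probabilities and all parameter values are hypothetical stand-ins, not a system from any of the surveyed papers, and tabular Q-learning is used in place of the deep variants those papers study.

```python
# Illustrative tabular Q-learning for a toy dynamic-channel-access problem.
# Hypothetical setup: an agent repeatedly picks one of two channels whose
# transmission success probabilities are fixed but unknown to it.
import random

def q_learning_channel_access(success_prob, episodes=5000, alpha=0.1,
                              gamma=0.9, epsilon=0.1, seed=42):
    """Learn which channel to transmit on; reward 1 on success, 0 otherwise."""
    rng = random.Random(seed)
    q = [0.0 for _ in success_prob]  # single state: one Q-value per channel
    for _ in range(episodes):
        # epsilon-greedy action selection: explore occasionally, else exploit
        if rng.random() < epsilon:
            a = rng.randrange(len(success_prob))
        else:
            a = max(range(len(success_prob)), key=lambda i: q[i])
        r = 1.0 if rng.random() < success_prob[a] else 0.0
        # Q-learning update; with one state, max over next actions is max(q)
        q[a] += alpha * (r + gamma * max(q) - q[a])
    return q

if __name__ == "__main__":
    q = q_learning_channel_access([0.2, 0.8])
    best = max(range(len(q)), key=lambda i: q[i])
    print(best)  # index of the channel the agent learned to prefer
```

The DRL approaches surveyed in [19] replace the table `q` with a neural network so the same update generalizes across large state spaces (channel conditions, queue lengths, user positions).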
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
