Technologies
  • Article
  • Open Access

13 March 2026

Open vs. Commercial 5G SA Deployments: Performance Assessment

1 Department of Telecommunications, Faculty of Electronics, Telecommunications and Information Technology, National University of Science and Technology Politehnica Bucharest, Splaiul Independenței No. 313, Sector 6, 061071 Bucharest, Romania
2 Department of Development and Innovation, Orange Romania, Matei Millo No. 5, Sector 1, 010144 Bucharest, Romania
* Authors to whom correspondence should be addressed.

Abstract

Open-source and commercial fifth-generation (5G) deployments are difficult to compare because they are built for different goals and reported under different conditions, which slows down validation and technology transfer from research to practice. This study explores the deployment and evaluation of two 5G Standalone (SA) disaggregated Radio Access Network (RAN) systems, using an open-source research RAN, a commercial RAN, and Software-Defined Radio (SDR) hardware. The first testbed is an SDR-based prototype built around a Universal Software Radio Peripheral (USRP) B210 device, with Software Radio Systems RAN (srsRAN) as the RAN software. The commercial-based testbed contains a Benetel RAN550 Radio Unit (RU), connected via optical fiber to a Commercial Off-the-Shelf (COTS) server acting as the Distributed Unit (DU) and Centralized Unit (CU) through the Accelleran virtualized Baseband Unit (vBBU) platform. The Core Network (CN) is implemented using the open-source Open5GS in both testbeds. To evaluate the networks' functionality, throughput and latency are tracked using a Motorola Edge 50 Pro mobile terminal. The experimental results are analyzed and compared with representative performance metrics reported in the literature to place the measurements in a broader research context. This study further assesses trade-offs related to cost, portability, and scalability by comparing SDR-based research prototypes with commercial deployments.

1. Introduction

Earlier implementations of the Radio Access Network (RAN) relied on “closed” architectures, in which Software (SW) modules ran on proprietary equipment supplied by a single vendor. This approach limited both performance optimization and the participation of alternative providers. To meet the growing demand for user data and deliver higher-quality services, fifth-generation (5G) mobile networks are in the process of adopting Open RAN, a concept launched by the Open Radio Access Network (O-RAN) Alliance in 2018 [1] and supported by the Telecom Infra Project (TIP) [2] and the 3rd Generation Partnership Project (3GPP) [3]. This transition implies significant changes in architectural configurations and standards, which should promote vendor independence and competitiveness while maintaining or, desirably, improving the quality of the provided services.
Based on open standard and interfaces, embracing the concept of virtualization and integration of machine learning together with artificial intelligence techniques, Open RAN significantly facilitates experimental research. However, developing a fully compliant Open RAN system, as defined by the O-RAN Alliance, is a complex process. Therefore, while the research interest indicates a clear shift toward the Open RAN paradigm, with a number of testbeds having been proposed, the majority of published works implement just the principal 5G architecture blocks, RAN and Core Network (CN), which is sufficient to test network performance [4,5,6,7,8].
While a number of these implementations explore commercial-grade 5G deployments [9,10], the vast majority of such performance-oriented testbeds, quite understandably, rely on open-source platforms. Thus, the most frequent choices for the RAN realizations are Software Radio Systems RAN (srsRAN) [11], OpenAirInterface (OAI)-RAN [12], and BubbleRAN [13], and, for the CN, Open5GS [14] or similar frameworks, which provide the means to virtualize mobile networks and create 3GPP-compliant systems on widely available Commercial Off-the-Shelf (COTS) Hardware (HW). The growing diversity of both open and commercial components (Figure 1) has led to a rich variety of such 5G testbeds with disaggregated architecture. However, open-source and commercial deployments are not easy to compare due to the difference in their purposes: open implementations focus on testing methodology and algorithms under controlled conditions, using quite limited configuration variations and usually a less powerful Radio Unit (RU), while commercial-based systems are mostly performance-oriented and tuned for a particular use-case scenario. As a result, there remains a gap that makes it difficult to quantify how the results obtained on open testbeds correspond to, or scale within, industrial deployments.
Figure 1. Comparative 5G architecture components. Structural differences between commercial and open deployments.
This paper addresses this gap by implementing two 5G Standalone (SA) architecture versions, open-source and commercial-based, with similar configuration parameters (apart from the bandwidth (BW)), and comparing them with each other as well as with other published results. For the open-source deployment, the setup involves the Universal Software Radio Peripheral (USRP) B210 (Ettus Research Ltd., Austin, TX, USA) as the RU, srsRAN (Release 23.10.1) as the RAN SW, and Open5GS (Release-17 v2.7.0) as the CN. The commercial deployment in this paper preserves “openness” in terms of disaggregation, employing the same CN while integrating the RAN550 (Benetel Ltd., Dublin, Ireland) as the RU and Accelleran (Antwerpen, Belgium) [15], a commercial RAN SW.
To summarize, the main contributions of this paper are:
  • A literature review and comparative analysis of open and commercial-based 5G SA configurations in terms of throughput and latency results.
  • A replicable setup of two 5G SA prototypes (a fully open system and a fully commercial-based system), deployed under similar laboratory conditions.
  • An assessment of trade-offs between cost, portability, and scalability of prototypes based on Software-Defined Radio (SDR) versus commercial deployments.
The remainder of this paper is organized as follows. Section 2 presents a comparative overview of related works covering both open- and commercial-based fourth-generation (4G) and 5G SA/Non-Standalone (NSA) implementations and highlights how the current study complements them. Section 3 details the implemented disaggregated RAN testbeds and their configuration parameters, also providing a step-by-step guide for the open-source deployment. Section 4 explains the benchmarking performance tests (throughput and latency) carried out to validate the testbeds and summarizes the obtained results. Section 5 reflects on the results and provides discussions on cost, flexibility, and reproducibility of the open- versus commercial-based testbeds. Finally, Section 6 concludes this paper with remarks on the future of the Open RAN deployments.

2. Related Works

This section puts in the spotlight the studies on 5G prototypes that served as a baseline and inspiration for this paper. After a brief overview, the key points are summarized in two tables: Table 1 outlines the 5G architecture components used in the considered articles, and Table 2 compares configurations and throughput/latency measurement results.
Table 1. Breakdown of related works on 5G architecture components used.
Table 2. Breakdown of related works on 5G testbed implementations.

2.1. SDR-Based (Open) Implementations

One of the earliest implementations was published in [16]. It investigates the setup and evaluation of a Long-Term Evolution (LTE) testbed built using OAI, the Fraunhofer Core (FC) network, and a USRP-2944R (an X310-equivalent SDR platform) at the Council for Scientific and Industrial Research in South Africa. The setup was implemented in an indoor controlled laboratory environment and focused on throughput and latency validation using iPerf (“Magic iPerf” app) and ping (“Ping Tools”) tests, respectively. The results demonstrated that the testbed can support Downlink (DL) and Uplink (UL) throughputs of 30 Mbit/s and 28 Mbit/s, respectively, with an internet access latency of about 14 ms. Although the testbed was examined in 4G mode, the authors mentioned a planned extension to 5G NSA, which was continued in [17]. While at that time this implementation was also limited to 4G testing, it was presented as a foundation for upcoming 5G SA testbeds. The system was verified for User Equipment (UE)–Base Station (BS) connectivity (pass test) and further tested under several practical use cases, which included 4K YouTube streaming at (a) a fixed distance between BS and UE (pass test), (b) fixed distances ranging from 1 m to 25 m (throughput and latency tests), and (c) a dynamic scenario in which the UE was moving around the BS (pass test). In cases (a) and (c), the authors showed the system to be successful when the distance between UE and BS was not “too large” (presumably under 25 m). In use case (b), the results showed that for distances up to approximately 15 m, the network maintained DL rates of ≈14 Mbit/s, UL rates of ≈2 Mbit/s, and latency between 23 and 28 ms, ensuring smooth video flow. However, between 20 m and 25 m, throughput declined significantly (DL: 3.8 Mbit/s, UL: 1.4 Mbit/s), and the measured latency dropped to 7 ms, which, together with its large standard deviation, indicated that the connection had become unreliable.
The authors in [7] presented a comparative analysis of two NSA prototypes implemented on the OAI and srsRAN open-source platforms, respectively. In the OAI-based setup, two USRPs were employed (X310 for the LTE eNodeB and B200mini for the 5G gNodeB), whereas the srsRAN implementation utilized a single X310 SDR with dual Radio Frequency (RF) chains to simultaneously operate both LTE and 5G nodes. The tests were conducted in a controlled indoor laboratory environment and focused on Point-to-point (P2P) throughput (Magic iPerf) and latency (Aruba Utilities) measurements under Time Division Duplexing (TDD) operation with a 15 kHz Subcarrier Spacing (SCS) for different numbers of Resource Blocks (RBs). The experimental results demonstrated that both implementations achieved throughput values slightly below but close to the theoretical limits, with srsRAN generally showing higher stability and slightly better performance. The OAI-based prototype was functional only in 4G LTE mode at 25 and 50 RBs, achieving a reasonable throughput in the Single Input Single Output (SISO) configuration. However, it failed at 100 RBs, was not operational in the Multiple Input Multiple Output (MIMO) configuration, and could not establish a functional connection with the 5G part, which, as the authors assumed, was due to UE–network incompatibility.
In [8], the authors deployed a fully virtualized 5G SA testbed based on containers built with Docker and managed by Kubernetes. The RAN and UE in this implementation were emulated using gnbsim. The authors presented two deployments, “minimalist” and “basic”, demonstrated a successful handshake protocol, and tested network performance using the iPerf tool. The “minimalist” setup incorporates the Access and Mobility Management Function (AMF), Session Management Function (SMF), User Plane Function (UPF), and Network Repository Function (NRF). The “basic” setup extends it with the following additional functions: the Authentication Server Function (AUSF), Unified Data Repository (UDR), and Unified Data Management (UDM). The “minimalist” setup achieved approximately 248 Mbit/s DL and 252 Mbit/s UL, while the “basic” setup reached 190 Mbit/s DL and 192 Mbit/s UL. The first scenario thus achieved noticeably higher throughput, which the authors attributed to the increased signaling overhead introduced by the additional cloud-based functions (AUSF, UDM, UDR) in the “basic” configuration.
The dissertation in [18] provides one of the most comprehensive evaluations of open-source 5G SA testbeds to date. The work explores 36 testbeds, which are combinations of different open RAN software (OAI-RAN, srsRAN), core software (OAI-5GC, Open5GS), SDR platforms (USRP B210, X410), and UEs (OnePlus Nord CE 2 5G, Quectel 5G with two different laptops). Besides throughput and latency, the study also assesses interoperability and the impact of user location for both single- and multi-UE scenarios. The results show that srsRAN can deliver high UL throughput when running on a powerful host computer and for nearby UEs. However, when the distance between the UE and the gNodeB increases, its performance is noticeably lower than that of OAI-RAN due to the lack of automatic power adjustment. The authors also highlight that the UE-agnostic OAI provides more stable results than srsRAN when working with different UEs, along with lower end-to-end latency. In Table 1 and Table 2, the results of two testbeds are presented, which exploit OAI-5GC, OAI-RAN/srsRAN, and a USRP B210 in band n78 with 20 MHz and 40 MHz BW, being the most comparable with the setup provided in this paper.
The work in [19] investigates the capabilities (throughput, latency, and coverage) of a 5G SA system based on OAI-5GC, an OAI gNodeB, and a USRP (B210/N310) in a controlled indoor environment. Similarly to the other studies, the work tests throughput using iPerf for two versions of the USRP, B210 and N310, which differ in their maximum supported BW, 56 MHz and 100 MHz, respectively. The experimental results show that with the USRP B210 at 40 MHz (106 RBs), the DL reached 126 Mbit/s and the UL 18 Mbit/s. A similarly imbalanced situation was also observed with a 60 MHz (162 RBs) configuration on the USRP N310: the system achieved a maximum DL throughput of 390 Mbit/s, while the UL performance remained significantly lower, reaching only 28 Mbit/s compared to the theoretical 120 Mbit/s. Latency tests for both USRPs showed similar results: average Round-Trip Times (RTTs) of around 20 ms, with minimum values close to 7 ms. The work also explored a multi-UE mode, adding a second device, and presented a coverage study based on the Reference Signal Received Power (RSRP). The authors concluded that although the setup can deliver near-theoretical DL performance and acceptable latency in indoor deployments, the UL performance, limited coverage, and 19 ms latency highlight the need for further optimization before use in mission-critical 5G services, such as an Ultra-Reliable Low-Latency Communications (URLLC) slice.
Using a similar setup, a 5G SA network built on top of OAI and a USRP, the research in [21] presents performance tests and a detailed tutorial on how to install monolithic OAI. The tests (spectrum occupation, coverage evaluation, and benchmarking tests) were conducted in an indoor laboratory environment using a Motorola G50 UE, evaluating the system’s performance under different BW configurations: 40 MHz, 60 MHz, and 100 MHz, corresponding to 106, 162, and 273 PRBs, respectively. The throughput and latency measurements were evaluated for ideal-channel and UE-to-UE scenarios. For the purpose of comparison with the research done in this paper, only the results corresponding to the ideal-channel scenario (communication between gNodeB and UE) are included. In this setup, only DL throughput was measured. The obtained DL throughput results reached approximately 148 Mbit/s (max)/136 Mbit/s (mean) for 40 MHz, 215 Mbit/s (max)/208 Mbit/s (mean) for 60 MHz, and 298 Mbit/s (max)/234 Mbit/s (mean) for 100 MHz. The mean RTT ranged between 12 and 14 ms across all bandwidths; however, similarly to the throughput test at 100 MHz, it showed significant instability, with the peak RTT reaching 80 ms. Thus, while the study generally draws positive conclusions regarding the OAI platform’s stability and reproducibility, it also notes increasing variation and higher computational load at high BWs.
The study in [20] presents an experimental open-source 5G SA testbed implemented with srsRAN (gNodeB), Open5GS (5GC), and a USRP B210 operating in band n77 (3.88 GHz) under a TDD configuration. The evaluation of the testbed (throughput, latency, and coverage) was conducted in an indoor environment in a P2P setup under several variations: the authors experimented with different bandwidths (20 MHz and 40 MHz), antenna configurations (1 × 1 and 2 × 2 MIMO), Modulation and Coding Schemes (MCSs) (64-Quadrature Amplitude Modulation (QAM) and 256-QAM), and TDD frame structures. The best mean throughput was achieved for the 40 MHz setup with 256-QAM, reaching 123 Mbit/s DL (83% of the calculated maximum throughput) and 39 Mbit/s UL (47% of the calculated maximum throughput), while the lowest mean RTT ranged between 15.7 and 17.3 ms, with a minimum of 5.3 ms for the DDSUU pattern.
In addition to individual prototype implementations, a comprehensive overview of the field is presented in [22]. The authors provide a thorough theoretical background with examples of the different parts from which a 5G network can be assembled. As for the practical side, the study presents an implementation of two non-commercial 5G SA testbeds at the National University of Science and Technology Politehnica Bucharest (NUSTPB) and the University of Agder (UiA). Both testbeds were built with SDR hardware, srsRAN or OAI as the RAN SW, and Magma, OAI, or different versions of Open5GS as the CN. The NUSTPB testbed reported a DL throughput of up to 145.6 Mbit/s for, presumably, 40 MHz (n78), while the UiA testbed achieved 187 Mbit/s for 80 MHz (n77) using the same version of srsRAN, a B210 USRP, and either srsRAN or Magma as the CN (not explicitly specified).

2.2. Commercial-Based Implementations

The work published in [10] stands out among most of the works reviewed above not only because it presents a commercial (Nokia) implementation of a private 5G SA campus network but also due to a more comprehensive performance evaluation. The system is based on Nokia Digital Automation Cloud (NDAC) technology, comprising a 5G Core interconnected with a Baseband Unit (BBU) and indoor and outdoor Remote Radio Heads (RRHs). It operates in band n78 with a 100 MHz bandwidth (3700 MHz to 3800 MHz) under 3GPP Release 15. The performance was evaluated for both indoor and outdoor scenarios using commercial UEs (Huawei P40 Pro, Quectel RM500Q-GL, Telit FN980) and both internal (LibreSpeed) and external (Ookla Speedtest) servers. Accordingly, the authors analyzed four scenarios: indoor/outdoor UEs tested with internal/external servers. In the configuration most relevant to this study (indoor with internal server), the best measured DL and UL throughputs reached approximately 749/97 Mbit/s for the Quectel RM500Q-GL, 733/233 Mbit/s for the Huawei P40 Pro, and 376/221 Mbit/s for the Telit FN980, while latency values were 11.4 ms, 6.4 ms, and 9 ms, respectively. As can be observed, the results vary considerably across different user devices, while other testing scenarios introduce an additional degree of variability. Consequently, the authors concluded that further investigation is required, particularly focusing on the performance dependence on the type of UE.
In addition to the previous work, attention is also drawn to some experimental testbeds, e.g., [9], which evaluates real-world 5G deployments (latency) in industrial environments. Overall, the research presents six deployments, which represent different combinations of the three industrial sites with the corresponding use cases, NSA/SA 5G architectures, and mid (3.6 GHz)/high (28 GHz) bands. Latency measurements were conducted for a range of packet sizes (100 B to 1500 B) and analyzed at different percentiles of reliability (99% and 99.99%). The results show that URLLC-capable SA systems operating at 28 GHz achieved latencies of around 1 ms one-way. At mid-band 3.6 GHz, for the enhanced Mobile Broadband (eMBB)-focused setup, latency was higher, between 6 and 12 ms, but still met many industrial (but not URLLC) requirements. In Table 1 and Table 2, the deployments considered most relevant for this study are emphasized: SA and NSA architectures at mid-band, focused on workpiece monitoring in an indoor industrial environment (Aachen site).

2.3. Comparative Analysis and Discussion

For ease of comparison, a summary of the studied literature is provided in the two tables mentioned above. Table 1 provides a breakdown of the architectural components used in the testbeds, as well as the UEs; Table 2 breaks down the works in terms of the stated configuration parameters and obtained results. For clarity, certain abbreviations and terminology used in these tables are defined below:
  • RAN HW—RAN hardware or a RU.
  • RAN SW—the software framework used for RAN.
  • CN—the software framework used for CN.
  • Source—the source of the 5G architectural components (open/commercial) used in the testbeds.
  • Test—the type of experimental setup. The majority relied on P2P links; however, there are a few studies extending to user-driven applications like 4K streaming or industrial monitoring.
  • Configuration—summarizes key radio parameters, including operating band, BW or number of RBs if the BW was not specifically mentioned, SCS, and MIMO configuration.
  • Measurement Results—present the reported mean latency (RTT) and DL/UL throughput values.
Considering Table 2, it is important to highlight the challenges related to the overall comparability of the testbeds. One of the primary targets of this literature review was to outline benchmark throughput and latency values for the different combinations of open/commercial components, creating a solid baseline for this study. However, this is not a trivial task, since the configuration of such systems involves many parameters, not all of which are explicitly mentioned in the reviewed papers. Table 2 presents a compilation of the most frequently reported values relevant to throughput and latency; however, even those might be considered insufficient for a full-scale comparison. For example, Sahbafard et al. [19] and Dória et al. [21] present two seemingly identical experiments (OAI, 60 MHz BW, 30 kHz SCS, SISO), yet the results differ significantly: Sahbafard reports nearly twice the DL throughput and higher average latency compared to Dória. This difference may result from several factors, such as variations in the frame structure: Ref. [19] may have employed a more downlink-oriented TDD configuration, whereas Ref. [21] likely used a more balanced allocation. As observed in Table 2, the vast majority of the studies rely on prototype implementations built on open-source platforms, while only a few discuss implementations on commercial equipment, since it is expensive and difficult to access, often requiring partnerships and numerous permissions. The discrepancy in the results between them is quite significant: for example, even if the 80 MHz testbed published in [22] were scaled up in BW and MIMO configuration, its theoretical throughput would at most reach half of what was reported for the commercial setup in [10].
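As a rough sanity check when comparing such reported figures, the theoretical peak rate can be approximated with the 3GPP TS 38.306 formula. The sketch below is illustrative only: the overhead factor and downlink fraction are generic defaults, not values taken from any of the cited testbeds.

```python
def nr_peak_rate_mbps(n_prb: int, mu: int, layers: int = 1, q_m: int = 8,
                      overhead: float = 0.14, dl_fraction: float = 1.0) -> float:
    """Approximate 5G NR peak data rate (Mbit/s), per 3GPP TS 38.306.

    q_m=8 corresponds to 256-QAM; overhead=0.14 is the FR1 downlink value;
    dl_fraction accounts for the TDD split (1.0 = all slots downlink).
    """
    r_max = 948 / 1024                       # maximum LDPC code rate
    symbols_per_sec = 14 * (2 ** mu) * 1000  # OFDM symbols per second
    re_per_sec = n_prb * 12 * symbols_per_sec
    return 1e-6 * layers * q_m * r_max * re_per_sec * (1 - overhead) * dl_fraction

# 40 MHz at 30 kHz SCS (106 PRBs), SISO, 256-QAM, before TDD sharing:
print(round(nr_peak_rate_mbps(106, mu=1)))  # ~227 Mbit/s
```

Applying realistic TDD duty cycles and MIMO orders to such estimates quickly shows why open SDR testbeds at 20 MHz to 40 MHz cannot approach the commercial 100 MHz figures in [10].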
To begin addressing these inconsistencies, this study aims to contribute to bridging the gap between open-source testbeds and those incorporating commercial components by implementing both under similar configuration parameters. In addition, a complete set of configuration details and testing procedures is provided to ensure reproducibility and establish a transparent reference for future benchmarking of Open RAN systems.

3. Deployment Setup and Configuration

All measurements were acquired in the Orange Future Networks 6G Lab [23], an advanced experimental facility created in partnership between Orange Romania and the Research Center for Advanced Research on New Materials, Products, and Innovative Processes (CAMPUS) of NUSTPB.

3.1. Open-Source SDR-Based 5G SA Testbed

The measurement setup (Figure 2 and Figure 3) involves an ROG Strix G17 (ASUSTeK Computer Inc., Taipei, Taiwan) G713IM-HX005 laptop running Ubuntu 20.04 LTS, equipped with open-source platforms for emulating both the CN and the RAN. The CN is implemented using the open-source Open5GS platform. For the RAN, the system integrates a USRP B210 device with the srsRAN software [11]. Synchronization was provided by the internal Temperature-Compensated Crystal Oscillator (TCXO) of the USRP B210 rather than an external clock source, which offers a frequency accuracy of ±2 ppm [24]. As srsRAN does not incorporate a UE application, evaluation of the performance metrics was carried out using a COTS UE (Motorola Edge 50 Pro smartphone (Motorola Mobility LLC., Chicago, IL, USA)). The device was configured to operate in 5G SA mode, in the TDD n78 frequency band (3300 MHz to 3800 MHz) of frequency range FR1 (410 MHz to 7125 MHz), commonly known as the sub-6 GHz band. Internet connectivity was provided by a Huawei 5G Customer Premises Equipment (CPE) Pro device (Huawei Device Co., Ltd., Shenzhen, China).
Figure 2. SDR-based deployment scheme. Logical architecture of the experimental setup.
Figure 3. SDR-based equipment. Physical laboratory implementation and hardware components.
The setup involves a single cell operating in the NR n78 band, configured with an Absolute Radio Frequency Channel Number (ARFCN) value of 650000 (3750 MHz central frequency). The cell uses TDD and employs numerology μ = 1 (30 kHz SCS) with a normal cyclic prefix. The allocated bandwidth for the cell is 20 MHz, constrained by the bandwidth values configurable in the srsRAN software. The deployment is configured with a single network slice, designed to serve eMBB services.
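The correspondence between the configured ARFCN and the 3750 MHz central frequency follows from the global frequency raster of 3GPP TS 38.104; a minimal check in Python:

```python
def nr_arfcn_to_freq_mhz(n_ref: int) -> float:
    """NR-ARFCN to carrier frequency (MHz) for the 3000-24250 MHz range.

    3GPP TS 38.104 global raster: F_REF = 3000 MHz + 15 kHz * (N_REF - 600000).
    """
    if not 600000 <= n_ref <= 2016666:
        raise ValueError("N_REF outside the 3-24.25 GHz raster range")
    return 3000.0 + 15 * (n_ref - 600000) / 1000

print(nr_arfcn_to_freq_mhz(650000))  # 3750.0, the cell's central frequency in MHz
```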
Prerequisites. Prior to running the network on Ubuntu, the following steps were completed:
  • Configure MongoDB [25], an open-source database management system that uses JavaScript Object Notation (JSON)-based data models;
  • Configure the USRP Hardware Driver (UHD) [26], an open-source driver that provides a programming interface for USRP hardware;
  • Configure a virtual TUN (network TUNnel) interface named ogstun. The virtual TUN interface enables packet handling, routing, and encapsulation, essential for supporting data plane connectivity in the testbed;
  • Enable Internet Protocol (IP) forwarding;
  • Set Network Address Translation (NAT) rules in iptables and ip6tables to ensure connectivity between the UPF and the Internet;
  • Disable the system firewall.
All commands [27] were executed in the terminal.
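A representative sketch of these prerequisite steps, following common Open5GS quickstart conventions (the ogstun name and the 10.45.0.1 address appear in this section; the Ubuntu package names and the /16 mask are assumptions):

```shell
# Install MongoDB and the UHD driver (package names assumed for Ubuntu 20.04)
sudo apt install -y mongodb uhd-host libuhd-dev

# Create the virtual TUN interface used by the Open5GS UPF
sudo ip tuntap add name ogstun mode tun
sudo ip addr add 10.45.0.1/16 dev ogstun
sudo ip link set ogstun up

# Enable IP forwarding and NAT so that UE traffic can reach the Internet
sudo sysctl -w net.ipv4.ip_forward=1
sudo iptables -t nat -A POSTROUTING -s 10.45.0.0/16 ! -o ogstun -j MASQUERADE

# Disable the system firewall for the duration of the tests
sudo ufw disable
```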
RAN and CN configuration. The Public Land Mobile Network (PLMN) ID—consisting of the Mobile Country Code (MCC) and the Mobile Network Code (MNC)—is configured within the AMF of the CN, the RAN domain, and at the beginning of the International Mobile Subscriber Identity (IMSI) value, with a test value of “999970” so that it does not interfere with commercial PLMN values. The tracking area code is configured within the AMF of the CN and the RAN domain with the same test value of “1”.
Additionally, the ARFCN is configured to 650000 within the srsRAN configuration file, the same value as in the commercial deployment, to facilitate the comparison between the two implementations.
The following modifications were applied within the configuration files of the srsRAN open-source platform [28].
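A hypothetical excerpt of what such a gNodeB configuration file might contain, using parameter names in the style of the srsRAN Project examples (the ARFCN, band, bandwidth, SCS, PLMN, and TAC follow the text; the gain values, sampling rate, and exact key names are assumptions that may differ between releases):

```yaml
cell_cfg:
  dl_arfcn: 650000            # 3750 MHz central frequency, band n78
  band: 78
  channel_bandwidth_MHz: 20   # limited by srsRAN's configurable values
  common_scs: 30              # kHz, numerology mu = 1
  plmn: "99970"               # test MCC 999 / MNC 70
  tac: 1
ru_sdr:
  device_driver: uhd
  device_args: type=b200      # USRP B210
  srate: 23.04                # MHz (assumed sampling rate for 20 MHz BW)
  tx_gain: 75                 # assumed
  rx_gain: 75                 # assumed
```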
The following modifications were applied within the configuration files of the Open5GS open-source platform [27].
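A hypothetical excerpt of the corresponding AMF configuration (amf.yaml), following the layout of the Open5GS sample files; the AMF region/set identifiers and the single eMBB slice are assumptions, while the PLMN and TAC values follow the text:

```yaml
amf:
  guami:
    - plmn_id: { mcc: 999, mnc: 70 }   # test PLMN, non-interfering
      amf_id: { region: 2, set: 1 }    # assumed identifiers
  tai:
    - plmn_id: { mcc: 999, mnc: 70 }
      tac: 1                           # tracking area code from the text
  plmn_support:
    - plmn_id: { mcc: 999, mnc: 70 }
      s_nssai:
        - sst: 1                       # single eMBB slice
```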
UE configuration. The mobile terminal is equipped with a sysmoISIM-SJA2 Subscriber Identity Module (SIM) card, provisioned beforehand using an HID Omnikey 3121 (HID Global Corp., Austin, TX, USA) smart card reader/writer and the PySim (Release v2.4.0) [29] utility. The card is subsequently read to verify that the MCC, MNC, and IMSI values align with those configured in the CN and RAN, as consistency among these identifiers is essential for successful network registration. Additionally, services related to the Subscription Concealed Identifier (SUCI) are disabled, namely 124 (Subscription identifier privacy support) and 125 (SUCI calculation by the Universal Subscriber Identity Module (USIM)) [30]. Deactivating both services ensures the SUCI feature is completely disabled. Implementing SUCI deconcealment requires complex key provisioning on both the UDM and the SIM card, which brings no added value to the radio performance metrics evaluated in this study. Furthermore, as commercial 5G SA networks currently see limited mandatory SUCI adoption (with many UE vendors operating using the standard Subscription Permanent Identifier (SUPI)), disabling it ensures a streamlined and stable registration process for the selected UE. All commands [31] were executed in the terminal within the directory containing the cloned and installed instance of PySim from GitHub (https://github.com/osmocom/pysim; accessed 27 October 2025).
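A hypothetical PySim invocation illustrating this provisioning step; the reader index, ADM key, ICCID, IMSI, Ki, and OPc shown here are placeholders, not the values used in this work, and the flags should be checked against the installed PySim release:

```shell
# Program the sysmoISIM-SJA2 card (all bracketed values are placeholders)
./pySim-prog.py -p 0 -a <ADM-KEY> -x 999 -y 70 \
                -i <IMSI> -s <ICCID> -k <KI> -o <OPC>

# Read the card back to verify that MCC, MNC, and IMSI match the CN/RAN values
./pySim-read.py -p 0
```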
The SIM profile data is subsequently added to the prototype network’s subscriber database, the UDR of the CN.
The configured parameters reflect the values stored on the SIM card, such as IMSI, the Subscriber Authentication Key (Ki), and the Derived Operator Code (OPc) obtained from the Operator Code (OP), as well as some values that ensure consistency across the network configuration, such as the Access Point Name (APN) and the IP version 4 (IPv4) settings.
The database is accessed through the graphical user interface provided by the Open5GS open-source platform, after first specifying the IP address and the port that will be opened in a web browser. All commands [27] were executed within the directory containing the cloned and installed instance of Open5GS from GitHub (https://github.com/open5gs/open5gs; accessed 27 October 2025).
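A sketch of how the WebUI is typically started from the Open5GS source tree; the default port 9999 is an assumption based on the upstream defaults:

```shell
# Start the subscriber-management WebUI from the Open5GS repository
cd webui
npm run dev
# Then open http://localhost:9999 in a browser and add the subscriber entry
# (IMSI, Ki, OPc, APN, and IPv4 settings) matching the provisioned SIM card
```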
Several UE settings must be applied to ensure compatibility with the prototype network. First, an APN must be created using the same parameters as those defined in the UDR of the core network, specifically the APN name, the APN protocol set to IPv4, and the APN roaming protocol set to IPv4. Additionally, the preferred network type should be set to 5G, as this selection enables the device to operate within the intended radio access specifications. Finally, according to [28], Voice over LTE (VoLTE) and Voice over NR (VoNR) should be disabled, as doing so helps maintain operational consistency within the prototype environment.
UE testing. Before running the network, it is recommended to verify that the SDR device is properly connected to the laptop using the UHD device-discovery command [32].
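The standard UHD utilities can be used for this check; for instance:

```shell
uhd_find_devices      # lists attached USRP devices and their serial numbers
uhd_usrp_probe        # prints detailed RF and clocking information for the B210
```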
The 5G SA core network is launched by executing each functional component within its architecture: NRF, Service Communication Proxy (SCP), Security Edge Protection Proxy (SEPP), AMF, SMF, UPF, AUSF, UDM and UDR, Policy Control Function (PCF), Network Slicing Selection Function (NSSF), and Binding Support Function (BSF). All commands [27] were executed within the directory containing the cloned and installed instance of Open5GS from GitHub (https://github.com/open5gs/open5gs; accessed 27 October 2025).
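A sketch of such a launch sequence after a source build, assuming the default install prefix (the daemon names follow the upstream Open5GS build output; each function is backgrounded, and the UPF typically needs root for the TUN interface):

```shell
./install/bin/open5gs-nrfd  &   # Network Repository Function
./install/bin/open5gs-scpd  &   # Service Communication Proxy
./install/bin/open5gs-seppd &   # Security Edge Protection Proxy
./install/bin/open5gs-amfd  &   # Access and Mobility Management
./install/bin/open5gs-smfd  &   # Session Management
sudo ./install/bin/open5gs-upfd &   # User Plane (owns ogstun)
./install/bin/open5gs-ausfd &   # Authentication Server
./install/bin/open5gs-udmd  &   # Unified Data Management
./install/bin/open5gs-udrd  &   # Unified Data Repository
./install/bin/open5gs-pcfd  &   # Policy Control
./install/bin/open5gs-nssfd &   # Network Slice Selection
./install/bin/open5gs-bsfd  &   # Binding Support
```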
The 5G SA RAN is launched by running the gNodeB application with the configuration file previously set up using the srsRAN open-source platform on the SDR equipment. All commands [28] were executed within the directory containing the cloned and installed instance of srsRAN from GitHub (https://github.com/srsran/srsRAN_Project; accessed 27 October 2025).
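A sketch of the launch command, assuming the default build layout of the srsRAN Project and a configuration file named gnb.yml (the file name and path are assumptions):

```shell
# Run the gNodeB with the previously edited configuration; root access is
# typically required for real-time scheduling and USB access to the B210
sudo ./build/apps/gnb/gnb -c gnb.yml
```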
The network testing procedure was carried out with the UE connected to the WiFi interface of the CPE, using the iPerf3 (v3.19) and Ping Tools (v4.64) applications. Data throughput was evaluated through file transfer tests, while latency was assessed via ping tests, allowing for a comprehensive measurement of key network performance parameters. In the throughput tests, iPerf3 was executed on the laptop as the server, configured with the ogstun interface at IP address 10.45.0.1 and using the default port 5201.
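Representative iPerf3 commands for this procedure (the 60 s duration and the use of the reverse flag for downlink are assumptions, not parameters stated in the text):

```shell
# Server side (laptop), bound to the ogstun address, default port 5201
iperf3 -s -B 10.45.0.1

# Client side (UE application): a plain run measures uplink; -R reverses
# the direction so the server transmits, measuring downlink
iperf3 -c 10.45.0.1 -t 60
iperf3 -c 10.45.0.1 -t 60 -R
```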
The mobile phone acted as the client, running the iPerf3 application to connect to the server using the specified credentials. During the tests, the client on the phone established a connection from 10.45.0.4 (local port 41372) to the server, enabling real-time measurement of uplink and downlink data rates between the device and the network.
Ping tests were conducted to assess the round-trip time, providing insight into the network latency and responsiveness under typical operational conditions. This combined methodology ensures a reliable evaluation of both the throughput and latency characteristics of the prototype network.

3.2. Commercial-Based 5G SA Testbed

The laboratory environment setup (Figure 4 and Figure 5) includes all necessary network components and user equipment, allowing for realistic and repeatable testing conditions. The setup uses one commercial Motorola Edge 50 Pro smartphone, configured to operate in 5G SA mode in the TDD n78 band of frequency range FR1. The device is connected to a disaggregated RAN, deployed within the test area using commercial-grade hardware (Benetel RAN550), commercial software (Accelleran), and open-source software (Open5GS).
Figure 4. Commercial-based deployment scheme. Logical architecture of the experimental setup.
Figure 5. Commercial-based equipment. Physical laboratory implementation and hardware components.
The core element of the implementation is a COTS server which, through virtualization, consolidates most of the logical functions of the network. This server runs a commercial radio software solution provided by Accelleran [15], functioning as a virtualized Baseband Unit (vBBU) that implements the Distributed Unit (DU) and Centralized Unit (CU) under a Kubernetes-based infrastructure. The server also provides Out-of-band (OOB) management, using a secondary interface physically isolated from the primary network connection to remotely control and monitor network equipment.
Accelleran is an Open RAN software vendor focused on disaggregated RAN control and automation. Its platform centers on a cloud-native near-Real-Time RAN Intelligent Controller (near-RT RIC) with an xApp/rApp framework, policy engine, and observability hooks that expose standard O-RAN interfaces (E2/A1/O1/O2) for multi-vendor interoperability. In private and public RAN deployments, Accelleran typically pairs with third-party RUs and CU/DU stacks, providing traffic steering, optimization, and assurance functions that can be rolled out and updated through Kubernetes-based continuous integration/continuous delivery [15].
The vBBU communicates via the Open Fronthaul (O-FH) interface, implemented over a fiber optic connection to the Benetel RAN550 RU [33]. The same server also hosts the core network, which is deployed using the open-source Open5GS solution [14]. Internet connectivity is provided by a Huawei 5G CPE Pro device, while TDD synchronization is ensured via a Global Positioning System (GPS) connection.
Similar to the open-source research implementation, the test setup involves a single cell operating in the New Radio (NR) n78 band, configured with an ARFCN of 650000 (3750 MHz central frequency). The cell uses TDD with numerology μ = 1 (30 kHz SCS) and a normal cyclic prefix. The allocated bandwidth is 100 MHz, supporting high-throughput transmission. The deployment is configured with a single network slice, designed to serve eMBB services and provide consistent performance across both the radio and core network segments.
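The mapping from NR-ARFCN 650000 to the 3750 MHz carrier frequency follows the global frequency raster of 3GPP TS 38.104; a short sketch of the computation (the function name is illustrative):

```python
def nr_arfcn_to_freq_mhz(n_ref: int) -> float:
    """Convert an NR-ARFCN to a carrier frequency in MHz using the
    global frequency raster (3GPP TS 38.104, Table 5.4.2.1-1)."""
    if n_ref < 600000:                               # 0 .. 3000 MHz
        return 0.005 * n_ref                         # 5 kHz raster
    if n_ref < 2016667:                              # 3000 .. 24250 MHz
        return 3000.0 + 0.015 * (n_ref - 600000)     # 15 kHz raster
    return 24250.08 + 0.060 * (n_ref - 2016667)      # 60 kHz raster

# The n78 cell used in both testbeds:
print(nr_arfcn_to_freq_mhz(650000))  # 3750.0 MHz
```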
The throughput performance evaluation measurements are conducted using the Orange Speedtest application [34], which includes predefined Transmission Control Protocol (TCP) download and upload tests, each with a duration of 30 s. Additionally, Internet Control Message Protocol (ICMP) ping tests are used to assess latency (RTT).

4. Results

This section outlines the radio conditions, including the deployment parameters and configuration settings. It then presents the performance results obtained from the two 5G SA setups, covering DL and UL throughput as well as latency, thereby providing a clear foundation for understanding how each system operates under controlled and reproducible radio conditions.

4.1. Network Conditions

The measurements were performed under specific radio conditions, which are described in this subsection using metrics such as RSRP, Reference Signal Received Quality (RSRQ), Signal-to-Interference-plus-Noise Ratio (SINR), and Modulation and Coding Scheme (MCS). The average RSRP, RSRQ, and SINR values observed along the monitored signal paths are presented in Table 3. When compared to the benchmark thresholds for RSRP, RSRQ, and SINR defined in [35], the measured values indicate excellent signal quality. The MCS values provide further insight into link performance.
Table 3. Average network performance metrics.
The selection of the MCS is governed by the link adaptation logic, which uses specific SINR thresholds and Channel Quality Indicator (CQI) mappings to ensure link stability. For the SDR-based testbed, the srsRAN gNB maps the CQI reported by the UE to an MCS using TS 138 214, Table 5.1.3.1-1 [36]: −1, 0, 0, 2, 4, 6, 8, 11, 13, 15, 18, 20, 22, 24, 26, 28, as defined in the srsRAN gNB scheduler source code (lib/scheduler/support/mcs_calculator.cpp) [11]. In this configuration, MCS 28 represents the highest possible index. Conversely, the commercial testbed maps the CQI to an MCS using TS 138 214, Table 5.1.3.1-2: −1, 0, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27 [36], where MCS 27 is the maximum index.
For the UL, the SDR-based testbed achieved MCS 20, using 64QAM modulation with a Target Code Rate of 567/1024, while the commercial setup achieved MCS 18, which also employs 64QAM but operates at a higher Target Code Rate of 822/1024. For the Downlink (DL), the SDR-based setup achieved MCS 28, resulting in the use of 64QAM modulation with a Target Code Rate of 948/1024. The commercial setup, however, achieved MCS 26, utilizing 256QAM modulation with a Target Code Rate of 916.5/1024.
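The two lookup vectors quoted above can be expressed directly as tables; the list contents are taken from the text (the srsRAN mcs_calculator.cpp mapping and TS 138 214 [36]), while the wrapper function is an illustrative sketch, not code from either stack:

```python
# CQI -> MCS lookup vectors quoted in the text; index i holds the MCS
# selected for reported CQI i, and -1 marks "no usable MCS" (CQI 0).
CQI_TO_MCS_TABLE1 = [-1, 0, 0, 2, 4, 6, 8, 11, 13, 15,
                     18, 20, 22, 24, 26, 28]   # TS 138 214 Tab. 5.1.3.1-1 (srsRAN gNB)
CQI_TO_MCS_TABLE2 = [-1, 0, 1, 3, 5, 7, 9, 11, 13, 15,
                     17, 19, 21, 23, 25, 27]   # TS 138 214 Tab. 5.1.3.1-2 (commercial gNB)

def cqi_to_mcs(cqi, table):
    """Map a reported CQI (0..15) to an MCS index; None if unusable."""
    mcs = table[cqi]
    return None if mcs < 0 else mcs

# Best channel quality (CQI 15) yields the maximum index of each table,
# matching the MCS 28 / MCS 27 ceilings discussed in the text:
assert cqi_to_mcs(15, CQI_TO_MCS_TABLE1) == 28
assert cqi_to_mcs(15, CQI_TO_MCS_TABLE2) == 27
```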

4.2. Performance Evaluation

The testing plan focuses on evaluating user-level latency and overall throughput in both DL and UL through a combination of file transfer and ICMP ping tests.
Table 4 summarizes the average values recorded during the DL, UL, and ping sessions for both testbeds, providing a consolidated view of their performance under the evaluated conditions. Reported values are obtained by averaging throughput and latency measurements over multiple sessions.
Table 4. Average performance results.
Open-source research testbed (srsRAN). Under the SDR-based testbed, the average DL throughput measured approximately 28 Mbit/s, with values ranging from 14.7 Mbit/s to 40.9 Mbit/s, whereas the average UL throughput reached 14.4 Mbit/s, with observed values varying from 3.1 Mbit/s to 17.9 Mbit/s. Moreover, latency and jitter were assessed to evaluate the responsiveness of the system. The average end-to-end latency measured approximately 43.4 ms, with minimum and maximum values of 28 ms and 53 ms, respectively. In terms of jitter, the system exhibited an average variation of 15.5 ms, with fluctuations ranging between 8 and 25 ms.
Commercial-based testbed (Accelleran). From a comparative perspective, the commercial-based testbed recorded an average DL throughput of approximately 296.4 Mbit/s, with a minimum of 226.8 Mbit/s and a maximum of 488.6 Mbit/s, while the UL throughput averaged 32.6 Mbit/s, with values ranging from 10 Mbit/s to 48.7 Mbit/s. This throughput performance indicates that the system can effectively support high-data-rate applications within the eMBB network segment. Latency analysis revealed an average round-trip time of 47.4 ms, ranging between 27 and 73 ms. This range highlights the network's responsiveness under varying conditions, where lower latency supports real-time applications and occasional peaks indicate transient congestion or processing delays.

5. Discussion

This section offers a detailed analysis of the experimental findings, comparing the performance of the two testbeds and examining how these results align with or diverge from those reviewed in Section 2. Further, this section also considers the factors that contribute to the observed differences, followed by an evaluation of cost, flexibility, and reproducibility aspects that distinguish open-source research platforms from commercial 5G solutions.

5.1. Performance Assessment

The performance results from Table 5 allow for a comparative analysis of network behavior. To ensure a consistent basis for comparison, both setups were implemented with matching parameters. The only differentiating factor is the bandwidth configuration, which was limited in each testbed by the capabilities of the software platform. While we acknowledge that the performance of a 5G SA system is influenced by the specific hardware specifications of the UE, a detailed evaluation of performance deviations across multiple UE models was not within the intended scope of this research. To ensure a consistent baseline for our assessment of the trade-offs between SDR-based prototypes and commercial deployments, all measurements in this study were conducted using a single 5G SA capable UE (Motorola Edge 50 Pro), thereby ensuring the results reflect the characteristics of the network implementation rather than terminal variability.
Table 5. Comparison of commercial and SDR-based Open RAN implementations.
A cross-testbed comparison indicates a clear similarity in throughput behavior despite the difference in bandwidth. In both configurations, the relative relationship between DL and UL performance remains consistent, reflecting the typical asymmetry of mobile traffic, where downlink demand dominates. Both testbeds report relatively high latency, suggesting that the systems may face challenges in supporting delay-sensitive services. Incorporating a URLLC segment could enable a more effective allocation of users and services to low-latency communication, as high throughput and low latency are typically conflicting requirements that cannot be fully achieved simultaneously.
Despite this similarity, notable differences were observed between the setups due to variations in BW. The commercial-based setup achieved higher absolute DL (490 Mbit/s) and UL (49 Mbit/s) throughput compared to the prototype setup. Round-trip latency was, on average, similar across the configurations, although it was slightly higher for the commercial-based architecture due to increased interference within the 100 MHz BW (the same spectrum is used by Orange Romania for macro coverage). These differences once again highlight the direct impact of bandwidth constraints on overall performance.
Moreover, while both testbeds reflect typical asymmetric traffic patterns, the uplink performance in the open-source setup falls significantly below theoretical limits. This imbalance stems primarily from the open-source SDR variant's limited ability to process uplink traffic efficiently: the dominant bottleneck is processing overhead, driven by the absence of the hardware-level optimizations typical of commercial deployments. In commercial core networks, the UPF is usually isolated in a dedicated environment with pinned Central Processing Unit (CPU) cores and dedicated network interfaces to accelerate packet processing and minimize jitter. Furthermore, commercial RAN BBUs utilize custom processors or hardware accelerators, such as Systems-on-Chip (SoCs) with dedicated processing units for RAN workloads, to handle computationally intensive baseband tasks. Conversely, the open-source setup executes both the Open5GS and srsRAN stacks concurrently on a shared, general-purpose CPU without strict hardware isolation, resource pinning, or dedicated accelerators, which inherently introduces higher processing overhead and latency. Collectively, these constraints force the system to operate at lower modulation and coding schemes to maintain the link (e.g., relying on MCS 20 for UL versus MCS 28 for DL).
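The isolation techniques mentioned above (pinned cores, dedicated interfaces, elevated scheduling priority) can be approximated on a stock Linux host; the core indices, priority value, and binary path below are illustrative sketches, not the configuration of either testbed:

```shell
# Pin a UPF process to dedicated cores 2-3 so it does not compete with
# the RAN stack for CPU time (taskset and chrt are part of util-linux)
taskset -c 2,3 ./install/bin/open5gs-upfd &

# Optionally give the UPF a real-time FIFO scheduling priority
# chrt -f -p 80 $(pidof open5gs-upfd)

# Steer the NIC's interrupt handling onto a separate core so packet
# interrupts do not preempt the pinned UPF threads
# echo 1 > /proc/irq/<nic_irq>/smp_affinity_list
```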
Focusing on the prototype setup, the observed throughput and latency characteristics indicate that it provides satisfactory performance for a prototype deployment. Nevertheless, certain limitations are apparent: the measured round-trip times remain higher than desirable for real-time applications, and the bandwidth constraints imposed by the software restrict the maximum achievable throughput. The prototype implementation using SDR and open-source platforms is not highly stable, particularly in terms of UE connectivity. Intermittent UE attachment behavior was observed across repeated trials, even though the same network configuration and identical parameter settings were consistently applied.
This highlights the challenges of reproducibility and robustness when working with prototype solutions. Despite these limitations, the setups offer a valuable testbed for further experimentation and optimization, providing insights into both research-oriented and commercial deployment scenarios.
Overall, these comparisons highlight the impact of selecting specific equipment solutions within the network configuration on overall performance and provide insights into potential areas for optimization.

5.2. Benchmarking with Existing Work

Although direct comparisons with other studies are limited due to differences in implementation parameters, approximate (qualitative) comparisons can still be made with the works presented in Table 2. These comparisons are largely observational, as the experimental conditions and configurations are closely aligned but not identical.
When compared to other open 5G SA prototypes at similar bandwidths (Table 6), the proposed open-source research testbed is placed at the lower end in terms of DL throughput and at the high end in terms of latency. For instance, Amini's [18] OAI-based 20 MHz SA testbed reaches 56/9 Mbit/s (DL/UL) with 10 ms RTT, while the 40 MHz testbed achieves 114/20 Mbit/s at the same latency. Håkegård's [20] 40 MHz srsRAN/Open5GS setup reports 123/39 Mbit/s (QAM64) with 17 ms RTT, and Marțian's [22] 40 MHz and 80 MHz srsRAN/Open5GS testbeds reach up to 187 Mbit/s on DL, up to 48 Mbit/s UL, and comparable or lower latency. Thus, while the UL values of the SDR-based research testbed are in line with several open-source results, the DL throughput and RTT are clearly more conservative.
Table 6. Comparison of measured network performance with values reported in the literature.
The commercial-based testbed achieves substantially higher data rates, with average 296 Mbit/s DL/33 Mbit/s UL and 47 ms RTT, and peak values around 490 Mbit/s DL/49 Mbit/s UL with 27 ms minimum latency. Compared to the commercial Nokia NDAC campus network [10] in Table 2 (4 × 4 MIMO), which reaches 376 Mbit/s to 749 Mbit/s on DL and 97 Mbit/s to 233 Mbit/s on UL with 6.4 ms to 11.4 ms latency, the commercial setup in this study offers lower absolute throughput and significantly higher RTT but still clearly outperforms most SDR-based prototypes in raw downlink capacity. Relative to the Ericsson industrial deployments [9] where mid-band SA/NSA systems achieve 5 ms RTT, the latency of the presented commercial-based testbed is again higher, underlining that it is closer to a laboratory-grade open deployment than to a fully tuned URLLC-oriented solution.
Therefore, the contribution of this work lies in introducing an alternative implementation approach, along with its corresponding performance results, and in providing a novel perspective for comparing two implementations—one research-oriented and one commercial. This approach allows for evaluating the relative benefits and limitations of each deployment strategy, offering insights that can guide future network design and optimization efforts.

5.3. Cost, Flexibility, and Reproducibility Considerations

The open-source research prototype is highly portable, consisting solely of a laptop and a compact SDR device, with dimensions of 9.7 cm × 15.5 cm × 1.5 cm, weighing only 350 g [24], and offers a significant cost advantage compared to commercial solutions. Despite its portability and affordability, this solution is intended exclusively as a prototype, serving as a foundation for further development rather than a fully deployable network. The use of a real UE instead of a simulated UE in the proposed testbed allows for a more realistic characterization of network behavior.
From a reproducibility perspective, the prototype is particularly advantageous for academic research, as it can be easily replicated by researchers with access to open-source resources, allowing experimentation and validation without reliance on proprietary platforms. In contrast, the commercial implementation is deployed within a mobile network operator's laboratory and closely approximates a deployable commercial solution. However, it is considerably more expensive, physically restricted to a single location, and requires ongoing support from the companies that provide and maintain the commercial platforms, limiting its accessibility and reproducibility for independent academic work.

6. Conclusions and Future Work

This work presents the deployment and evaluation of two 5G SA testbeds: an open, SDR-based prototype and a commercial solution, implemented under comparable laboratory conditions. Through a focused literature review and a comparative experimental analysis based on point-to-point throughput and latency measurements, this study investigates the feasibility and performance of open, disaggregated RAN architectures using open-source software, commercial hardware components, and SDR equipment. Furthermore, this paper provides an implementation based on the commercial Accelleran software, which, as indicated by the literature review conducted for this study, has not been reported in prior work. Finally, the results emphasize the fundamental trade-offs between cost, portability, and scalability when contrasting 5G SA SDR-based experimental deployments with commercial 5G SA solutions, providing a new perspective on the adoption of open 5G systems in controlled laboratory testbed environments.
The experimental evaluation highlights that both 5G SA configurations provide stable connectivity and consistent performance under laboratory conditions while revealing clear differences between the two implementations. For the DL transmission, the SDR-based system achieved an average throughput of 28 Mbit/s with a maximum of 41 Mbit/s, whereas the commercial deployment reached an average of 296 Mbit/s and a peak value of 490 Mbit/s; a similar trend was observed for the UL transmission, where average throughputs of 14 Mbit/s and 33 Mbit/s and maximum values of 18 Mbit/s and 49 Mbit/s were recorded for the two systems, respectively. Regarding latency, the SDR-based prototype exhibited an average round-trip time of 44 ms with a minimum value of 28 ms, while the commercial solution achieved an average latency of 47 ms and a minimum of 27 ms.
Overall, both testbeds deliver satisfactory performance within the specified parameters, with differences arising from BW discrepancies. The open-source research setup demonstrates promising performance despite software-imposed bandwidth limitations, revealing enhanced flexibility and reproducibility. One notable limitation of the SDR-based prototype is the instability in UE connectivity. Repeated tests under identical configurations showed intermittent UE attachment, which can complicate reliable performance evaluation. Still, the research SDR-based testbed confirms its strong suitability for research and disaggregated RAN prototyping. Taken altogether, this study confirms the feasibility of the proposed implementation approaches and provides a framework for future comparative evaluations.
For future studies, it is important that both the open-source research and commercial implementations consider a wider range of UE models, the inclusion of virtual UEs, and alternative frequency bands or radio configurations. Additionally, a key direction for subsequent research involves simulating the impact of varying TDD slot ratios on asymmetric downlink and uplink throughput, evaluating how modified DL/UL frame structures affect the processing overhead and stability of the srsRAN and Accelleran stacks. Such enhancements would provide a more comprehensive evaluation of system performance and better reflect the diverse scenarios encountered in real-world disaggregated RAN deployments, thereby creating more robust and scalable network designs. Moreover, the architectures developed in this study could be extended to incorporate a near-RT RIC, a key component that would transform the network from a disaggregated RAN into Open RAN, enabling advanced control and optimization functionalities. Overall, these considerations are crucial for guiding the evolution of Open RAN solutions from experimental setups to fully deployable networks capable of meeting the demands of modern mobile communication systems.

Author Contributions

Conceptualization, T.-C.S. and A.M.; Methodology, T.-C.S. and A.M.; Software, R.-M.M.; Validation, T.-C.S., R.-M.M., E.S. and A.M.; Formal Analysis, T.-C.S. and E.S.; Investigation, T.-C.S., R.-M.M. and E.S.; Resources, R.-M.M. and C.P.-S.; Data Curation, T.-C.S., E.S. and R.-M.M.; Writing—Original Draft Preparation, T.-C.S., R.-M.M. and E.S.; Writing—Review and Editing, T.-C.S., E.S. and A.M.; Visualization, E.S.; Supervision, T.-C.S. and R.-M.M.; Project Administration, T.-C.S.; Funding Acquisition, R.-M.M., A.M. and C.P.-S. All authors have read and agreed to the published version of the manuscript.

Funding

The research leading to these results, as well as the Article Processing Charge (APC), was funded by the European Commission's Digital Europe Programme (DIGITAL) research and innovation program under grant agreement #101127973, 5G-TACTIC "5G Trusted And seCure network servICes" project and by the grant of the Ministry of Research, Innovation and Digitization, CCCDI-UEFISCDI, project number PN-IV-P7-7.1-PED-2024-0741, within PNCDI IV.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The experimental work was conducted in the Orange Future Networks 6G Lab, an advanced experimental facility created in partnership between Orange Romania and the CAMPUS research centre of NUSTPB.

Conflicts of Interest

Authors Razvan-Marius Mihai and Cristian Patachia-Sultanoiu are employed by the company Orange Romania. The remaining authors declare that this research was conducted in the absence of any commercial or financial relationships that could be construed as potential conflicts of interest. The authors declare that this study received funding from the European Commission's Digital Europe Programme (DIGITAL) research and innovation program under grant agreement #101127973, 5G-TACTIC "5G Trusted And seCure network servICes" project and from the grant of the Ministry of Research, Innovation and Digitization, CCCDI-UEFISCDI, project number PN-IV-P7-7.1-PED-2024-0741, within PNCDI IV. The funders were not involved in the study design, collection, analysis, interpretation of data, the writing of this article or the decision to submit it for publication.

Abbreviations

The following abbreviations are used in this manuscript:
3GPP	3rd Generation Partnership Project
4G	Fourth-generation
5G	Fifth-generation
5GC	5G Core
AMF	Access and Mobility Management Function
APN	Access Point Name
ARFCN	Absolute Radio Frequency Channel Number
AUSF	Authentication Server Function
BBU	Baseband Unit
BS	Base Station
BSF	Binding Support Function
BW	Bandwidth
CAMPUS	Center for Advanced Research on New Materials, Products and Innovative Processes
COTS	Commercial Off-the-Shelf
CPE	Customer Premises Equipment
CPU	Central Processing Unit
CU	Centralized Unit
CN	Core Network
CQI	Channel Quality Indicator
DL	Downlink
DU	Distributed Unit
eMBB	enhanced Mobile Broadband
FC	Fraunhofer Core
FR1	Frequency Range 1
GPS	Global Positioning System
HW	Hardware
IMSI	International Mobile Subscriber Identity
IP	Internet Protocol
IPv4	IP version 4
IPv6	IP version 6
JSON	JavaScript Object Notation
LTE	Long-Term Evolution
MCC	Mobile Country Code
MCS	Modulation and Coding Scheme
MIMO	Multiple Input Multiple Output
MNC	Mobile Network Code
NAT	Network Address Translation
NDAC	Nokia Digital Automation Cloud
NR	New Radio
NRF	Network Repository Function
NSA	Non-Standalone
NSSF	Network Slice Selection Function
NUSTPB	National University of Science and Technology Politehnica Bucharest
OAI	OpenAirInterface
O-FH	Open Fronthaul
OOB	Out-of-band
O-RAN	Open Radio Access Network
P2P	Point-to-point
PCF	Policy Control Function
PLMN	Public Land Mobile Network
QAM	Quadrature Amplitude Modulation
RAN	Radio Access Network
RB	Resource Block
RF	Radio Frequency
RIC	RAN Intelligent Controller
RRH	Remote Radio Head
RSRP	Reference Signal Received Power
RSRQ	Reference Signal Received Quality
RTT	Round-Trip Time
RU	Radio Unit
SA	Standalone
SCP	Service Communication Proxy
SCS	Subcarrier Spacing
SDR	Software-Defined Radio
SEPP	Security Edge Protection Proxy
SIM	Subscriber Identity Module
SINR	Signal-to-Interference-plus-Noise Ratio
SISO	Single Input Single Output
SoC	System-on-Chip
SMF	Session Management Function
srsRAN	Software Radio Systems RAN Project
SUCI	Subscription Concealed Identifier
SW	Software
TCXO	Temperature-Compensated Crystal Oscillator
TDD	Time Division Duplexing
TIP	Telecom Infra Project
UE	User Equipment
UHD	USRP Hardware Driver
UDM	Unified Data Management
UDR	Unified Data Repository
UiA	University of Agder
UL	Uplink
UPF	User Plane Function
URLLC	Ultra-Reliable Low-Latency Communications
USIM	Universal Subscriber Identity Module
USRP	Universal Software Radio Peripheral
vBBU	virtualized Baseband Unit
VoLTE	Voice over LTE
VoNR	Voice over NR

References

  1. O-RAN Alliance. 2025. Available online: https://www.o-ran.org (accessed on 10 September 2025).
  2. Telecom Infra Project (TIP). 2025. Available online: https://telecominfraproject.com/ (accessed on 10 September 2025).
  3. 3rd Generation Partnership Project (3GPP). 2025. Available online: https://www.3gpp.org/ (accessed on 10 September 2025).
  4. Bahl, P.; Balkwill, M.; Foukas, X.; Kalia, A.; Kim, D.; Kotaru, M.; Lai, Z.; Mehrotra, S.; Radunovic, B.; Saroiu, S.; et al. Accelerating Open RAN Research Through an Enterprise-scale 5G Testbed. In Proceedings of the 29th Annual International Conference on Mobile Computing and Networking, Madrid, Spain, 2–6 October 2023; p. 138. [Google Scholar]
  5. Gao, Y.; Zhang, X.; Yuan, H. Integration and Connection Test for OpenAirInterface 5G Standalone System. In Proceedings of the 2021 IEEE 3rd International Conference on Civil Aviation Safety and Information Technology (ICCASIT), Changsha, China, 20–22 October 2021; pp. 1010–1014. [Google Scholar]
  6. Mehran, F.; Turyagyenda, C.; Kaleshi, D. Experimental Evaluation of Multi-Vendor 5G Open RANs: Promises, Challenges, and Lessons Learned. IEEE Access 2024, 12, 152241–152261. [Google Scholar] [CrossRef]
  7. Mihai, R.; Craciunescu, R.; Martian, A.; Li, F.Y.; Patachia, C.; Vochin, M.C. Open-Source Enabled Beyond 5G Private Mobile Networks: From Concept to Prototype. In Proceedings of the 2022 25th International Symposium on Wireless Personal Multimedia Communications (WPMC), Herning, Denmark, 30 October–2 November 2022; pp. 181–186. [Google Scholar]
  8. Tufeanu, L.M.; Martian, A.; Vochin, M.C.; Paraschiv, C.L.; Li, F.Y. Building an Open Source Containerized 5G SA Network through Docker and Kubernetes. In Proceedings of the 2022 25th International Symposium on Wireless Personal Multimedia Communications (WPMC), Herning, Denmark, 30 October–2 November 2022; pp. 381–386. [Google Scholar]
  9. Ansari, J.; Andersson, C.; de Bruin, P.; Farkas, J.; Grosjean, L.; Sachs, J.; Torsner, J.; Varga, B.; Harutyunyan, D.; König, N.; et al. Performance of 5G Trials for Industrial Automation. Electronics 2022, 11, 412. [Google Scholar] [CrossRef]
  10. Mallikarjun, S.B.; Schellenberger, C.; Hobelsberger, C.; Schotten, H.D. Performance Analysis of a Private 5G SA Campus Network. In Proceedings of the Mobile Communication—Technologies and Applications; 26th ITG-Symposium, Osnabrueck, Germany, 18–19 May 2022; pp. 1–5. [Google Scholar]
  11. srsRAN Project. 2025. Available online: https://www.srslte.com/ (accessed on 9 September 2025).
  12. OpenAirInterface. 2025. Available online: https://openairinterface.org/ (accessed on 9 September 2025).
  13. BubbleRAN. 2025. Available online: https://bubbleran.com/ (accessed on 11 September 2025).
  14. Open5GS. 2025. Available online: https://open5gs.org/ (accessed on 25 July 2025).
  15. Accelleran. 2025. Available online: https://accelleran.com/ (accessed on 25 July 2025).
  16. Vilakazi, M.; Burger, C.R.; Mboweni, L.; Mamushiane, L.; Lysko, A.A. Evaluating an Evolving OAI Testbed: Overview of Options, Building Tips, and Current Performance. In Proceedings of the 2021 7th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, 19–20 March 2021; pp. 818–827. [Google Scholar]
  17. Dludla, G.; Vilakazi, M.; Burger, C.R.; Lysko, A.A.; Ngcama, L.; Mboweni, L.; Masonta, M.; Kobo, H.; Mamushiane, L. Testing Performance for Several Use Cases with an Indoor OpenAirInterface USRP-Based Base Station. In Proceedings of the 2022 5th International Conference on Multimedia, Signal Processing and Communication Technologies (IMPACT), Aligarh, India, 26–27 November 2022; pp. 1–5. [Google Scholar]
  18. Amini, M.; El-Ashmawy, A.; Rosenberg, C. Implementing an Open 5G Standalone Testbed: Challenges and Lessons Learnt. In Proceedings of the IEEE INFOCOM 2023—IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS), Hoboken, NJ, USA, 20–20 May 2023; pp. 1–2. [Google Scholar]
  19. Sahbafard, A.; Schmidt, R.; Kaltenberger, F.; Springer, A.; Bernhard, H.P. On the Performance of an Indoor Open-Source 5G Standalone Deployment. In Proceedings of the 2023 IEEE Wireless Communications and Networking Conference (WCNC), Glasgow, UK, 26–29 March 2023; pp. 1–6. [Google Scholar]
  20. Håkegård, J.E.; Lundkvist, H.; Rauniyar, A.; Morris, P. Performance Evaluation of an Open Source Implementation of a 5G Standalone Platform. IEEE Access 2024, 12, 25809–25819. [Google Scholar] [CrossRef]
  21. Dória, M.; de Sousa Jr, V.A.; Campos, A.; Oliveira, N.; Eduardo, P.; Filho, P.; Lima, C.; Guilherme, J.; Luna, D.; Diógenes, I.; et al. Virtualized 5G Testbed using OpenAirInterface: Tutorial and Benchmarking Tests. J. Internet Serv. Appl. 2024, 15, 523–535. [Google Scholar] [CrossRef]
  22. Marțian, A.; Trifan, R.F.; Stoian, T.C.; Vochin, M.C.; Li, F.Y. Towards Open RAN in beyond 5G networks: Evolution, architectures, deployments, spectrum, prototypes, and performance assessment. Comput. Netw. 2025, 259, 111087. [Google Scholar] [CrossRef]
  23. Orange Future Networks 6G Lab Within the Campus Research Center. 2025. Available online: https://futurenetworks.ro/ (accessed on 15 June 2025).
  24. USRP B200. 2025. Available online: https://www.ettus.com/all-products/ub200-kit/ (accessed on 29 November 2025).
  25. MongoDB. 2025. Available online: https://www.mongodb.com/ (accessed on 9 September 2025).
  26. UHD Software API, Ettus Research. 2025. Available online: https://www.ettus.com/sdr-software/uhd-usrp-hardware-driver/ (accessed on 9 September 2025).
  27. Building Open5GS from Sources. 2025. Available online: https://open5gs.org/open5gs/docs/guide/02-building-open5gs-from-sources/ (accessed on 20 October 2025).
  28. srsRAN gNB with COTS UEs. 2025. Available online: https://docs.srsran.com/projects/project/en/latest/tutorials/source/cotsUE/source/index.html (accessed on 20 October 2025).
  29. Pysim. 2025. Available online: https://pypi.org/project/pysim/ (accessed on 9 September 2025).
  30. Guide: Enabling 5G SUCI. 2025. Available online: https://downloads.osmocom.org/docs/pysim/master/html/suci-tutorial.html (accessed on 13 August 2025).
  31. pySim-prog. 2025. Available online: https://osmocom.org/projects/pysim/wiki/PySim-prog (accessed on 20 October 2025).
  32. Identifying USRP Devices. 2025. Available online: https://files.ettus.com/manual/page_identification.html (accessed on 20 October 2025).
  33. Benetel RAN550, Indoor 5G Radio Unit. 2025. Available online: https://benetel.com/ran550/ (accessed on 25 July 2025).
  34. Orange Speedtest. 2025. Available online: http://www.speedtest.ro/ (accessed on 25 July 2025).
  35. Mobile Signal Strength Recommendations. 2025. Available online: https://wiki.teltonika-networks.com/view/Mobile_Signal_Strength_Recommendations (accessed on 24 July 2025).
  36. 3GPP. 5G; NR; Physical Layer Procedures for Data (Version 15.14.0 Release 15). Technical Report TS 138 214, 3rd Generation Partnership Project (3GPP), 2021. Available online: https://www.etsi.org/deliver/etsi_ts/138200_138299/138214/15.14.00_60/ts_138214v151400p.pdf (accessed on 2 March 2026).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
