Dynamic Spectrum Sharing for Future LTE-NR Networks
Sensors 2021, 21, 4215

5G is the next mobile generation, already being deployed in some countries. It is expected to revolutionize our society, and its target requirements are extremely demanding. The use of spectrum is therefore tremendously important, as it is a limited and expensive resource. One solution for improving spectrum efficiency is dynamic spectrum sharing, where an operator can share the spectrum between two different technologies. In this paper, we studied the concept of dynamic spectrum sharing between LTE and 5G New Radio. We presented a solution that allows operators to offer both LTE and New Radio services using the same frequency bands, although in an interleaved mode. We evaluated the performance, in terms of throughput, of a communication system using the dynamic spectrum sharing feature. The results obtained led to the conclusion that using dynamic spectrum sharing comes at the cost of at most a 25% loss in throughput. Nevertheless, the decrease is not that substantial, considering that the mobile network operator does not need to buy an additional 15 MHz of bandwidth, instead using the already existing LTE bandwidth to offer 5G services, leading to cost reduction and an increase in spectrum efficiency.


Introduction
The next mobile generation following Long Term Evolution (LTE), whose deployment has already started in some countries, is the 5th Generation (5G). The main focus of LTE was to increase data transfer rates, while 5G is expected to revolutionize our society, focusing not only on delivering extreme mobile broadband, but also on critical machine communication and massive machine communication. New applications will emerge, and the target values and requirements proposed are extremely demanding [1]. The main challenges of 5G systems consist of increasing data transfer rates, reducing latency, and increasing capacity, spectrum efficiency, and network energy efficiency, which will be necessary for the different application scenarios [2].
The current network architecture cannot sustain all the requirements and target values of 5G New Radio (NR). Therefore, the 3rd Generation Partnership Project (3GPP) released two variants of the new network architecture for 5G NR communication systems: Non-standalone (NSA) and Standalone (SA) [3]. The main difference between the two is that the NSA architecture is based on and depends on the LTE core network, while the SA architecture uses a novel next-generation core network, not depending on any LTE infrastructure.
The Internet of Things (IoT), mobile internet, and Cognitive Radio (CR) stand as relevant driving forces for 5G development [4][5][6][7]. The IoT has the potential to connect almost everything to the internet, which will lead to massive growth in the number of devices requiring network access. In particular, in 2018, there were approximately 22 billion connected devices, the equivalent of around 2.9 devices/person. It is clear that the frequency spectrum is a scarce and limited resource that constitutes an important factor in mobile communication systems, as does its related cost for the Mobile Network Operator (MNO). In this context, new spectrum explorations [11,12], higher energy efficiency [13], and dynamic spectrum usage [14][15][16][17][18] have become the new features of communication networks.
The topic of spectrum sharing in the bands of older communication systems started drawing the attention of researchers, as it is the safest and most economical solution [19]. 3GPP started the standardization procedure for the spectrum sharing principles in March 2017. One of the solutions presented regarding spectrum allocation for 5G NR systems comprises the use of the existing frequency spectrum already used by previously deployed mobile generations.
Spectrum sharing is based on the flexibility of the physical layer and the fact that, in the LTE network, all channels are assigned in the time-frequency domain. This way, the flexibility of the 5G NR radio interface can be used for reference signals, allowing dynamic configuration and minimizing collisions between NR and LTE during simultaneous data transmission. Consequently, there is the possibility of sharing a frequency domain within the same communication channel. A comprehensive overview of the different ways of spectrum sharing investigated in recent years is found in [14]. In addition, in [20,21], new schemes and algorithms for dynamic spectrum sharing between Global System for Mobile Communications (GSM) and LTE technologies were investigated. Regarding the IoT, spectrum sharing is a preferable approach to cope with the conflicts between massive IoT connections and limited spectrum resources, as discussed in [22][23][24][25]. It can also be used to address vertical requirements and the competition between MNOs in the acquisition of frequency bands [26,27], as well as to improve spectrum utilization in Cognitive Radio (CR) and TV white space [28][29][30]. The leading mobile producers have shown massive interest in developing solutions for dynamic spectrum sharing; these are presented in Table 2. The Dynamic Spectrum Sharing (DSS) solution allows mobile network operators to offer LTE and NR services using the same frequency bands, although in an interleaved mode. This enables NR services without the need to acquire new and dedicated frequency spectrum, antennas, or radio frequency units. The solution is intended to assist operators in the short-term rollout of 5G services through the spectrum already in use by LTE. It is not intended to provide substantial performance, as that would necessitate new dedicated spectrum for NR, but to provide coverage, reduce costs, and improve spectrum efficiency for the operator.
Figure 1 presents the DSS technology with LTE and NR sharing the same frequency band in comparison to using two separate bands for each technology.
The deploying of DSS technology is divided into two phases: Phase 1, which is based only on the NSA architecture and accepts a sharing ratio between 20 and 60% with a fixed UL sharing ratio; and Phase 2, which introduces a dynamic UL sharing ratio and accepts both NSA and SA architectures. The main differences between both phases are presented in Table 3. The sharing ratio refers to the ratio of shared resources between both technologies. For example, considering a sharing ratio of 20% refers to 5G NR occupying 20% of the available resources, while LTE occupies 80%. Another example is for a sharing ratio of 60%, meaning that 5G NR uses 60% of the available resources while LTE uses only 40%.
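The sharing-ratio arithmetic described above can be made concrete with a small helper. The following is an illustrative sketch (the function name and the rounding behavior are ours, not from the paper): given a pool of resources and an NR sharing ratio, it returns the NR and LTE shares, so a 20% ratio gives NR 20% of the resources and LTE the remaining 80%.

```python
def split_resources(total_prbs: int, nr_sharing_ratio: float) -> tuple:
    """Split a pool of resources between NR and LTE for a given NR sharing ratio.

    Illustrative helper: the NR share is rounded to a whole number of
    resources, and LTE receives whatever remains.
    """
    nr = round(total_prbs * nr_sharing_ratio)
    return nr, total_prbs - nr
```

For example, `split_resources(100, 0.6)` reproduces the second example in the text: NR uses 60 of 100 resources while LTE uses only 40.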
For downlink, the allocation of the subframes is based on Time Division Multiplexing (TDM). In one frame, regardless of the sharing ratio adopted, subframes 0, 5, and 9 are strictly dedicated to LTE transmission. Subframes 1, 2, 3, 4, 6, 7, and 8 can be used for both LTE and NR transmission, depending on the sharing ratio and the architecture mode adopted, see Figure 2 [32].

The downlink resource allocation, when considering the transmission of several frames, varies depending on the sharing ratio implemented [33]. Different patterns are depicted in Figure 3, for the NSA architecture mode. It can be observed that for every frame, subframes 0, 5, and 9 are always dedicated to LTE. In addition, for the first frame only, slot 1 is represented with yellow (slot type B) and slot 2 with orange (slot type B*), and both are used for transmitting synchronization and CSI-RS signals, respectively. Additional synchronization signals are sent with a period of 20 ms in slot 1 of the remaining frames, represented by green (slot type B**). The remaining slots are used for LTE and NR transmission.

For uplink, the allocation of the resources is based on Frequency Division Multiplexing (FDM). As depicted in Figure 4, the Physical Resource Blocks (PRB) available for LTE or NR transmission are represented by the color green and depend on the sharing ratio adopted and the carrier bandwidth. Furthermore, there are seven PRBs dedicated to NR UL transmission only, represented by the color yellow at the right outer edge of the frequency band. These PRBs depend on the positioning of the LTE PRACH PRBs, which in Figure 4 are located at the left outer edge of the frequency band. If the LTE PRACH PRBs were located at the right outer edge, the seven NR UL PRBs would instead be positioned at the left outer edge.

The calculation of the maximum available NR/LTE sharing ratio is given by the following equation, where N represents the number of PRBs that are necessary for LTE transmission, for instance, the LTE PRACH PRBs.
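The fixed downlink TDM pattern described above lends itself to a simple lookup. The sketch below (our own illustration, not code from the paper) classifies each subframe of one 10-subframe LTE frame as either LTE-only or shareable between LTE and NR.

```python
# Fixed DL TDM pattern within one 10-subframe LTE frame under DSS:
LTE_ONLY_SUBFRAMES = {0, 5, 9}            # always dedicated to LTE
SHARED_SUBFRAMES = {1, 2, 3, 4, 6, 7, 8}  # LTE or NR, depending on the sharing ratio

def classify_subframe(index: int) -> str:
    """Classify one subframe of an LTE frame under the DSS downlink TDM split."""
    if index not in range(10):
        raise ValueError("an LTE frame has subframes 0-9")
    return "LTE-only" if index in LTE_ONLY_SUBFRAMES else "shared"
```

Note that the shareable set caps how much of a frame NR can ever occupy: with 3 of 10 subframes reserved for LTE, at most 70% of the downlink subframes are available to NR, consistent with the 70% upper bound on the sharing ratios measured later in the paper.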
In this paper, we studied the concept of dynamic spectrum sharing between LTE and 5G NR technologies for the same mobile network operator. We assessed the performance of an LTE-NR communication system using the DSS feature, in terms of throughput, using different sharing ratios for both NSA and SA architectures and for both transmission directions (downlink and uplink). We performed a comparison of the performance while using different modulation schemes and numbers of layers. The remainder of the paper is organized as follows. Section 2 presents the sharing ratio calculation for downlink and uplink; Section 3 provides the equipment and methods used for the measurements; and Section 4 presents the results obtained and analysis. Lastly, Section 5 delivers the conclusions of this paper.

Sharing Ratio Calculation
The sharing ratio between LTE and NR is defined and managed by a new system unit denominated the Common Resource Manager (CRM). It is responsible for computing the sharing ratio and updating it according to traffic demands. To do so, the CRM continuously gathers information from both LTE and NR sites. The CRM component is composed of three objects: the CRM, situated in the base station; the LTE CRM, situated in the LTE system unit; and the NR CRM, situated in the NR system unit. Figure 5 presents the main responsibilities of the CRM. It starts by gathering information from the LTE and NR sites and, based on the information it receives, defines the resources to be shared. It then allocates the shared resources to both technologies. According to the traffic conditions and demands at a specific moment, it evaluates the optimal sharing ratio to be selected and finally updates it for LTE and NR.
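The CRM cycle (gather loads, evaluate a ratio, update both technologies) can be sketched as a minimal stateful object. This is purely illustrative: the class name, the load-proportional policy, and the 20-70% clamp (taken from the NSA range reported in the paper) are our assumptions, not the vendor implementation.

```python
class CommonResourceManager:
    """Illustrative sketch of the CRM responsibilities described in the text:
    gather LTE/NR load information, evaluate a sharing ratio, and update it."""

    def __init__(self, initial_ratio: float = 0.2):
        self.sharing_ratio = initial_ratio  # NR share of the resources

    def update(self, lte_load: float, nr_load: float) -> float:
        # Assumed policy: give NR a share proportional to its fraction of the
        # total offered load, clamped to the 20-70% range used for NSA.
        total = lte_load + nr_load
        if total > 0:
            self.sharing_ratio = min(0.7, max(0.2, nr_load / total))
        return self.sharing_ratio
```

In this sketch, an NR-only load pushes the ratio to the 70% cap, while an LTE-only load pushes it to the 20% floor.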
For downlink, depending on the traffic demands at a specific moment, the CRM receives information concerning load indication and takes a decision on the DL sharing ratio that needs to be adopted. To calculate the DL load, the weighted load (based on PRB occupancy), the average LTE DSS Guaranteed Bit Rate (GBR) load, the NR DSS GBR load, and the NR PDCCH load need to be determined [35]. The algorithm for the sharing ratio calculation for DL is presented in Figure 6. The first step consists of the verification, by the CRM entity, of the average LTE GBR load, as well as the NR PDCCH load, against a defined threshold, so that a decision can be taken regarding the resources to be assigned. If the average LTE GBR load is higher than 70% and the NR PDCCH load is lower than 70%, then the sharing ratio for NR will decrease. Else, if the average LTE GBR load is equal to or lower than 70% and the NR PDCCH load is higher than 70%, then the sharing ratio for NR will increase. Lastly, if both the average LTE GBR load and the NR PDCCH load are higher than 70%, then one of the two following conditions is applied:

• If the LTE GBR resource delta (n; n − 1) > 0, the sharing ratio for NR will be reduced;
• If the LTE GBR resource delta (n; n − 1) ≤ 0 and the NR PDCCH resource delta (n; n − 1) > 0, the sharing ratio for NR will increase.

The second step of the algorithm is based on the load information received from step 1. The CRM then calculates the LTE weighted load and the NR weighted load, from which the LTE and NR total loads are determined. Finally, in step 3, the number of LTE and NR subframes is calculated, taking into account the LTE and NR total loads from step 2. The resulting number of subframes corresponds to a specific sharing ratio value.

For uplink, a similar procedure to that for downlink is followed. Depending on the traffic demands at a specific moment, the CRM receives information concerning load indication and takes a decision on the UL sharing ratio that needs to be adopted. To calculate the UL load, the weighted load (based on PRB occupancy) and the average LTE DSS GBR load need to be determined. Figure 7 presents the algorithm for the UL sharing ratio calculation. In step 1, the average LTE GBR load is verified by the CRM, so a decision can be taken regarding the assignment of resources. If the average LTE GBR load is higher than 70%, then the sharing ratio for NR will be decreased. The second step of the algorithm is based on the load information received from step 1. The CRM then calculates the LTE weighted load and the NR weighted load, from which the LTE and NR total loads are determined. Finally, in step 3, the number of LTE and NR subframes is calculated, taking into account the LTE and NR total loads from step 2. The resulting number of subframes corresponds to a specific sharing ratio value.
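The first-step downlink decision rule can be expressed compactly in code. The sketch below follows the 70% thresholds and the delta tie-break described in the text; the function name, argument names, and the `"keep"` fallback (the text does not say what happens when both loads are at or below the threshold) are our assumptions.

```python
def dl_sharing_decision(avg_lte_gbr_load: float,
                        nr_pdcch_load: float,
                        lte_gbr_delta: float,
                        nr_pdcch_delta: float,
                        threshold: float = 0.70) -> str:
    """Step 1 of the DL sharing-ratio algorithm: decide whether the NR share
    should "increase", "decrease", or "keep", from the two load thresholds.

    The deltas compare resource usage between measurement windows n and n-1.
    """
    lte_high = avg_lte_gbr_load > threshold
    nr_high = nr_pdcch_load > threshold
    if lte_high and not nr_high:
        return "decrease"           # LTE is congested, NR is not
    if not lte_high and nr_high:
        return "increase"           # NR is congested, LTE is not
    if lte_high and nr_high:
        # Both congested: break the tie on the resource deltas
        if lte_gbr_delta > 0:
            return "decrease"
        if nr_pdcch_delta > 0:
            return "increase"
    return "keep"                   # assumed fallback, not stated in the text
```

Steps 2 and 3 would then translate the decision into weighted loads and a subframe count, which maps back to a concrete sharing-ratio value.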

Equipment and Methods
This section presents the parameters adopted for our work, as well as the scenarios tested. We considered a MIMO system composed of one base station for LTE, one for NR, and one mobile station based on a Qualcomm chipset prototype that is frequently used in commercial Samsung devices. We used both 64QAM and 256QAM modulation for the measurements. A bandwidth of 15 MHz was selected. The Absolute Radio-Frequency Channel Numbers (ARFCN) for NR were 175,800 for downlink and 166,800 for uplink. The NR-ARFCN is a code that refers to the carrier frequency to be used for both transmission directions of the radio channel and is defined in the 3GPP TS 38.104 Release 16 specification [36]. The NR-ARFCN can be converted to frequency: 175,800 corresponds to 879 MHz for downlink and 166,800 to 834 MHz for uplink. Frequency Division Duplex (FDD) was selected for all cases. We performed throughput measurements using physical and static equipment from the Nokia Networks R&D laboratory, considering a Signal-to-Interference-plus-Noise Ratio (SINR) higher than 25 dB and a Reference Signal Received Power (RSRP) higher than −70 dBm, with Line of Sight (LoS) and without the presence of fading. These are standard values used at the laboratory for testing the performance of new technologies. They are considered very good radio conditions, and the reason for choosing them is to create almost ideal radio conditions in order to verify and confirm the aptness of DSS technology and its peak performance using physical measurements, as it is a technology under development and testing. We considered both NSA and SA architectures. For NSA, we measured using sharing ratio values between 20 and 70%. For SA, we measured using sharing ratio values between 30 and 70%. Table 4 below summarizes the scenario parameters adopted for this work.
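The NR-ARFCN-to-frequency conversion follows the global frequency raster of 3GPP TS 38.104: F_REF = F_REF-Offs + ΔF_Global × (N_REF − N_REF-Offs), where for the lowest range (0 ≤ N_REF < 600,000) the step ΔF_Global is 5 kHz and both offsets are zero. A minimal sketch handling only that range:

```python
def nr_arfcn_to_mhz(n_ref: int) -> float:
    """Map an NR-ARFCN to its carrier frequency in MHz.

    Implements the 3GPP TS 38.104 global frequency raster for the lowest
    range only (0 <= N_REF < 600000), where the raster step is 5 kHz and
    the frequency and channel-number offsets are both zero.
    """
    if not 0 <= n_ref < 600_000:
        raise ValueError("only the sub-3 GHz raster range is implemented")
    return n_ref * 5 / 1000  # 5 kHz steps, expressed in MHz
```

This reproduces the values used in the measurements: ARFCN 175,800 maps to 879 MHz (downlink) and 166,800 to 834 MHz (uplink).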

Each network architecture type has different possible variations. The NSA architecture is based on the LTE core network and uses LTE-based interfaces. For this type of architecture, the gNodeB needs to support these interfaces and acts as a secondary node, while the eNodeB acts as a primary, or master, node. There are several options to deploy an NSA architecture: options 3, 3a, 3x, 4, 4a, 7, and 7a. The option used for our measurements is NSA option 3x, where the control plane is routed through the master eNodeB and the user plane is routed directly through the secondary gNodeB. The eNodeB also communicates directly with the gNodeB, and both communicate directly with the Evolved Packet Core (EPC). The SA architecture has two options: option 2 and option 5. Option 2 is the one adopted for our work and, as can be seen in Figure 8b, consists of a Next Generation Core (NGC) and a gNodeB that communicates directly with it, without needing any support from LTE structures. For both network architectures studied, we used two radio modules, each with one attenuator attached, as the measurements were performed in a laboratory in close proximity to the mobile user. We used either 2 or 4 antennas, depending on the case studied.

Results and Discussion
This section presents the results obtained from our measurements. We divide it into two subsections: downlink and uplink results. In each subsection, we present results for both NSA and SA architectures. We considered sharing ratios ranging from 20 to 70% for the NSA architecture and from 30 to 70% for the SA architecture, meaning that 5G NR occupies 20-70% and 30-70% of the available resources, while LTE occupies the remaining 80-30% and 70-30%, for the NSA and SA architectures, respectively. Figure 9 presents the throughput results using the DSS feature for the first four cases of Table 2, comprising the NSA architecture. The differences between the cases consist of the modulation type, either 64QAM or 256QAM, and the MIMO configuration, either 2 × 2 or 4 × 4. Each case depicts five different curves. The green (NR only) and purple (LTE only) curves represent the throughput achieved without the use of DSS technology. The yellow curve (DSS LTE + NR) is the most important one, as it presents the total throughput obtained with DSS. The remaining blue (DSS LTE) and red (DSS NR) curves represent the individual throughputs for each technology while using DSS. Notice that the DSS LTE + NR throughput equals the sum of the individual DSS LTE and DSS NR throughputs. It can be observed that, for all cases, when the sharing ratio increased, the DSS NR throughput also increased while the DSS LTE throughput decreased. This was expected, as the higher the sharing ratio, the more resources are available for NR transmission and the fewer for LTE transmission. It can also be observed that from a sharing ratio of approximately 57%, meaning that NR occupies 57% of the resources while LTE occupies 43%, the individual throughput for NR surpassed the LTE throughput. In addition, it is visible that the overall throughput using DSS was slightly lower than the LTE-only or NR-only throughputs.
This is understandable, as the available frame resources are shared between both technologies.
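The ~57% crossover can be rationalized with a simple proportional model (our inference, not a calculation from the paper): if each unit of NR-shared resources yields throughput eff_nr and each unit of LTE resources yields eff_lte, the NR curve r·eff_nr meets the LTE curve (1 − r)·eff_lte at r = eff_lte / (eff_nr + eff_lte). A crossover near 57% would then imply that NR delivered roughly 75% of LTE's per-resource throughput in this setup.

```python
def crossover_ratio(eff_nr: float, eff_lte: float) -> float:
    """Sharing ratio r at which NR throughput r * eff_nr equals LTE
    throughput (1 - r) * eff_lte, under a simple proportional model."""
    return eff_lte / (eff_nr + eff_lte)
```

With equal per-resource efficiencies, the crossover sits at exactly 50%; the observed shift toward 57% is consistent with NR carrying some extra overhead on the shared resources.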

Downlink
Comparing case 1 and case 2, where the difference is the modulation type, which increases from 64QAM to 256QAM, it can be concluded that the major difference lies in the maximum throughput values. For case 1, depending on the sharing ratio adopted, between 90 and 100 Mbps were obtained, while for case 2, the throughput values were approximately 120-135 Mbps, an increase of 35%. A similar comparison can be conducted for cases 3 and 4. For case 3, the maximum DSS throughput values were 175-200 Mbps, while for case 4, the values varied between 240 and 260 Mbps, depending on the sharing ratio. For these cases, an increase of 37% in throughput was observed for a sharing ratio of 20%, while for a 70% sharing ratio, the increase was 30% (see Table 5).
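The reported gains are simple relative increases and can be checked directly against the measured endpoints quoted above:

```python
def relative_gain(before_mbps: float, after_mbps: float) -> float:
    """Relative throughput increase between two measurements, in percent."""
    return (after_mbps - before_mbps) / before_mbps * 100

# Endpoints quoted for the NSA downlink cases:
#   case 1 -> case 2 (64QAM -> 256QAM): 100 -> 135 Mbps  (~35%)
#   case 3 -> case 4 at a 20% sharing ratio: 175 -> 240 Mbps  (~37%)
#   case 3 -> case 4 at a 70% sharing ratio: 200 -> 260 Mbps  (~30%)
```

Plugging in the endpoints reproduces the 35%, 37%, and 30% figures stated in the text.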

Results and Discussion
This section presents the results obtained from our measurements. We divide it into two subsections: downlink and uplink results. In each subsection, we present results for both NSA and SA architectures. We considered sharing ratios ranging from 20 to 70% for the NSA architecture and from 30 to 70% for the SA architecture, meaning that 5G NR occupies between 20 and 70%, and 30 and 70% of the available resources, while LTE occupies between 80 and 30%, and 70 and 30% for the NSA and SA architectures, respectively. Figure 9 presents the throughput results using the DSS feature from the first four cases of Table 2, comprising the NSA architecture. The differences between the cases consist of the modulation type, that is either 64QAM or 256QAM modulation, and MIMO type, that is either 2 × 2 or 4 × 4 MIMO. Each case depicts five different curves. The green (NR only) and purple (LTE only) curves represent the values for the throughput achieved without the use of DSS technology. The yellow curve (DSS LTE + NR) is the most important one as it presents the total throughput obtained with DSS. The remaining blue (DSS LTE) and red (DSS NR) curves represent the individual throughputs for each technology while using DSS. Notice that the DSS LTE + NR throughput equals the sum of the individual DSS LTE and DSS NR throughputs. It can be observed that, for all cases, when we increased the sharing ratio, the values for the DSS NR TP also increased while the DSS LTE TP values decreased. This was an expected behavior, as the higher the sharing ratio, the more resources will be available for NR transmission and the fewer for LTE transmission. It can also be observed that from a sharing ratio of approximately 57%, meaning that NR occupies 57% of the resources while LTE occupies 43%, the individual throughput for NR surpassed the LTE throughput. In addition, it is visible that the overall throughput using DSS was slightly lower than that for the LTE or NR-only throughputs. 
This is understandable, as the available frame resources are shared between the two technologies. Comparing case 1 and case 2, where the only difference is the increase in modulation from 64QAM to 256QAM, the major difference lies in the maximum throughput values. For case 1, depending on the sharing ratio adopted, between 90 and 100 Mbps were obtained, while for case 2, the throughput was approximately 120-135 Mbps, an increase of about 35%. A similar comparison can be made for cases 3 and 4. For case 3, the maximum DSS throughput values were 175-200 Mbps, while for case 4, the values varied between 240 and 260 Mbps, depending on the sharing ratio. For these cases, an increase of 37% in throughput was observed for a sharing ratio of 20%, while for a 70% sharing ratio the increase was 30% (see Table 5). Regarding case 1 and case 3, where the difference consists of the number of transmitting and receiving antennas (2 × 2 MIMO and 4 × 4 MIMO, respectively), an increase of approximately 50% in all throughput values could be observed, along with an increase in complexity. Figure 10 depicts the DL throughput results with the DSS feature for the first four cases from Table 2, this time using the SA architecture, in contrast to the results of Figure 9, which used the NSA architecture. The main difference between the two architectures is that NSA is an intermediary solution based on the LTE network, while the SA architecture does not depend on the LTE network in any way, using a Next-Generation Core (NGC) along with NR protocols. Moreover, the SA architecture leads to improved efficiency with less complexity. Comparing case 1 and case 2, the maximum DSS throughput values varied between 80 and 90 Mbps, and between 110 and 120 Mbps, respectively, an increase of approximately 36%.
For cases 3 and 4, the DSS values varied between 160 and 180 Mbps, and 200 and 240 Mbps, respectively. For these, an increase of approximately 29% was observed.
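These measured gains can be sanity-checked against simple theoretical bounds: the peak-rate gain from a higher modulation order is capped by the ratio of bits carried per symbol, and doubling the MIMO layers at most doubles the peak rate. The helper below is a generic check, not part of our measurement setup:

```python
from math import log2

def modulation_gain(qam_from, qam_to):
    """Theoretical peak-rate gain when moving between QAM orders."""
    return log2(qam_to) / log2(qam_from) - 1

# 64QAM carries 6 bits/symbol, 256QAM carries 8, so the bound is 8/6 - 1.
gain_256_over_64 = modulation_gain(64, 256)
print(f"64QAM -> 256QAM theoretical gain: {gain_256_over_64:.0%}")

# Doubling the spatial layers (2x2 -> 4x4) doubles the theoretical peak
# rate; the ~50% measured here falls short of that bound, as is typical
# once channel rank and signaling overhead are taken into account.
layer_gain = 4 / 2 - 1
print(f"2x2 -> 4x4 theoretical gain: {layer_gain:.0%}")
```

The ~30-37% modulation gains reported above sit close to the 33% bits-per-symbol bound, which suggests the link was operating near peak modulation efficiency in those measurements.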

Downlink
Regarding case 2 and case 4, the increase in the DSS throughput was approximately 87% for both sharing ratios of 30% and 70%. In addition, it can be remarked that for all cases, the NR-only throughput values for the SA architecture were smaller than those of the NSA architecture, with a difference of around 15 Mbps for case 1, 20 Mbps for cases 2 and 3, and 40 Mbps for case 4. The reason for this is that in the SA architecture, the number of broadcast signals is higher than in the NSA architecture; an example is the presence of System Information Block (SIB) signals, as well as paging with information regarding the cell. Table 6 provides the percentage loss of throughput that occurs when using DSS instead of an NR-only system. It can be observed that there was a loss in throughput between a minimum of 10% and a maximum of 26%, depending on the sharing ratio and case adopted. A decrease in throughput was expected, as with DSS the available resources are shared with LTE, and hence fewer resources are available for NR compared to a system based only on NR.
However, from the results obtained, a loss between 14 and 25% for the NSA architecture is not a considerable decrease, taking into account that no new dedicated spectrum needs to be allocated for NR, as it shares the bandwidth with the LTE technology. For case 1, the average loss was 19.8% for NSA and 17.2% for SA. For case 2, the loss was 17.5% for NSA and 15.6% for SA. For case 3, we had a 21.2% loss for NSA and 19% for SA. Lastly, for case 4, the loss was 20.6% for NSA and 19.2% for SA.
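For clarity, the loss figures quoted above are simply the relative drop of the DSS throughput with respect to an NR-only system. The sketch below illustrates the calculation with hypothetical sample values, not the actual Table 6 entries:

```python
def dss_loss_pct(tp_dss, tp_nr_only):
    """Percentage of throughput lost by running DSS instead of NR only."""
    return 100 * (1 - tp_dss / tp_nr_only)

# Hypothetical (DSS, NR-only) throughput pairs in Mbps, keyed by the
# sharing ratio in percent, for a single illustrative case.
samples = {30: (112, 130), 50: (106, 130), 70: (98, 130)}
losses = {r: dss_loss_pct(dss, nr) for r, (dss, nr) in samples.items()}
avg_loss = sum(losses.values()) / len(losses)
print({r: round(v, 1) for r, v in losses.items()}, f"avg = {avg_loss:.1f}%")
```

The per-case averages reported above (15.6-21.2%) are averages of exactly such per-sharing-ratio losses.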
We can observe that the decrease in the DSS throughput grew as the sharing ratio increased. This is because, as the sharing ratio increases (meaning that more of the available resources are used for NR transmission and fewer for LTE), more synchronization and overhead signals are transmitted in the slots dedicated to NR.

Uplink
Figure 11 depicts the UL throughput results for case 5 from Table 2, using both the NSA and SA architectures. Regarding the NSA architecture, it can be observed that the maximum throughput values for LTE only and NR only (without the DSS feature) were 40 and 22 Mbps, respectively. Moreover, comparing the DSS LTE + NR and NR-only throughputs of both architecture types, we can conclude that they were similar. Therefore, there is no difference between the SA and NSA architectures, as there are no additional channels that need to be transmitted and hence occupy extra resources.

Table 7 presents the percentage loss of throughput when using the DSS technology.
We can observe that for both architecture types, a maximum loss of 25% occurred for a sharing ratio of 70%, meaning that NR occupied 70% of the available resources while LTE occupied only 30%. Therefore, we can conclude that using the DSS technology comes with a compromise of a maximum 25% loss in throughput. However, the decrease is not that considerable, taking into account that, instead of needing an additional 15 MHz of bandwidth, the system shares the existing 15 MHz between both technologies. Consequently, from the operator's point of view, DSS brings the advantages of cost reduction and spectrum efficiency, while allowing the operator to present the "5G icon game" and provide coverage strategies.

Conclusions
In this paper, we analyzed the impact and the advantages of using the DSS technology in an LTE-NR communication system. We proposed different schemes for resource allocation according to the selected sharing ratio, compared the spectrum usage for LTE and NR with and without the DSS feature, and measured the throughput obtained for both LTE and NR using the proposed allocation schemes for each sharing ratio. The results provided insight into the behavior of a system with DSS and showed that the technology brings advantages from the operator's point of view, mainly regarding spectrum efficiency and cost reduction. Even though a maximum loss of 25% in throughput was observed, DSS offers the major advantage of not requiring extra dedicated bandwidth for NR: the MNO can re-use the already existing 15 MHz LTE bandwidth and does not need to buy a new dedicated 15 MHz for 5G services, leading to cost reduction and an optimization of spectrum usage.
In conclusion, the deployment of the DSS technology is useful, especially for the initial rollout of 5G NR, as the operator is able to build coverage strategies and present the initial 5G picture to the consumer through the already-in-use LTE spectrum.