
Future Internet 2019, 11(12), 257; https://doi.org/10.3390/fi11120257

Article
Spectrum Management Schemes for Internet of Remote Things (IoRT) Devices in 5G Networks via GEO Satellite
Department of Computer Systems Engineering, Tshwane University of Technology, Soshanguve 0152, South Africa
*
Author to whom correspondence should be addressed.
Received: 21 October 2019 / Accepted: 18 November 2019 / Published: 11 December 2019

Abstract:
The rapid growth of not just mobile devices but also Internet of Things (IoT) devices has introduced a new paradigm in mobile networks. This evolution, and the continuous need for radio access networks that are spectrum efficient and provide high data rates, low latency, and low energy consumption, has led to the emergence of fifth generation (5G) networks. Due to technical and economic limitations, the satellite air interface is expected to complement the 5G terrestrial air interface in the provision of 5G services, including IoT communications. More importantly, it is on this premise that the 5G satellite air interface is expected to provide network access to IoT devices in rural and remote areas, termed the Internet of Remote Things (IoRT). While this remains an interesting solution, several radio resource management issues exist. One of them, spectrum management in the 5G satellite segment as it affects IoRT communications, remains unclear. Hence, the aim of this paper is to investigate and recommend the spectrum management scheme that is most suitable not only for Human-to-Human communications but also for Machine-to-Machine communications in 5G satellite networks. To conduct this investigation, a new dynamic scheduling scheme suitable for such a scenario is proposed. The investigation is conducted through simulations, using throughput, delay, spectral efficiency, and fairness index as the performance metrics.
Keywords:
5G; satellite; IoRT; spectrum management; packet scheduling

1. Introduction

Global societal development trends have shown an unprecedented change in the manner in which mobile and wireless communication systems have been used in recent years [1,2]. The Mobile and wireless communications Enablers for the Twenty-twenty (2020) Information Society (METIS) project predicted that, by 2020, global mobile traffic alone would increase to 33 times the 2010 traffic [1]. In addition to the huge number of human mobile devices, the Internet of Things (IoT) will dominate the endless list of massive wireless devices. To transmit this flood of data traffic with a high level of mobility, such devices will require Radio Access Technologies that are not only more efficient but also pervasive [1,3]. Hence the need for future wireless communication systems to implement the visions of fifth generation (5G) Radio Access Network (RAN) systems.
No doubt, the 5G targets [4,5,6] are hugely challenging to fulfil. Hence, several key enabling technologies will be needed to meet the 5G requirements [7,8,9,10,11,12,13]. However, because of these sets of distinct technologies in 5G RANs [14], the usual Radio Interference and Resource Management (RIRM) schemes for 4G RANs, such as spectrum management, resource allocation, packet scheduling, handover control, user offloading, power control, and cell association, will not be efficient and effective enough for a 5G RAN [15,16].
Also, recent studies estimate that approximately 4 billion people still lack access to the Internet [17]. Global terrestrial coverage remains economically and technically infeasible, not only for rural and remote areas but also for passengers on high-speed trains, speed boats, or aircraft. Hence, the need for the satellite segment to complement the terrestrial network infrastructure in the envisaged 5G communication systems, in order to address the limitations of the terrestrial infrastructure, cannot be overemphasized. The satellite components will also be able to support new machine-type-communications-based applications, and the new 5G system is expected to accommodate diverse RANs operating together, including satellites [18]. It is also envisaged that satellite systems can support all the use case scenarios that their terrestrial counterparts will be able to provide [19]. It is well recognized that one of the directions in the evolution of 5G is towards the concept of the Internet of Things (IoT), and it is the concept of smart objects that drives the implementation of this new paradigm [20]. No doubt, satellite communication systems have the potential to play a crucial role in this 5G framework, for several reasons.
Firstly, smart objects are often located over a wide geographical area that is remote or inaccessible. This scenario of IoT is referred to as the Internet of Remote Things (IoRT). In the IoRT scenario, satellite communication systems provide a more cost-effective solution than other terrestrial technologies. Secondly, several IoT and Machine-to-Machine (M2M) applications use group-based communications that allow smart objects to be grouped according to the information they receive and the tasks they perform. In this kind of scenario, the volume of data transmitted in the network can be optimized when the same message is to be delivered to many IoT/M2M devices. By exploiting the broadcast, multicast, or geocast transmission capabilities of satellite systems, this kind of application can be supported naturally. Lastly, in critical applications, such as smart grids and utilities, where network availability is crucial at reasonable cost, an alternate or redundant connection is required. The satellite system remains a viable solution in this regard, as terrestrial redundant connections are prone to severe disruptions on the ground. In fact, IoT/M2M communication via the satellite air interface is already a reality and represents a great opportunity for the satellite industry [21].
Despite the prospects of the satellite component in the provision of IoRT services in 5G networks, reservations remain, such as long propagation delay, power consumption, inter-operation of terrestrial and satellite segments, and the channel model. However, recent advances in satellite communication technology have dispelled most of these misconceptions about the usage of satellite systems; most of these issues have been addressed using new techniques and/or architectural solutions. Satellite networks are nowadays capable of providing 0.9999 availability [21,22]. Nevertheless, as acknowledged in [21], in order to achieve effective and reliable IoRT communications via satellite, just as for terrestrial networks, several issues are still open and need solutions. These include, but are not limited to, Radio Interference and Resource Management (RIRM) functions such as spectrum management for IoRT devices in future 5G satellite networks, as identified in [21].
An overview of the challenges of spectrum management for wireless IoT networks is presented, and a regulatory framework recommended, in [23]. The Narrowband IoT (NB-IoT) spectrum management system, a narrowband system based on LTE, is introduced for M2M communications in [24], while [25] investigates the effect of NB-IoT on an existing LTE cellular network. While there is some notable work on spectrum management in 4G and emerging 5G terrestrial networks, to the best of the authors’ knowledge, no spectrum management investigation or framework has been proposed in the literature for IoRT in 5G satellite networks. The peculiar nature of satellite networks and IoRT devices, as compared to terrestrial networks and general IoT devices, compels a study of spectrum management in this scenario. It is on this premise that this research study focuses on spectrum management for IoRT devices in 5G satellite networks. A new scheduling scheme is also introduced in order to investigate the considered spectrum management schemes.
The remainder of this paper is organized as follows. Section 2 presents the system description. The considered spectrum management schemes to be investigated are presented in Section 3. Section 4 presents a new scheduling scheme and existing one that are used for the investigation. The simulation model and numerical results are presented in Section 5 and Section 6, respectively. Section 7 concludes this paper.

2. System Description

The 5G satellite Radio Access Network (RAN) is envisaged to use Orthogonal Frequency Division Multiple Access (OFDMA) and to support multiple-antenna transmission schemes. Satellite technology can adopt OFDMA because OFDMA allows flexible bandwidth operation with low-complexity receivers and easily exploits frequency selectivity [26]. The Multi-User Multiple-Input Multiple-Output (MU-MIMO) mode, which allows simultaneous transmission to multiple users, is considered. The 5G New Radio (NR) base station, termed the gNB, is expected to be suitable for the satellite network scenario as well and is assumed to be located at the earth station. Both the satellite gNB and the User Equipment (UE) are considered to have multiple antennas, each assuming a baseline 2 × 2 MIMO configuration. The 5G NR modulation technique and frame structure are designed to be compatible with LTE, and the 5G NR duplex frequency configuration allows alignment among the 5G NR, NB-IoT, and LTE-M subcarrier grids [27]. Hence, it is expected that 5G NR UEs will not only operate in 5G satellite networks but also coexist with satellite LTE networks.

2.1. Satellite Air Interface

There are several types of satellite systems that could be adopted for this work, including Low Earth Orbit (LEO), Medium Earth Orbit (MEO), and Geostationary Equatorial Orbit (GEO) satellites, each with its own challenges. A GEO satellite can maintain a constant transmission delay across multiple sites, and a single GEO satellite can provide coverage through dense spotbeams [28]. GEO satellites do suffer from long propagation delay, but new technologies like edge caching can be used to mitigate this. While a LEO satellite provides a lower propagation delay, it cannot achieve a constant delay because of its constant movement and its smaller coverage area. Although LEO satellites may be preferred for their low propagation delay, GEO satellites are preferred for multimedia communications owing to their high bandwidth and constant delay, and they have already been used to implement 5G satellite networks [28]. The scenario considered in this paper investigates co-existence, in terms of spectrum management, of both broadband and IoRT use cases. It is on this premise that a transparent GEO satellite has been adopted for this work.
As shown in Figure 1, the satellite gNodeB located at the earth station connects to UEs via the transparent GEO satellite. This setup supports simultaneous transmissions from the satellite gNodeB, via the transparent GEO satellite, to different UEs [29]. The setup is envisaged to have feedback from the UE to the satellite gNodeB, not only for link adaptation but also to determine the transmission rate [30]. Hence, the UE is assumed to send a Channel Quality Indicator (CQI) report via the transparent GEO satellite to the gNodeB at certain intervals. The reported CQI at the satellite gNodeB can then be used for transmission and scheduling purposes.
At the MAC layer of the satellite gNodeB, the scheduling module schedules UEs and IoRT devices. The scheduled UEs and IoRT devices are then mapped to the available resource blocks at every Transmission Time Interval (TTI). The number of available resource blocks at every TTI is a function of the available spectrum, which in turn depends on how the spectrum is managed or shared, and of the number of antennas deployed. The allocation of available resources is performed at every TTI.

2.2. Channel Model

The channel model considered in this paper is an empirical-stochastic model for Land Mobile Satellite MIMO (LMS-MIMO) used in [31,32]. The stochastic properties of this channel model are obtained from an S-band tree-lined road measurement campaign conducted in a suburban area using dual circular polarizations. The Markov model in Figure 2 is used to determine the large-scale fading (shadowing) states, which vary between low and high shadowing values for both the co-polar and cross-polar channels. The four possible states in the Markov model correspond to the high or low shadowing state of the cross-polar (XP) and co-polar (CP) channels: State 1 is CP High/XP High, State 2 is CP Low/XP High, State 3 is CP High/XP Low, and State 4 is CP Low/XP Low. The 4 × 4 transition matrix P below, obtained from the measurements in [33], is used to determine the next possible channel state. For example, the top-right entry of 0.1037 is the probability of transitioning from “CP High, XP High” to “CP Low, XP Low”.
        | 0.6822  0.1579  0.0561  0.1037 |
    P = | 0.2887  0.2474  0.0447  0.4192 |
        | 0.1682  0.0966  0.1745  0.5607 |
        | 0.0098  0.0199  0.0150  0.9554 |
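For illustration, the shadowing-state evolution governed by P can be sampled as follows; this is a minimal sketch (the zero-based state indices and the sampling routine are ours, not part of the simulator):

```python
import random

# 4x4 transition matrix from the S-band measurement campaign [33].
# Row/column order (zero-based here): 0 = CP High/XP High,
# 1 = CP Low/XP High, 2 = CP High/XP Low, 3 = CP Low/XP Low.
P = [
    [0.6822, 0.1579, 0.0561, 0.1037],
    [0.2887, 0.2474, 0.0447, 0.4192],
    [0.1682, 0.0966, 0.1745, 0.5607],
    [0.0098, 0.0199, 0.0150, 0.9554],
]

def next_state(current, rng=random):
    """Draw the next shadowing state (0-3) given the current one."""
    u = rng.random()
    cumulative = 0.0
    for state, p in enumerate(P[current]):
        cumulative += p
        if u < cumulative:
            return state
    return len(P[current]) - 1  # guard against rounding of the row sum

# Each row of P must be (approximately) a probability distribution.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-3
```

Note that state 3 (CP Low/XP Low) is strongly self-persistent (0.9554), so deep shadowing episodes tend to last many transitions.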
The small-scale fading is modelled using a Ricean distribution; the Ricean fading for each of the MIMO channels is generated using Ricean factors. Details of how the large-scale and small-scale fadings are obtained are given in [33]. The path loss (in dB) at 2 GHz in the GEO satellite channel can be computed as follows:
L_Fs = 190.35 + 20·log10((38500 + D) / 35788)
The total loss considered in this channel model includes the large-scale fading, the small-scale fading, the path loss (L_Fs), and the polarization loss. The inter-spotbeam interference, I, is also considered in this channel model; it results from the power received from satellite gNodeBs sharing the same frequency. Details of how the interference is computed, and the channel parameters considered, are provided in [31]. The Signal-to-Noise plus Interference Ratio (SNIR) for each subchannel can be expressed in dB as follows:
SNIR [dB] = EIRP + G_R − L_Total − (N + I)
The Channel Quality Indicator (CQI) is determined from the computed SNIR using the SNIR-CQI mapping of [31] for a Block Error Rate (BLER) of 10^−3. This SNIR-CQI mapping can be expressed as follows:
if SNIR < 3.8:          CQI = 1
if 3.8 ≤ SNIR ≤ 22.6:   CQI = 0.55·SNIR + 3.45
if SNIR > 22.6:         CQI = 15
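As a sketch, the mapping above can be implemented directly; flooring to an integer CQI index is our assumption, since the source does not state the rounding convention:

```python
def snir_to_cqi(snir_db):
    """Map SNIR (dB) to a CQI index (1-15) for BLER 10^-3, following
    the piecewise SNIR-CQI mapping above. Flooring to an integer is an
    assumption; the source does not specify the rounding convention."""
    if snir_db < 3.8:
        return 1
    if snir_db > 22.6:
        return 15
    return int(0.55 * snir_db + 3.45)
```

For example, an SNIR of 10 dB maps to 0.55 × 10 + 3.45 = 8.95, i.e. CQI 8 under this flooring convention.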

2.3. Traffic Model

IoRT and video streaming traffic are considered in this paper. In the mixed traffic setup, for every 50 IoRT devices, 10 video streaming users are considered; these numbers increase proportionally.
The IoRT traffic is modelled using a Constant Bit Rate (CBR) traffic model with a packet size of 400 bytes and a time interval of 40 ms [34], while the realistic video trace files provided in [35] are used to model the video streaming traffic. The video sequence, at 25 frames per second, is compressed using the H.264 standard [34]; the mean bit rate of the video source is 242 kbps.
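The per-device offered load implied by this traffic model is easy to check:

```python
# Offered load of one CBR IoRT source: a 400-byte packet every 40 ms.
packet_bits = 400 * 8            # 3200 bits per packet
interval_s = 0.040               # packet inter-arrival time
iort_rate_bps = packet_bits / interval_s   # 80000.0 bits/s = 80 kbps

# Aggregate load of one mixed-traffic cluster (50 IoRT + 10 video users,
# video at a mean of 242 kbps per user).
iort_cluster_bps = 50 * iort_rate_bps      # 4.0 Mbps of IoRT traffic
video_cluster_bps = 10 * 242_000           # 2.42 Mbps of video traffic
```

So each 50-IoRT/10-video increment adds roughly 6.42 Mbps of offered load, which puts the throughput curves in Section 6 in perspective.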

3. Spectrum Management

Massive numbers of IoRT devices will form part of the 5G network ecosystem despite the limited spectrum. In a satellite network, it remains unclear what the best spectrum management solution for this scenario is. In this paper, two different spectrum management schemes are investigated in order to determine which ensures good network performance for IoRT devices without compromising human-to-human communications.

3.1. Dedicated IoRT Spectrum

This spectrum management scheme dedicates two separate frequency bands to machine-type communications (IoRT devices) and human-to-human communications (video streaming, web browsing, etc.), respectively. A relatively small portion of the bandwidth is dedicated to IoRT devices and the remaining part to human-type communications. Hence, machine-type communications are always guaranteed this dedicated, but relatively small, frequency band (a smaller proportion of Resource Blocks (RBs)).
Dedicated IoRT spectrum management
1. Initialize B_IoRT = B_T − B_H
2. Initialize S_IoRT = {1, 2, 3, …, N_s} and S_n = ∅ for all selected IoRT devices n
3. While S_IoRT ≠ ∅
4.   For IoRT device i = 1 to I
     Compute the scheduling metric m_i(j) for each RB j
5.   j_i* = argmax_j m_i(j)
6.   S_IoRT = S_IoRT − {j_i*}
7.   S_n = S_n + {j_i*}
   end
where B_IoRT is the bandwidth dedicated to IoRT devices, B_T is the total bandwidth, B_H is the bandwidth dedicated to human-type communications, S_IoRT is the set of resource blocks available to IoRT devices, S_n is the set of resource blocks assigned to the scheduled (selected) IoRT device n, I is the number of IoRT devices, and m_i(j) is the scheduling metric of IoRT device i on RB j.

3.2. Shared Spectrum for Mixed Use Case or Traffic

This spectrum management scheme allows the total available bandwidth to be shared by both human-type and machine-type communications. The human mobile and IoRT devices compete for the same pool of resource blocks and, depending on the scheduling metric, an available resource block can be assigned to either an IoRT or a human mobile device. This scheme does not guarantee resource blocks to IoRT devices; hence, the scheduling algorithm plays a vital role in ensuring that IoRT devices are, to a large extent, treated fairly in terms of resource allocation. However, the scheme allows spectrum left unused by human-type communications to be used for machine-type communications.
Shared spectrum management
1. Initialize B_T
2. Initialize S_T = {1, 2, 3, …, N_s} and S_n = ∅ for all selected UEs (including IoRT devices) n
3. While S_T ≠ ∅
4.   For UE i = 1 to I
     Compute the scheduling metric m_i(j) for each RB j
5.   j_i* = argmax_j m_i(j)
6.   S_T = S_T − {j_i*}
7.   S_n = S_n + {j_i*}
   end
where B_T is the total bandwidth, S_T is the set of resource blocks available to all UEs including IoRT devices, S_n is the set of resource blocks assigned to the scheduled (selected) UE n, I is the number of UEs, and m_i(j) is the scheduling metric of UE i on RB j.
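Both schemes reduce to the same greedy loop run over different RB pools (the IoRT-only pool or the full shared pool). A minimal sketch, with the metric passed in as a callback (the function and variable names here are illustrative, not from the simulator):

```python
def allocate_rbs(rb_pool, devices, metric):
    """Greedy allocation following the steps above: while RBs remain,
    each device in turn claims its best remaining RB by scheduling
    metric. Works for the dedicated scheme (IoRT-only pool/devices)
    and the shared scheme (full pool, all devices).

    rb_pool : iterable of available RB indices (S_IoRT or S_T)
    devices : iterable of device ids (UEs and/or IoRT devices)
    metric  : metric(device, rb) -> float, e.g. MLWDF or CAQDB
    Returns {device: set of assigned RBs} (the S_n sets).
    """
    pool = set(rb_pool)
    assigned = {d: set() for d in devices}
    while pool:
        for d in devices:
            if not pool:
                break
            best_rb = max(pool, key=lambda rb: metric(d, rb))
            pool.discard(best_rb)       # S = S - {j*}
            assigned[d].add(best_rb)    # S_n = S_n + {j*}
    return assigned
```

With the dedicated scheme, this loop is simply invoked on the smaller IoRT pool; with the shared scheme, it is invoked once on the full pool with all devices competing.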

4. Packet Scheduling Schemes

At every scheduling interval, the packet scheduler at the Medium Access Control (MAC) layer of the satellite gNodeB selects UEs/IoRT devices and assigns them to the available resource blocks. The scheduling metric is computed for each resource block, and the UE/IoRT device with the maximum metric is assigned to that resource block. These metrics can be interpreted as the transmission priority of each UE on a specific resource block. One of the most popular existing packet schedulers, the Modified Largest Weighted Delay First (MLWDF) scheduler, is considered in this work together with the newly proposed scheduler in order to evaluate the spectrum management schemes.

4.1. MLWDF Scheduler

The Modified Largest Weighted Delay First scheduler was developed to support multiple real-time (RT) services with different Quality of Service (QoS) requirements [36]. This scheduler has been used in Satellite High Speed Data Packet Access networks [37] and Satellite Long Term Evolution networks [38]. It uses the waiting time of the Head of Line packet to shape the behaviour of the Proportional Fair (PF) scheduling scheme. The MLWDF scheduler defines its metric as follows:
m_i,j = [R_i,j(t) / T_i(t)] · [(−log δ_i) / τ_i] · D_HOL,i(t)
where τ_i is the delay deadline of UE/IoRT device i, D_HOL,i(t) is the waiting time of the Head of Line packet in the queue of UE/IoRT device i, R_i,j(t) is the instantaneous data rate of UE/IoRT device i on resource block j at the current Transmission Time Interval (TTI) t, T_i(t) is the average data rate of UE/IoRT device i before TTI t, and δ_i is the maximum probability that D_HOL,i exceeds τ_i, i.e.:
Pr(D_HOL,i > τ_i) ≤ δ_i
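A direct transcription of the MLWDF metric reads as follows; the variable names follow the definitions above, and this is an illustrative sketch rather than the simulator's code:

```python
import math

def mlwdf_metric(r_ij, t_avg, d_hol, tau, delta):
    """MLWDF priority of device i on RB j:
    (R_ij / T_i) * (-log(delta_i) / tau_i) * D_HOL,i
    r_ij  : instantaneous achievable rate on RB j
    t_avg : average data rate of the device before this TTI
    d_hol : waiting time of the Head of Line packet
    tau   : delay deadline of the device's traffic class
    delta : max probability of D_HOL exceeding tau (0 < delta < 1)
    """
    return (r_ij / t_avg) * (-math.log(delta) / tau) * d_hol
```

Note the PF term R_ij/T_i: a device with a good channel relative to its past average, a tight deadline (small τ_i, small δ_i), and a long-waiting head packet gets the highest priority.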

4.2. Proposed Scheduler (CAQDB)

Future networks like 5G will serve not only mixed traffic but also different use cases with different requirements. It is therefore important to have schedulers that can dynamically fulfil the throughput and QoS requirements of these different use cases without compromising spectral efficiency. It is on this premise that the Channel Aware Queue and Delay Based (CAQDB) scheduler is proposed. The CAQDB scheduler is designed to provide an acceptable trade-off among system throughput, QoS (delay), and fairness, irrespective of the use case. The proposed algorithm uses a cross-layer approach that allows interactions between different layers: from the physical layer, it uses the data rate selected according to the CQI received from the UE/IoRT device; from the MAC layer, it uses not only the delay of the Head of Line packet but also the instantaneous queue length. The scheduler also uses the packet's delay deadline, set according to the traffic type, in making scheduling decisions. The CAQDB scheduler implements its metric as follows:
m_i,j = [R_i,j(t) / T_i(t)] · log( Q_i(t) · D_HOL,i(t) / τ_i )
The other symbols are as previously defined, and Q_i(t) is the instantaneous queue size of UE/IoRT device i at TTI t. The CAQDB scheduler considers not only the delay but also the queue status in making scheduling decisions; this ensures that IoRT devices, which are mostly not delay sensitive, are not starved and are given a fair allocation of the available resources.
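The CAQDB metric transcribes similarly; we assume the argument of the logarithm is positive (non-empty queue and non-zero waiting time), since the source does not state how empty queues are handled:

```python
import math

def caqdb_metric(r_ij, t_avg, q_len, d_hol, tau):
    """CAQDB priority of device i on RB j:
    (R_ij / T_i) * log(Q_i * D_HOL,i / tau_i), per the metric above.
    q_len : instantaneous queue size of the device at this TTI.
    Assumes q_len * d_hol / tau > 0; the empty-queue case is not
    specified in the source and would need a guard in practice.
    """
    return (r_ij / t_avg) * math.log(q_len * d_hol / tau)
```

Compared with MLWDF, the deadline term −log δ_i is replaced by the queue size inside the logarithm, so a long backlog raises priority even when the head packet is far from its (loose) deadline, which is exactly what keeps delay-tolerant IoRT queues from starving.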

5. Simulation Model

In order to compare the two spectrum management schemes, and to investigate the impact of using an appropriate scheduler on their performance, an open-source network simulator that provides realistic performance evaluation of 4G-and-beyond networks is used in this work. It is an event-driven simulator developed in C++ [34] and has been adapted to the satellite scenario. Two scenarios are considered. Scenario A uses a total bandwidth of 10 MHz for the shared spectrum scheme and a bandwidth of 3 MHz for IoRT devices under the dedicated spectrum scheme, equivalent to one-third of the shared bandwidth. Scenario B uses a total bandwidth of 15 MHz for the shared spectrum scheme and a bandwidth of 5 MHz for IoRT devices under the dedicated spectrum scheme, again one-third of the shared bandwidth. The traffic and channel models presented in the previous sections are used in the simulation. Scheduling and channel reporting intervals of 1 and 100 TTIs, respectively, have been adopted; this channel reporting interval conserves the limited satellite power, since the satellite channel varies slowly. The Equivalent Isotropically Radiated Power (EIRP) values for the GEO satellite and the UE are 63 dBm and 23 dBm, respectively. A channel noise of −208.1 dBm in a 180 kHz subchannel and a polarization loss of 3.5 dB are adopted. The other simulation parameters are presented in Table 1.
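The scenario and link parameters stated above can be collected as follows; this is an illustrative configuration fragment (the key names are ours), with Table 1 holding the remaining parameters:

```python
# Scenario bandwidths from the text: the dedicated IoRT band is roughly
# one-third of the corresponding shared band.
scenarios = {
    "A": {"shared_bw_mhz": 10, "dedicated_iort_bw_mhz": 3},
    "B": {"shared_bw_mhz": 15, "dedicated_iort_bw_mhz": 5},
}

# Common link/scheduling parameters from the text (key names are ours).
params = {
    "scheduling_interval_tti": 1,
    "cqi_report_interval_tti": 100,   # coarse reporting saves satellite power
    "satellite_eirp_dbm": 63,
    "ue_eirp_dbm": 23,
    "polarization_loss_db": 3.5,
    "subchannel_bw_khz": 180,
}

# Sanity check: dedicated band never exceeds one-third of the shared band.
for s in scenarios.values():
    assert s["dedicated_iort_bw_mhz"] * 3 <= s["shared_bw_mhz"]
```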

6. Simulation Results

The results obtained from the simulations show the performance of the two spectrum management schemes using two different packet scheduling algorithms (MLWDF and CAQDB). As shown in Figure 3, the throughput of IoRT traffic is much better under the shared spectrum management scheme than under dedicated spectrum management. Although the throughput of the shared scheme drops as the number of IoRT devices increases, it remains at least approximately 4 Mbps better than that of the dedicated scheme. The two schedulers (MLWDF and CAQDB) produce similar performance for both spectrum management schemes, though the proposed CAQDB scheduler edges out MLWDF under the shared scheme. The drop in throughput under the shared scheme as the number of IoRT devices increases results from greater demand from the video streaming traffic and the limited spectrum. However, with the larger 15 MHz spectrum of scenario B, as shown in Figure 4, the throughput of the shared scheme under the proposed CAQDB scheduler stays consistently above 6 Mbps even as the number of IoRT devices increases. The same cannot be said of the MLWDF scheduler, whose throughput drops from 200 IoRT devices upwards. The CAQDB scheduler achieves this because, unlike MLWDF, it considers not only the closeness of a packet's delay to its deadline but also the state of the queue; considering only the delay disadvantages IoRT traffic, which is mostly delay tolerant.
As shown in Figure 5, the average delay performances of the shared spectrum management scheme for both schedulers are better than the dedicated spectrum management across the varying number of IoRT devices. Both schedulers have a similar delay performance for both spectrum management schemes in scenario A.
However, as shown in Figure 6, in scenario B both spectrum management schemes produce very close average delay performance up to 150 IoRT devices for both schedulers. As the number of IoRT devices increases beyond 150, the shared scheme produces better average delay for both schedulers up to 250 IoRT devices. From 250 to 300 IoRT devices, the average delay of the shared scheme under MLWDF becomes increasingly poor until it reaches the same level as the dedicated scheme, whereas the proposed CAQDB scheduler maintains a better average delay. The average delay results further confirm the impact of the packet scheduler on the performance of the spectrum management schemes.
The fairness index performance for scenarios A and B is presented in Figure 7 and Figure 8, respectively. In scenario A, as shown in Figure 7, all the spectrum management schemes have the same fairness index at 50 IoRT devices, but as the number of IoRT devices increases, the shared scheme performs better than the dedicated scheme for both schedulers. In addition, the CAQDB scheduler performs better than the MLWDF scheduler under both the shared and dedicated schemes. In scenario B, as depicted in Figure 8, the dedicated scheme edges out the shared scheme for both schedulers, and the CAQDB scheduler edges out the MLWDF scheduler in fairness index under the dedicated scheme. Even though the fairness trends differ between the two scenarios, these results further establish the impact of the scheduler on spectrum management performance.
The overall spectral efficiency of the two spectrum management schemes is presented in Figure 9 and Figure 10. As shown there, the spectral efficiency of the shared scheme is much better than that of the dedicated scheme for both schedulers, across the varying numbers of IoRT devices and in both scenarios A and B. It can therefore be deduced that the shared scheme allows the spectrum to be utilized fully in the mixed use case scenario: whenever the spectrum is not being used by IoRT devices, the video streaming users utilize it, and vice versa. The performance of the two schedulers is very close under the dedicated scheme in both scenarios; under the shared scheme, however, this is only true in scenario A. In scenario B, the proposed CAQDB scheduler produces better spectral efficiency from 200 IoRT devices upwards, with close performance below 200 IoRT devices. This further shows that the scheduler impacts the performance of a shared spectrum management scheme, especially as the number of IoRT devices increases.

7. Conclusions

This paper presents an investigation of two spectrum management schemes for IoRT device communications in 5G satellite networks. In addition, a newly proposed scheduler, called the CAQDB scheduler, is presented and compared with the existing MLWDF scheduler in order to study the impact of scheduling algorithms on the performance of spectrum management schemes, especially shared spectrum management, where different traffic types or use cases are expected. From the presented results, the shared spectrum management scheme produces better performance in almost all the performance metrics for the two scenarios considered. Furthermore, it can be concluded that the packet scheduler affects the performance of a spectrum management scheme, especially the shared scheme, as the proposed CAQDB scheduler outperforms the MLWDF scheduler mainly when shared spectrum management is used. In conclusion, the shared spectrum management scheme is recommended, provided an appropriate scheduling algorithm, like the proposed CAQDB, that considers not only delay as a QoS factor but also the state of the queue, is used. For future work, a more dynamic spectrum management scheme and more diverse use cases will be considered.

Author Contributions

Conceptualization, G.A. and P.O.; methodology, G.A. and P.O.; software, G.A.; validation, G.A. and P.O.; formal analysis, G.A. and P.O.; writing—original draft preparation, G.A.; writing—review and editing, G.A. and P.O.; supervision, P.O.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. 5G satellite network system architecture.
Figure 2. Four-state Markov model of a Land Mobile Satellite (LMS-MIMO) channel.
Figure 3. Throughput of Internet of Remote Things (IoRT) devices in Scenario A.
Figure 4. Throughput of IoRT devices in Scenario B.
Figure 5. Average delay of IoRT devices in Scenario A.
Figure 6. Average delay of IoRT devices in Scenario B.
Figure 7. Fairness index for IoRT devices in Scenario A.
Figure 8. Fairness index for IoRT devices in Scenario B.
Figure 9. Spectral efficiency in Scenario A.
Figure 10. Spectral efficiency in Scenario B.
Table 1. Simulation parameters.

Parameter                  Value
Simulation time            500 seconds
CQI reporting interval     100 TTI
Channel model              4-state Markov model
UE speed                   3 km/h
IoRT traffic model         Constant bit rate
Video traffic model        Trace-based @ 242 kbps
Scheduler                  MLWDF & CAQDB
Scenario A                 IoRT (3 MHz) & mixed (10 MHz)
Scenario B                 IoRT (5 MHz) & mixed (15 MHz)
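Figures 7 and 8 report a fairness index as one of the performance metrics. This excerpt does not state the formula used, but Jain's fairness index is the conventional choice when comparing how evenly packet schedulers distribute throughput across devices, so the sketch below assumes that definition; the example throughput values are illustrative, not taken from the simulations.

```python
def jain_fairness(throughputs):
    """Jain's fairness index over per-device throughputs.

    Returns 1.0 when every device gets an equal share, and
    approaches 1/n when a single device receives everything.
    """
    n = len(throughputs)
    total = sum(throughputs)
    sum_sq = sum(t * t for t in throughputs)
    return (total * total) / (n * sum_sq) if sum_sq else 0.0

# Illustrative: four IoRT devices, equal vs. skewed allocations (kbps)
print(jain_fairness([64.0, 64.0, 64.0, 64.0]))   # 1.0 (perfectly fair)
print(jain_fairness([200.0, 10.0, 10.0, 10.0]))  # well below 1.0
```

An index close to 1.0 in Figures 7 and 8 would thus indicate that the scheduler serves all IoRT devices nearly equally over the simulation window.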