Resource Allocation for Cognitive LEO Satellite Systems: Facilitating IoT Communications

Due to its global coverage, on-demand access, and large capacity, low earth orbit (LEO) satellite communication (SatCom) has become a promising technology to support the Internet-of-Things (IoT). However, due to the scarcity of satellite spectrum and the high cost of designing satellites, it is difficult to launch a dedicated satellite for IoT communications. To facilitate IoT communications over LEO SatCom, in this paper we propose a cognitive LEO satellite system, where the IoT users act as secondary users that access the legacy LEO satellites and cognitively use the spectrum of the legacy LEO users. Due to the flexibility of code division multiple access (CDMA) and its wide use in LEO SatCom, we apply CDMA to support cognitive satellite IoT communications. For the cognitive LEO satellite system, we are interested in the achievable rate analysis and resource allocation. Specifically, considering the randomness of spreading codes, we use random matrix theory to analyze the asymptotic signal-to-interference-plus-noise ratios (SINRs) and accordingly obtain the achievable rates for both the legacy and IoT systems. The receive powers of the legacy and IoT transmissions are jointly allocated to maximize the sum rate of the IoT transmission, subject to the legacy satellite system performance requirement and the maximum received power constraints. We prove that the sum rate of the IoT users is quasi-concave over the satellite terminal receive power, based on which the optimal receive powers for the two systems are derived. Finally, the proposed resource allocation scheme is verified by extensive simulations.


Introduction
Internet-of-Things (IoT) applications, such as financial services, intelligent transportation, and remote sensing, are envisioned to be an important driving force of the smart society [1]. With the popularization of IoT applications, trillions of new devices will be connected to the networks for different application needs, such as smart wearable devices, smart homes, environmental monitoring, and so on [2]. A variety of IoT applications require IoT networks deployed remotely over large areas (e.g., environmental monitoring). However, because infrastructure construction is constrained by the geographical environment, terrestrial wireless systems cannot achieve comprehensive coverage, especially in environments such as deserts, oceans, and forests [3]. As an extension and supplement of the terrestrial communication system, low earth orbit (LEO), medium earth orbit (MEO), and geostationary earth orbit (GEO) satellites can serve areas that are not covered by terrestrial networks [4]. For years, industry and academia have focused on GEO satellites to provide Internet access, but these high-orbiting satellites face significant challenges in providing services. For instance, signals must traverse roughly 36,000 km from Earth to the satellite and back, and the resulting propagation delays degrade the quality of latency-sensitive services.
In this paper, we focus on an uplink cognitive LEO SatCom system to support IoT transmissions, in which the IoT communication system serves as the secondary system and shares the spectrum of the legacy LEO SatCom system. In addition, the primary and secondary systems share the same receiver, i.e., the satellite base station. Since current LEO SatCom systems widely use code division multiple access (CDMA) technology [34][35][36][37], the spread-spectrum signal is distributed over a wide frequency range, making it difficult to detect the power of the primary user [38]. The collaborative underlay spectrum sharing model is therefore more suitable for the cognitive LEO satellite IoT system than the overlay model, which requires precise spectrum sensing. With perfect power control, the underlay IoT users can use the same spectrum as the legacy satellite users without degrading the primary users' communication quality, which greatly improves spectrum efficiency. In particular, IoT performance can be improved by using CDMA to successfully transmit more packets per unit time [39] and to achieve better spectrum efficiency for IoT transmission than orthogonal channel allocations [40]. Considering the properties of IoT transmission, and in order to be compatible with the existing LEO system, in this paper we support cognitive satellite IoT communication in the CDMA manner. We are interested in the achievable rate and resource allocation in cognitive LEO satellite networks. Specifically, the minimum mean square error (MMSE) detector is used to recover the information from the legacy LEO satellite users and the IoT users. Due to the randomness of spreading codes, it is difficult to allocate the resources directly. Thus, we use random matrix theory to analyze the asymptotic signal-to-interference-plus-noise ratio (SINR) and obtain the achievable rates for both the IoT and legacy systems.
Moreover, we aim to jointly optimize the receive powers of the legacy and IoT transmissions by maximizing the sum rate of the IoT users under the condition that the legacy satellite system meets its performance requirement. To solve the formulated problem, we prove that the sum rate of the IoT users is quasi-concave over the legacy satellite user's receive power. Based on this characteristic, we derive the optimal receive powers for the two systems. With the designed power allocation scheme, IoT users can transmit information while ensuring the performance of the primary system. Finally, extensive simulation results are provided to validate the effectiveness of the designed power allocation scheme. The rest of this paper is organized as follows. In Section 2, we build up the cognitive LEO satellite communication system model. In Section 3, we derive the asymptotic SINRs for the legacy and IoT systems. In Section 4, the resource allocation problem is formulated and solved. Section 5 presents extensive simulation results, which verify our theoretical analysis and validate the effectiveness of the proposed scheme. Finally, the paper is concluded in Section 6.
The notations used in this paper are listed as follows. The lowercase, boldface lowercase, and boldface uppercase letters x, x, and X denote a scalar variable (or constant), a vector, and a matrix, respectively. CN(µ, Σ) denotes the complex Gaussian distribution with mean µ and covariance Σ. Notations X^T and X^H denote the transpose and conjugate transpose of matrix X, respectively. The notation X* denotes the optimal value of variable X. I_N denotes the N-dimensional identity matrix. Notation E(·) denotes the statistical expectation. The notation A • B denotes the Hadamard (element-wise) product.

System Model
For the cognitive LEO satellite system, there are two types of networks: the legacy satellite network and the IoT network. The LEO satellite base station serves the legacy satellite users and the cognitive IoT users simultaneously in the CDMA manner. Meanwhile, the IoT users share the same spectrum as the legacy satellite users, and the two systems share the same LEO satellite receiving antenna. In this setup, the cross-channel state information (C-CSI) between the two systems is easy to obtain. We assume that the satellite and all user terminals are equipped with a single antenna. In the cognitive LEO satellite system, we consider the uplink case, as shown in Figure 1. Specifically, there are U primary users in the legacy satellite network and K secondary users in the IoT network. Each user in both the legacy satellite system and the IoT system is assigned a specific random spreading code with spreading gain N. In this way, the symbol duration of the IoT users is the same as that of the legacy satellite users. In what follows, we illustrate the channel model of the cognitive LEO satellite system and then the signal model, followed by the achievable rates of both the primary and secondary systems.

Channel Model
The satellite channel fading consists of two parts: multipath propagation and the shadowing effect. The shadowed Rice model can effectively describe both fading effects and has been widely applied to channel analysis in various frequency bands, such as the UHF, L, S, and Ka bands [41]. Specifically, the satellite channel fading coefficient between the satellite and the m-th user is given by

h̃_m = A e^{jψ_m} + Z e^{jφ_m}, (1)

where ψ_m ∈ [0, 2π) is the stationary random phase and φ_m is the deterministic phase of the line-of-sight (LOS) component. A and Z are both independent stationary random processes. Specifically, A is the amplitude of the scattered component, following a Rayleigh distribution, and Z is the amplitude of the LOS component, following a Nakagami distribution with probability density function

f_Z(z) = (2 m^m z^(2m−1) / (Γ(m) Ω^m)) exp(−m z²/Ω), z ≥ 0. (2)

Considering the atmospheric effects, the overall satellite channel of the m-th user can be expressed as

h_m = C_L h̃_m, (3)

where C_L = λ/(4π √(d_0² + d_m²)) denotes the free-space path loss coefficient with wavelength λ, the height of the LEO satellite d_0, and the distance d_m between the center of the LEO satellite coverage area and the m-th user.
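As a numerical illustration of this channel model, the following sketch draws shadowed-Rician fading coefficients and evaluates the free-space coefficient C_L. The default (b, m, Ω) triple is a commonly used heavy-shadowing parameter set and is not taken from this paper; all function and parameter names are illustrative.

```python
import numpy as np

def shadowed_rician_channel(n, b=0.063, m=0.739, omega=8.97e-4, phi=0.0, rng=None):
    """Sample n shadowed-Rician fading coefficients A*exp(j*psi) + Z*exp(j*phi).

    A is Rayleigh-distributed (scatter power 2*b), Z is Nakagami-m
    (LOS power omega), and psi is uniform on [0, 2*pi), matching (1)-(2).
    """
    rng = np.random.default_rng(rng)
    A = rng.rayleigh(scale=np.sqrt(b), size=n)                 # scatter amplitude
    Z = np.sqrt(rng.gamma(shape=m, scale=omega / m, size=n))   # Nakagami-m LOS amplitude
    psi = rng.uniform(0.0, 2.0 * np.pi, size=n)                # random scatter phase
    return A * np.exp(1j * psi) + Z * np.exp(1j * phi)

def free_space_coeff(lam, d0, dm):
    """Path-loss coefficient C_L = lam / (4*pi*sqrt(d0^2 + dm^2)) as in (3)."""
    return lam / (4.0 * np.pi * np.sqrt(d0**2 + dm**2))
```

The average channel power of the sampled coefficients is 2b + Ω, which can be used as a quick sanity check of the generator.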
Based on the above channel model, let h u denote the channel coefficient from the u-th legacy satellite user to the LEO satellite and f k denote the channel from the k-th IoT device to the LEO satellite. All of the channels h u and f k for u = 1, · · · , U and k = 1, · · · , K satisfy the expression of (3).

Signal Model
Denote by a_u and b_k the data symbols transmitted by legacy satellite user u and IoT user k, respectively. Symbol a_u is spread by the spreading code s_u, and symbol b_k is spread by the spreading code t_k. The spreading codes s_u = [s_{u,1}, s_{u,2}, · · · , s_{u,N}]^T and t_k = [t_{k,1}, t_{k,2}, · · · , t_{k,N}]^T satisfy the power constraints, i.e., ‖s_u‖² ≤ P_u and ‖t_k‖² ≤ P̄_k, where P_u and P̄_k are the maximum transmit powers of the u-th legacy satellite user and the k-th IoT device, respectively.
The total received signal at the satellite receiver corresponding to the n-th chip can be written as, for n = 1, · · · , N,

y_n = Σ_{u=1}^{U} h_u s_{u,n} a_u + Σ_{k=1}^{K} f_k t_{k,n} b_k + z_n, (4)

where z_n is the additive white Gaussian noise with z_n ∼ CN(0, σ²). For simplicity, the received signal (4) can be rewritten as

y = H_1 (h • c_1) + H_2 (f • c_2) + z, (5)

where H_1 = [s_1, s_2, · · · , s_U] is the N × U matrix formulated by the spreading codes of the legacy satellite users, H_2 = [t_1, t_2, · · · , t_K] is the N × K matrix formulated by the spreading codes of the IoT users, h = [h_1, h_2, · · · , h_U]^T is the U × 1 vector whose entries are the channel responses of the legacy satellite users, f = [f_1, f_2, · · · , f_K]^T is the K × 1 vector composed of the channel responses of the IoT users, c_1 = [a_1, a_2, · · · , a_U]^T is the U × 1 vector whose entries are the data symbols transmitted by the legacy satellite users, c_2 = [b_1, b_2, · · · , b_K]^T is the K × 1 vector composed of the data symbols transmitted by the IoT users, and z = [z_1, z_2, · · · , z_N]^T is the N × 1 additive white Gaussian noise vector, i.e., z ∼ CN(0, σ² I_N).
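The vector signal model above can be assembled directly; the following sketch builds one chip-level received vector, with illustrative names (S and T hold the spreading codes as columns, playing the roles of H_1 and H_2).

```python
import numpy as np

def received_signal(S, T, h, f, a, b, sigma2, rng=None):
    """Assemble the chip-level uplink signal y = S(h ∘ a) + T(f ∘ b) + z.

    S (N x U) and T (N x K) are the spreading-code matrices, h/f the user
    channels, a/b the data symbols, and z circular complex Gaussian noise
    with power sigma2 per chip; a sketch of the model in (4)-(5).
    """
    rng = np.random.default_rng(rng)
    N = S.shape[0]
    z = np.sqrt(sigma2 / 2.0) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    return S @ (h * a) + T @ (f * b) + z
```

With sigma2 = 0 and one user per system on disjoint chips, the output reduces to the scaled spreading codes, which makes the model easy to unit-test.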

Achievable Rates
Suppose the MMSE detector is adopted by the LEO satellite to recover the information from both the primary and secondary transmissions. For the primary transmission, let w_u be the MMSE vector used to detect the u-th element of c_1. Applying w_u to the received signal, we can get

â_u = w_u^H y = w_u^H h_u s_u a_u + Σ_{i≠u} w_u^H h_i s_i a_i + Σ_{k=1}^{K} w_u^H f_k t_k b_k + w_u^H z.

Thus, the SINR of the legacy satellite user u can be calculated as

γ_u = |w_u^H h_u s_u|² / ( Σ_{i≠u} |w_u^H h_i s_i|² + Σ_{k=1}^{K} |w_u^H f_k t_k|² + σ² ‖w_u‖² ).

The MMSE vector w_u is designed by minimizing the mean-square error (MSE) between the processed signal and the transmitted symbol. Define the MSE function for the legacy satellite user u as

J(w_u) = E[ |w_u^H y − a_u|² ].

Then the MMSE vector w_u, which minimizes the MSE function J(w_u), is represented as

w_u = ( E[y y^H] )^{−1} h_u s_u.

Based on the analysis in [42], the SINR of legacy satellite user u can be evaluated as

γ_u = |h_u|² s_u^H ( Σ_{i≠u} |h_i|² s_i s_i^H + Σ_{k=1}^{K} |f_k|² t_k t_k^H + σ² I_N )^{−1} s_u. (9)

Accordingly, the achievable rate for the u-th legacy satellite user can be written as r_u = (1/N) log₂(1 + γ_u). Similarly, we use the MMSE detector to recover the signals transmitted by the IoT users. Based on the above analysis, the SINR of the k-th IoT user is given by

γ_k = |f_k|² t_k^H ( Σ_{u=1}^{U} |h_u|² s_u s_u^H + Σ_{j≠k} |f_j|² t_j t_j^H + σ² I_N )^{−1} t_k. (11)

Then, the achievable rate for the k-th IoT user can be written as r_k = (1/N) log₂(1 + γ_k). From (9) and (11), we can find that the legacy satellite user's SINR depends on the spreading codes of the primary system as well as those of the secondary system. Because of the randomness of the spreading codes, it is difficult to exactly calculate the SINR of the legacy satellite user and to allocate resources for the two systems. Thus, in Section 3, we will analyze the asymptotic SINRs for both the primary and secondary transmissions.
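The per-user MMSE output SINR can be computed numerically from the effective signatures (channel gain times spreading code). The following sketch uses the standard matrix-inversion identity behind the evaluated SINR referenced in (9); names are illustrative, and it is not claimed to be the paper's exact expression.

```python
import numpy as np

def mmse_sinrs(G, sigma2):
    """Output SINR of the linear MMSE detector for each column of G.

    G is N x M; column g_u is the effective signature of user u.
    Uses SINR_u = g_u^H (R - g_u g_u^H)^{-1} g_u with R = G G^H + sigma2*I,
    i.e., the desired user's signature is removed from the covariance.
    """
    N, M = G.shape
    R = G @ G.conj().T + sigma2 * np.eye(N)
    out = np.empty(M)
    for u in range(M):
        g = G[:, u]
        # Solve (R - g g^H) x = g, then SINR = g^H x
        out[u] = np.real(g.conj() @ np.linalg.solve(R - np.outer(g, g.conj()), g))
    return out
```

For a single user the interference-free case is recovered: the SINR reduces to ‖g‖²/σ², which gives a simple correctness check.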

Asymptotic Analysis
In this section, we analyze the asymptotic SINRs for both primary and secondary systems in order to allocate the resource for the two systems.
We consider a large cognitive LEO SatCom system, in which the number of users is large, i.e., U → ∞ and K → ∞. To support a large number of users, it is reasonable to scale up N as well, i.e., N → ∞, but U/N converges to a constant parameter α 1 , which represents the legacy satellite system load. Similarly, we have that K/N converges to a constant parameter α 2 , which represents the IoT system load.
To analyze the asymptotic SINR, we first present the following proposition.
Proposition 1 (Theorem 3.1 of [42]). For a symbol-synchronous multi-access spread-spectrum system with spreading gain N, the SINR γ_1 of user 1 in an M-user system is deterministic and approximately satisfies

γ_1 = p_1 / ( σ² + (1/N) Σ_{i=2}^{M} I(p_i, p_1, γ_1) ), (13)

where

I(p, P, γ) = pP / (P + pγ), (14)

and p_i denotes the received power of user i.
From Proposition 1, the asymptotic SINR is determined by the received power of each user. Based on (13) and (14), the asymptotic SINR of the u-th legacy satellite user satisfies

γ_{u,p} = q_u / ( σ² + (1/N) Σ_{i≠u} I(q_i, q_u, γ_{u,p}) + (1/N) Σ_{k=1}^{K} I(p_k, q_u, γ_{u,p}) ), (15)

where q_i and p_k denote the received powers of the i-th legacy satellite user and the k-th IoT user, respectively. As both the primary and secondary systems are based on CDMA, the method for analyzing the asymptotic SINR of the IoT users is the same as that for the legacy satellite system. Thus, the asymptotic SINR of the k-th IoT user is given by

γ_{k,s} = p_k / ( σ² + (1/N) Σ_{u=1}^{U} I(q_u, p_k, γ_{k,s}) + (1/N) Σ_{j≠k} I(p_j, p_k, γ_{k,s}) ). (16)

From (15) and (16), we find that the asymptotic SINRs of both the primary and secondary systems are related to the received powers of all the links. To simplify the resource allocation, we assume that the received powers of all legacy satellite users are identical and equal to q, which is achieved by perfect power control. Similarly, the received power of each IoT user is p. Then, (15) can be simplified as

γ_p = q / ( σ² + α_1 I(q, q, γ_p) + α_2 I(p, q, γ_p) ), (17)

which gives

γ_p = q / ( σ² + α_1 q/(1 + γ_p) + α_2 pq/(q + pγ_p) ). (18)

Similarly, (16) can be simplified as

γ_s = p / ( σ² + α_1 pq/(p + qγ_s) + α_2 p/(1 + γ_s) ). (19)
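The fixed points (18) and (19) have no simple closed form, but plain iteration converges when started from the interference-free SNR: the right-hand side is monotone in γ, so the iterates decrease monotonically to the solution. A sketch, with illustrative names:

```python
def tse_hanly_I(p, P, beta):
    """Effective interference I(p, P, beta) = p*P / (P + p*beta), as in (14)."""
    return p * P / (P + p * beta)

def asymptotic_sinr(own, q, p, alpha1, alpha2, sigma2, iters=1000):
    """Iterate gamma = own / (sigma2 + a1*I(q,own,gamma) + a2*I(p,own,gamma)).

    With own = q this is the legacy fixed point (18); with own = p it is
    the IoT fixed point (19). Starting from own/sigma2 (the upper bound),
    the sequence decreases monotonically to the unique positive root.
    """
    beta = own / sigma2
    for _ in range(iters):
        beta = own / (sigma2
                      + alpha1 * tse_hanly_I(q, own, beta)
                      + alpha2 * tse_hanly_I(p, own, beta))
    return beta
```

With α_1 = α_2 = 0 the fixed point reduces to the single-user SNR, and with only legacy load the result can be checked against the quadratic it implies.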

Joint Resource Allocation in Cognitive LEO SatCom Systems
In this section, we formulate and solve resource allocation problems to maximize the sum rate of all IoT users under certain constraints. First, we investigate the optimal IoT receive power and then the optimal joint legacy satellite user and IoT user receive powers. Next, based on the derived optimal powers, we investigate the optimal number of IoT users. Finally, we discuss the effect of non-synchronism between the primary and secondary systems.

Resource Allocation
To protect the legacy satellite service, we have to guarantee that its SINR is no less than the target value β*, i.e., γ_p ≥ β*, and that the IoT user receive power is no more than the limit P̄, i.e., p ≤ P̄. Based on the above analysis, our first resource allocation problem tries to maximize the sum rate of the IoT system, which can be formulated as

P1: max_{p ≥ 0} (K/N) log₂(1 + γ_s) s.t. γ_p ≥ β*, p ≤ P̄.

From (19), it is difficult to get a closed-form expression for γ_s. Similarly, a closed-form expression for γ_p is also difficult to obtain. To overcome this problem, we provide the following lemma.

Lemma 1. For a fixed legacy receive power q, the asymptotic IoT SINR γ_s is monotonically increasing in the IoT receive power p, i.e., if p_1 > p_2, then γ_{1,s} > γ_{2,s}.

Proof. Please see Appendix A.
From Lemma 1, the objective function increases with the growth of p. However, increasing p decreases the asymptotic SINR of the legacy satellite users. Similar to Lemma 1, if q_1 > q_2, then γ_{1,p} > γ_{2,p}. Thus, when q is small, the SINR requirement constraint γ_p ≥ β* is dominant while the power limit is inactive. Ignoring the power constraint, we can get

P2: max_{p ≥ 0} (K/N) log₂(1 + γ_s) s.t. γ_p ≥ β*.

For the optimization problem P2, the sum rate of the IoT users grows with p, while γ_p decreases. Thus, the objective function is maximized when the constraint holds with equality, i.e., γ_p = β*. Accordingly, substituting γ_p = β* into (18) and solving for p, we can derive the optimal IoT receive power as

p* = Δq / (α_2 q − β* Δ), with Δ = q/β* − σ² − α_1 q/(1 + β*). (21)

When q is large enough, p reaches the power limit before γ_p decreases to β*. Ignoring the SINR constraint, we can get

P3: max_{p ≥ 0} (K/N) log₂(1 + γ_s) s.t. p ≤ P̄.

Due to the fact that γ_s increases with p, the optimal receive power for the IoT users is p = P̄.
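The two regimes above can be combined into a single power rule: solve the SINR-constrained power from (18) at γ_p = β* and clip it to [0, P̄]. A sketch of the closed form behind (21), with illustrative variable names:

```python
def optimal_iot_power(q, beta_star, alpha1, alpha2, sigma2, p_max):
    """IoT receive power maximizing the IoT rate for a given legacy power q.

    Sets gamma_p = beta_star in the fixed point (18), solves for p, and
    clips the result to [0, p_max].
    """
    # Interference margin left for the IoT system when gamma_p = beta_star
    margin = q / beta_star - sigma2 - alpha1 * q / (1.0 + beta_star)
    if margin <= 0.0:
        return 0.0            # legacy QoS leaves no room: IoT must stay silent
    denom = alpha2 * q - margin * beta_star
    if denom <= 0.0:
        return p_max          # SINR constraint never binds: power limit rules
    return min(p_max, margin * q / denom)
```

Substituting the returned power back into the legacy SINR expression of (18) with γ_p = β* verifies that the constraint holds with equality in the SINR-limited regime.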

Joint Resource Allocation
From (21), we know that the optimal receive power of the IoT users depends on the received power of the legacy satellite system. With perfect power control, we can adjust the receive power q of the legacy satellite users to maximize the sum rate of the IoT users. Thus, we formulate the joint resource allocation problem as

P4: max_{p, q ≥ 0} (K/N) log₂(1 + γ_s) s.t. γ_p ≥ β*, p ≤ P̄.

For every given q, there is an optimal IoT receive power p that maximizes the IoT system's sum rate. To solve the joint resource allocation problem, we characterize the influence of q on γ_s under the optimal IoT receive power p, based on which we provide the following analysis.
Lemma 2. When q ∈ [q_0, q_1), the asymptotic IoT SINR γ_s under the corresponding optimal IoT receive power increases with q.

Proof. The proof is given in Appendix B.
Lemma 3. When q ∈ [q_1, ∞), the asymptotic IoT SINR γ_s under the corresponding optimal IoT receive power, p = P̄, decreases with q.

Proof. The proof is given in Appendix C.
Note that the values of q_0 and q_1 will be discussed later. From Lemmas 2 and 3, we know that γ_s increases with q when q ∈ [q_0, q_1) and decreases with q when q ∈ [q_1, ∞). This indicates that the sum rate of the IoT system is quasi-concave over the legacy satellite receive power. As a result, the sum rate of the IoT system is maximized at q = q_1 with p = P̄. Meanwhile, if q < q_0, the IoT users cannot transmit signals, i.e., p = 0.
Next, we investigate the values of q_0 and q_1. When the legacy satellite system cannot tolerate any interference from the IoT system, the receive power of the legacy satellite system is minimal, i.e., setting p = 0 in (18) with γ_p = β* yields

q_0 / ( σ² + α_1 q_0/(1 + β*) ) = β*. (24)

Solving this equation for q_0, we can get

q_0 = β* σ² (1 + β*) / (1 + β* − α_1 β*). (25)

When q = q_1, both (21) and (23) hold, i.e., the SINR-constrained power given by (21) reaches the limit P̄, which gives

p*(q_1) = P̄. (26)

By solving (26), we can obtain q_1. As aforementioned, the optimal joint resource allocation scheme is related to the primary and secondary system loads α_1 and α_2. Intuitively, growth in the number of IoT users contributes new increments to the sum rate but also increases the interference. Thus, there exists a tradeoff in the number of IoT users. In the following, we aim to find the optimal number of IoT users that maximizes the sum rate of the IoT system for a given N.
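The thresholds can be computed directly: q_0 from the closed form (25), and q_1 by a one-dimensional root search on the condition (26). A sketch with illustrative names; the search upper bound is an assumption.

```python
def q_zero(beta_star, alpha1, sigma2):
    """Closed form (25): minimum legacy power meeting beta_star with p = 0."""
    denom = 1.0 + beta_star - alpha1 * beta_star
    if denom <= 0.0:
        raise ValueError("legacy load too high for the target SINR")
    return beta_star * sigma2 * (1.0 + beta_star) / denom

def q_one(beta_star, alpha1, alpha2, sigma2, p_max, hi=1e6, iters=200):
    """Bisect for q1 where the SINR-constrained IoT power of (21) reaches p_max."""
    def p_star(q):
        margin = q / beta_star - sigma2 - alpha1 * q / (1.0 + beta_star)
        if margin <= 0.0:
            return 0.0
        denom = alpha2 * q - margin * beta_star
        return float("inf") if denom <= 0.0 else margin * q / denom
    lo = q_zero(beta_star, alpha1, sigma2)   # p*(q0) = 0 < p_max
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if p_star(mid) < p_max:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Bisection is adequate here because p*(q) is zero at q_0 and grows with q past the crossing point; a closed form for q_1 could also be worked out from (21) but is less transparent.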

Optimal IoT User Number
Here, we illustrate how to find the optimal number of IoT users that maximizes the sum rate of the IoT transmissions, subject to the receive power constraint and the primary SINR requirement, which can be mathematically formulated as

P5: max_{p, q ≥ 0, K ∈ ℤ⁺} (K/N) log₂(1 + γ_s) s.t. γ_p ≥ β*, p ≤ P̄.

For every given K, there exists an optimal pair (p, q) that maximizes the sum rate of the IoT system. For given p and q, the optimization over K involves the log function and the non-closed-form γ_s, so it is difficult to obtain the optimal K in closed form. In fact, when K is small, the sum rate of the IoT users increases with K. When K is large, the sum rate of the IoT users decreases with K, since in this case the SINR of the IoT users dominates the sum rate of the IoT system. Based on this fact, we can apply a one-dimensional search to find the optimal tuple (p*, q*, K*) solving problem P5. Specifically, for each K, we calculate the optimal p* and q*. By searching over different K, we obtain multiple tuples (p*, q*, K), and by comparing the corresponding sum rates of the IoT system we obtain the optimal tuple (p*, q*, K*). The details of the proposed algorithm for solving problem P5 are summarized in Algorithm 1. Specifically, we first initialize K and ζ, where the initial K is a small number and ζ is greater than one. Then, we iteratively update p, q, and K until the objective function begins to decrease.
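The search described above can be sketched as follows: for each candidate K, take q* = q_1 (the point where the IoT power limit starts to bind) and p* = P̄, evaluate the IoT fixed point (19), and keep the K with the largest sum rate. This is a simplified scan rather than the paper's exact Algorithm 1 (which grows K multiplicatively by ζ), and all helper names are illustrative.

```python
import math

def iot_sum_rate(K, N, U, beta_star, sigma2, p_max):
    """IoT sum rate (K/N)*log2(1 + gamma_s) at the jointly optimal (p, q)."""
    a1, a2 = U / N, K / N
    def p_star(q):  # SINR-constrained IoT power from (21)
        margin = q / beta_star - sigma2 - a1 * q / (1.0 + beta_star)
        if margin <= 0.0:
            return 0.0
        denom = a2 * q - margin * beta_star
        return float("inf") if denom <= 0.0 else margin * q / denom
    # q1 by bisection: the q at which p*(q) reaches the limit p_max
    lo, hi = 1e-9, 1e7
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if p_star(mid) < p_max else (lo, mid)
    q = 0.5 * (lo + hi)
    # IoT fixed point (19) with p = p_max
    g = p_max / sigma2
    for _ in range(1000):
        g = p_max / (sigma2 + a1 * q * p_max / (p_max + q * g)
                     + a2 * p_max / (1.0 + g))
    return (K / N) * math.log2(1.0 + g)

def best_iot_user_number(N, U, beta_star, sigma2, p_max, k_max):
    """One-dimensional search over K, in the spirit of Algorithm 1."""
    rates = {K: iot_sum_rate(K, N, U, beta_star, sigma2, p_max)
             for K in range(1, k_max + 1)}
    return max(rates, key=rates.get)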

Non-Synchronous Uplink
The analysis above assumes a synchronous uplink cognitive LEO satellite spectrum sharing system. In this section, we extend it to the non-synchronous case.
For the non-synchronous uplink cognitive LEO SatCom system, the total received signal at the satellite base station can be written as

y = H_1 (h • c_1) + H_3 (f • c_3) + z,

where H_3 = [t_L, t_{L+1}, · · · , t_K, t_1, t_2, · · · , t_{L−1}] is the N × K matrix formulated by the spreading codes of the IoT users, c_3 = [b_L, b_{L+1}, · · · , b_K, b_1, b_2, · · · , b_{L−1}]^T is the correspondingly reordered vector of data symbols transmitted by the IoT users, and L is the synchronization error between the primary and secondary systems. Based on the discussions in Section 2, the MMSE output for the non-synchronous uplink cognitive LEO SatCom system can be obtained in the same way and has the same structure as in the synchronous case. Thus, although the primary and secondary systems are not perfectly synchronized, the analysis and resource allocation scheme for the non-synchronous uplink system are similar to those of the synchronous uplink system and are thereby omitted.

Simulation Results
In this section, simulation results are presented to evaluate the performance of the proposed cognitive LEO satellite communication system. The spreading gain for the legacy and IoT systems is set to N = 256, which is large enough to verify the asymptotic results obtained in this paper. Although the LEO satellite system involves the shadowed Rice channel, the resource allocation schemes in this paper operate on the receive signal-to-noise ratio (SNR). Since the performance of the legacy and IoT systems is determined by the SINR, we use the receive SNRs, i.e., p/σ² and q/σ², to evaluate the performance of the proposed cognitive LEO satellite communication system. Specifically, the white Gaussian noise power is normalized to σ² = 1. We set the target SINR threshold β* for the legacy satellite system to 5 dB. To show the effectiveness of our proposed framework and algorithms, we set two benchmarks, which are illustrated as follows:
• Benchmark 1: To show the advantages of spectrum sharing, we show the performance of the legacy satellite system without spectrum sharing, whose SINR with the MMSE detector is given by γ_l = q/(σ² + α_1 q/(1 + γ_l)). Accordingly, the sum rate of the legacy satellite users is given by (U/N) log₂(1 + γ_l).
• Benchmark 2: To show the gain of the joint design, the legacy receive SNR q/σ² is fixed rather than jointly optimized, and only the IoT receive power is allocated.
Firstly, we evaluate the asymptotic SINR of the legacy satellite system by comparing the simulated SINR with the theoretical SINR under the MMSE detector. We set U = 50, K = 100, and p/σ² = 20 dB. The simulated SINR is calculated from the random spreading codes assigned to each user in the legacy satellite system, as shown in (9). The theoretical SINR with the MMSE detector can be calculated by (18). As shown in Figure 2, the dense small circles are the simulated SINRs, while the big circles are the theoretical SINRs for the MMSE detector. It is seen that the theoretical SINR can be regarded as the statistical mean of the simulated SINRs.
Thus, it is reasonable to formulate the joint resource allocation problem based on the asymptotic SINR. In addition, we can find that the interference from the IoT users and the other legacy satellite users leads to about 4 dB of SINR loss with the MMSE detector compared with the receive SNR of the considered legacy satellite user. Figures 3 and 4 present the sum rate of the IoT users and the sum rate of the legacy satellite users w.r.t. the legacy satellite receive SNR q/σ², respectively. Here, we set U = 100 and K = 200. From Figure 3, it can be found that the sum rate of the IoT users is quasi-concave over the legacy satellite receive SNR, which is consistent with the results in Lemmas 2 and 3. In addition, from Figures 3 and 4, we can find that when q/σ² < 6.5 dB, the sum rate of the IoT users is equal to 0, while the sum rate of the legacy satellite users increases with q/σ². The main reason is that when q/σ² < 6.5 dB, we have q < q_0, which means that the legacy satellite system cannot tolerate the interference of the IoT system. For P̄/σ² = 10 dB, the sum rate of the IoT users increases with q/σ² when q/σ² < 12.5 dB. In this case, the sum rate of the legacy satellite users remains unchanged, which indicates that the legacy SINR requirement constraint dominates the power allocation. When the IoT receive power limit is not reached, the curves for each value of P̄/σ² coincide and all of the interference margin can be exploited by the IoT system. Note that the interference margin refers to the tolerable interference of the legacy satellite system. When q/σ² > 12.5 dB, the receive power limit dominates the power allocation. Note that the turning points in the two figures depend on the parameter design; with other parameters, the turning points may change.
Figure 5 presents the sum rates of the IoT and legacy satellite users w.r.t. the legacy satellite receive SNR q/σ² for different legacy user numbers with P̄/σ² = 10 dB. As aforementioned, Benchmark 1 indicates the performance of the legacy satellite users without spectrum sharing. From Figure 5, it is obvious that the proposed spectrum sharing scheme achieves a higher sum rate, which indicates its effectiveness. Meanwhile, from this figure, we can find that the spectrum sharing scheme yields greater performance gains when the number of legacy satellite users is small. The main reason is that the available interference margin increases as the legacy user number decreases. In addition, when q/σ² < 6.5 dB, the two schemes have the same performance, since the legacy satellite system cannot tolerate the interference of the IoT system and the sum rate of the IoT users is equal to 0. Next, we evaluate the optimal legacy satellite receive power and the optimal sum rate of the IoT users under the optimized joint power allocation scheme by varying the maximum receive SNR of the IoT users P̄/σ², as shown in Figures 6 and 7. Meanwhile, to show the effectiveness of the joint resource allocation scheme, we also plot the curves of Benchmark 2 in Figure 7, where we set q/σ² = 20 dB. In the two figures, we set K = 150. From Figure 6, we can find that as P̄/σ² grows, the optimal receive SNR of the legacy users, i.e., q_1/σ², also increases. This indicates that the sum rate of the IoT users is maximized at a higher q_1/σ², which is also shown in Figure 3. Meanwhile, the optimal receive SNR increases with α_1. In Figure 7, we can observe that, for any α_1, as P̄/σ² increases, the optimal IoT sum rate increases, since a higher P̄ helps the IoT users exploit more of the interference margin.
Furthermore, our proposed joint resource allocation scheme performs better than Benchmark 2 for any legacy system load, which shows the effectiveness of the joint design. In addition, the optimal IoT sum rate decreases with α_1, since the available interference margin decreases as α_1 increases.

Finally, the optimal sum rate of the IoT users w.r.t. the number of IoT users is shown in Figure 8, where we set U = 50. For every given K, the sum rate of the IoT users is calculated with the optimal p and q that maximize the IoT sum rate. It is observed that the optimal sum rate of the IoT users is quasi-concave over the number of IoT users: the sum rate equals the user number times each user's rate, but increasing the user number decreases the SINR in (19), and hence each user's throughput. Since the user number must be an integer, we can traverse the values of K near the peak in Figure 8 to find the IoT user number that maximizes the IoT sum rate. For example, from Figure 8, we can observe that when P̄/σ² = 25 dB, the optimal secondary user number is around 180. Meanwhile, in this setup, the receive SNR limit P̄/σ² has only a minor effect on the optimal number of IoT users, due to the dominant effect of the QoS requirement of the legacy satellite users.

Discussions
Due to the difficulty of launching a dedicated satellite and allocating a dedicated spectrum for IoT communications, we propose a framework that uses the legacy satellite and its authorized spectrum to support IoT transmissions. Due to the interference between the legacy satellite users and the IoT users, we propose a joint resource allocation scheme to balance the two types of transmissions, where the legacy satellite users have higher priority. Simulation results show the advantages and effectiveness of the spectrum sharing and joint resource allocation scheme. In a nutshell, the main contributions of this paper are summarized as follows.

• The cognitive LEO SatCom system is proposed to support IoT transmissions, which can effectively enhance coverage and spectrum efficiency.
• Due to the randomness of spreading codes, we use random matrix theory to analyze the asymptotic SINRs and obtain the achievable rates for the IoT and legacy systems.
• The receive powers of the legacy and IoT transmissions are jointly optimized, and closed-form expressions for the optimal receive powers of the two systems are derived.
• Extensive simulation results are provided to validate the effectiveness of the designed power allocation scheme, showing that IoT devices can transmit information while ensuring the performance of the primary system.

Conclusions
In this paper, we have proposed an uplink cognitive LEO SatCom system to enable IoT communication by sharing the spectrum of the legacy satellite system. Specifically, the legacy satellite users and the IoT users are served by the same LEO satellite in the CDMA manner. Considering the randomness of spreading codes, we have analyzed the asymptotic SINRs using random matrix theory and obtained the achievable rates for both the legacy and IoT systems. We have proposed joint resource allocation schemes for the uplink cognitive LEO SatCom system that maximize the sum rate of the IoT users while protecting the legacy primary satellite users. It has been proved that the sum rate of the IoT users is quasi-concave over the legacy satellite user receive power, based on which the optimal receive powers for the two systems are derived. Simulation results have verified our theoretical analysis and validated the effectiveness of our scheme in achieving the optimal sum rate of the IoT users in the cognitive LEO SatCom system.

Appendix C. Proof of Lemma 3
When q is large, i.e., q ∈ [q_1, ∞), the SINR constraint can be ignored, and thus the optimal received power of the IoT users is p = P̄.