Post-Quantum Cryptosystems for Internet-of-Things: A Survey on Lattice-Based Algorithms

Abstract: The latest quantum computers are able to solve the mathematical problems underlying classical cryptography, in particular to recover secret encryption keys, leaving networks vulnerable to attack. They can solve these problems almost instantaneously, compared to the billions of years of computation needed by traditional computing machines. Researchers therefore advocate the development of novel strategies for data encryption in the post-quantum era. Lattices have been widely, and somewhat peculiarly, used in cryptography: (a) in cryptanalysis, where lattice approximation is used to break cryptosystems; and (b) in cryptography, where computationally hard lattice problems (non-deterministic polynomial-time hardness) are used to construct stable cryptographic functions. The dominant features of lattice-based cryptography (LBC), which keep it ahead in the post-quantum league, include resistance to quantum attack vectors, high concurrent performance, parallelism, security under worst-case intractability assumptions, and solutions to long-standing open problems in cryptography. While these methods offer provable security for classical cryptosystems in theory and experimentation, their implementation on energy-restricted Internet-of-Things (IoT) devices requires careful study of regular lattice-based implementations and their simplification into lightweight lattice-based cryptography (LW-LBC). This streamlined post-quantum approach is well suited to levelled IoT device security. The key aim of this survey is to provide the scientific community with comprehensive information on the elementary mathematical facts, as well as to address real-time implementation and hardware architecture.


State-of-the-Art
Due to recent developments in the field of quantum computers, the search for quantum-resistant cryptographic algorithms is taking classical cryptography to the next level [1]. Using such machines, many of today's most popular cryptosystems can be cracked by Shor's algorithm [2]. This is an algorithm that uses quantum period-finding to factor large integers, effectively solving the discrete logarithm and factoring problems on which many current cryptographic algorithms rely [3][4][5]. Quantum computation is still in its infancy and is limited to a handful of mathematical operations that can be performed reliably [6]. Sufficiently many logical qubits (a logical qubit is stable over time and can be made up of hundreds or thousands of today's physical qubits) still need to be built before cryptographic codes can be fully broken [7]. In addition to all previous and continuing advances, quantum-resistant cryptography algorithms need to be rigorously checked against old and current data formats and sources to make them compatible with all platforms [8].
Predominantly, state-of-the-art public-key algorithms are based on related hard problems, three of which top the list [9]: the discrete logarithm problem, the integer factorization problem, and the pre-eminent elliptic curve discrete logarithm problem [10]. All three will be broken by Shor's algorithm running on a quantum computer. This is undoubtedly concerning, considering that these problems are commonly used to ensure the protected sharing of confidential information across the Internet, the creation of digital signatures, and the securing of other links over unsafe networks [11].
In view of the inherent shortcomings and major disadvantages involved in the implementation of an effective and smooth Quantum Key Distribution (QKD) [12], the quest for a classic, non-quantum cryptographic algorithm that will operate on current real-time infrastructure is an increasingly growing field of study. These quantum-robust algorithms are called Post-Quantum Cryptography (PQC) algorithms and are assumed to remain stable after the availability of functional large-scale quantum computing machines [13], as depicted in Figure 1. Every modern cryptosystem must be combined with existing protocols, such as transport layer security, and has to weigh:
• The size of the encryption keys and the signatures.
• The time taken to encrypt and decrypt at either end of a communication line, or to sign messages and validate signatures.
• For each proposed alternative, the amount of traffic sent over the wire to complete encryption or decryption or to transmit a signature.
Many NIST (National Institute of Standards and Technology) proposal submissions are still under review. Others have been broken or excluded from the process; some are more conservative or demonstrate how far classical cryptography could be advanced so that it could not be cracked by a quantum computer at a fair expense [14]. Most post-quantum cryptographic constructions, however, can be categorized into these families: lattice-based, multivariate, hash-based (signatures only), and code-based. These categories are discussed in Section 2. For certain algorithms, though, there is a concern that they might be too inconvenient to use in Internet-of-Things (IoT) networks [1]. New cryptographic schemes must also integrate with current protocols, such as Secure Shell (SSH) or Transport Layer Security (TLS).
Designers of post-quantum cryptosystems need to take the following attributes into account for IoT use-cases:
• Latency induced by encryption and decryption at both ends of the communication line, over devices ranging from slow, memory-limited IoT nodes to large, fast servers.
• Limited sizes of public keys and signatures, for ultra-low latency.
• A clear network architecture that facilitates cryptanalysis and the detection of vulnerabilities that could be exploited in a dense IoT network.
• Seamless integration with the existing infrastructure.
Post-Quantum protocols include a rich collection of primitives that can be used to solve the problems presented by implementation across different computing platforms (e.g., cloud versus IoT ecosystems) and for various use cases [15][16][17]. This involves the ability to compute on encrypted data, to provide protocols that are resilient (in a broader sense than ever before) against powerful attackers on asymmetric-key cryptography (attackers equipped with quantum machines and algorithms), and to provide security beyond the context of classical cryptography [18]. Indeed, PQ cryptosystems are committed to strengthening the protection [19] of mission-critical infrastructures, especially in energy, medical, surveillance, and space exploration. Due to their flexibility and scalability, PQ cryptosystems are also implemented in next-generation 5G/NB-IoT networks, for secure communications, and for electric vehicle charging infrastructure [20][21][22].
This survey has the following contributions. In Section 1, we discuss the state-of-the-art of Lattice-Based Cryptography (LBC), including the review papers to date. Section 2 elaborates the wider implementation of post-quantum cryptography (PQC), including Hash-Based Signatures, Code-Based Signatures, Multivariate Cryptography, and Lattice-Based Cryptography. In Section 3, we look at the fundamental mathematics and security proofs of LBC; moreover, it discusses the Ajtai-Dwork, Learning with Errors (LWE), and N-th degree Truncated polynomial Ring Units (NTRU) cryptosystems in detail. The extended security proofs of LBC against quantum attacks are discussed in Section 4, whereas Section 5 deals with the implementation challenges of LBC, in both the software and hardware domains, for authentication, key sharing, and digital signatures. In addition, we study the application of LBC to power-restricted IoT devices, i.e., Lightweight Lattice-Based Cryptography (LW-LBC). To conclude the survey, we review the implementation of LBC at the FPGA level for the real-time experimentation of post-quantum cryptography. The key motivation of this survey is to provide comprehensive information on the future issues of quantum-robust cryptography for IoT devices through LW-LBC.

Introduction to Post-Quantum Cryptography (PQC)
The PQC algorithms, as summarized in Figure 2, are mainly implemented by either Hash-Based Signature Algorithms, Code-Based Cryptography, Multivariate Cryptography Protocols, or by Lattice-Based Cryptography. In the following section, we shall discuss the PQC algorithms briefly.

Hash-Based Signatures
A hash-based signature scheme starts from a one-time signature (OTS), i.e., a signature scheme in which each key pair may only be used to sign a single message [23]. If an OTS key pair signs two different messages, the network is threatened, and an attacker can easily forge signatures and expose the customer's personal details. Merkle used the scheme of Lamport [24] and its variations. Merkle [25,26] recommended that a binary hash tree, later named the Merkle tree, be used to create a many-time signature scheme. In a Merkle tree, the leaves are the hash values of the OTS public keys. Each inner node is computed as the hash of the concatenation of its two child nodes. If a collision-resistant hash function is used, this ensures that all leaf nodes, i.e., all OTS public keys [27], can be authenticated using the root node.
The root node of the Merkle tree becomes the public key in a Merkle signature scheme (MSS), and the set of all OTS secret keys becomes the secret key. For hash-based OTS, the secret keys are random bit strings. Therefore, instead of storing all OTS secret keys, one can store a short seed and (re)generate them using a cryptographically secure pseudo-random generator [28]. To prevent reuse of OTS key pairs, they are used according to the order of the leaves, starting with the leftmost leaf [29]. To do this, the scheme keeps the index of the last used OTS key pair as internal state.
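The Merkle construction described above can be illustrated with a minimal Python sketch. The helper names are hypothetical, and a toy 1-bit Lamport-style key pair stands in for a full OTS scheme:

```python
import hashlib
import os

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def ots_keygen():
    """Toy 1-bit Lamport-style OTS key pair (illustrative only)."""
    sk = [os.urandom(32), os.urandom(32)]   # secrets for bit 0 and bit 1
    pk = h(h(sk[0]) + h(sk[1]))             # compressed OTS public key
    return sk, pk

def merkle_root(leaves):
    """Hash sibling pairs upward until a single root node remains."""
    level = list(leaves)
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

keys = [ots_keygen() for _ in range(8)]      # 8 OTS key pairs -> 8 leaves
root = merkle_root([pk for _, pk in keys])   # the MSS public key
```

In a full MSS, each signature would additionally carry the authentication path of sibling hashes from the used leaf up to the root, allowing the verifier to recompute and check the public root node.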

Code-Based Signatures
Code-based cryptography is a promising candidate for the diversification of today's public-key cryptosystems [30], most of which rely on the hardness of either the factorization or the discrete logarithm problem [31]. Unlike those public-key algorithms, code-based cryptography is based on the problem of decoding unknown error-correcting codes, which is considered NP-hard [32]. There are two classic code-based cryptosystems, named after their inventors, Robert McEliece [33] and Harald Niederreiter [34]. Compared to traditional cryptosystems, such as RSA [35], both share the issue of massive key lengths, which renders their implementation impractical on embedded devices with very limited resources.
For plain-text encryption, the input message is converted into a code-word, either by adding random errors to the message or by encoding the message in the error sequence [36]. Decryption restores the plain-text by removing the errors or by retrieving the original input message from them. It is, therefore, important to conceal the algebraic structure of the code, essentially disguising it as an anonymous generic code [37]; only an adversary knowing the particular code used would be able to decipher the message.
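The encode-perturb-decode cycle can be sketched in Python. As a deliberately simplified stand-in, a trivial repetition code replaces the secret error-correcting code; in a real McEliece-style scheme, the code's structure would additionally be hidden behind scrambling and permutation matrices:

```python
import random

def encode(bit, n=7):
    """Stand-in 'error-correcting code': an n-fold repetition codeword."""
    return [bit] * n

def add_errors(codeword, t):
    """Flip t random positions, like the error vector added at encryption."""
    noisy = list(codeword)
    for i in random.sample(range(len(noisy)), t):
        noisy[i] ^= 1
    return noisy

def decode(received):
    """Majority vote removes the errors and restores the plain-text bit."""
    return int(sum(received) > len(received) // 2)

ciphertext = add_errors(encode(1), t=3)   # codeword + deliberate errors
assert decode(ciphertext) == 1            # decoding strips the errors
```

The security intuition is that majority decoding is only easy because the code here is trivial; for a disguised generic code, recovering the message from the noisy codeword is the NP-hard decoding problem mentioned above.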

Multivariate Cryptography
The challenge of solving non-linear equation systems over finite fields is the foundation of Multivariate Cryptography schemes [38]. Generally speaking, seeking a solution for such systems is an NP-complete/NP-hard problem [39]. Patarin's Hidden Field Equations [40] is one of the fascinating cases, generalizing a suggestion by Matsumoto and Imai [41].
The same basic architecture is used for all Multivariate Public-Key Cryptosystems (MPKC), as they all depend on the use of multivariate polynomials over a finite field. The polynomial equations are of degree two in most cases, resulting in multivariate quadratic polynomials, solving which is still credited as NP-hard [42]. MQPKC cannot be solved more easily with Shor's algorithm than with a classical computer, since, in contrast to many other forms of PKC (public-key cryptography), it does not depend on any of the hard problems that Shor's algorithm solves. It is, therefore, also a potential candidate for a quantum-resistant encryption scheme [42].

Lattice-Based Cryptography
Miklos Ajtai [43] first demonstrated lattice-based algorithms, suggesting the design of stable cryptographic algorithms based on hard lattice problems (NP) [44]. A lattice-based public-key encryption scheme was adopted [44], but a scheme that was sufficiently robust and provably stable was not presented until 2005, when Oded Regev proposed his scheme, which uses both lattices and a generalization of the parity learning problem [44]. A lattice, given in an n-dimensional vector space, is a particular arrangement of points with a periodic structure and is used in a variety of fields. Lattice-based cryptographic algorithms are mostly based on either the closest vector problem (CVP) or the shortest vector problem (SVP). In most lattice-based cryptographic algorithms, the cryptographic building blocks used are very time-efficient and simple, while still providing security proofs based on worst-case hardness [45]. A number of the basic problems used in this type of cryptographic algorithm tend to be quantum-resistant, since they are not based on any of the hard problems solved by Shor's algorithm [46]. As a result, lattice-based cryptography is one of the few families of algorithms believed to hold promise as a post-quantum candidate.
For everyday Internet communications, generic cryptographic protocols, such as TLS and HTTPS [47], ensure that the communication between the two parties (sender and receiver) is authentic and private. The encryption algorithms that underpin these protocols, such as RSA [48,49], Diffie-Hellman [50,51], and elliptic curves [52][53][54], are all based on hard-to-solve mathematical problems and are categorized as asymmetric cryptographic primitives [55]. The time and resources needed to solve these problems are prohibitive, which ensures that data encrypted using current encryption algorithms is considered secure. Quantum computers [56,57] running Shor's factorization algorithm [58], however, can quickly break current asymmetric cryptographic primitives. Table 1 summarizes the impact of Shor's [59] and Grover's algorithms on typical classical cryptosystems [60]: public-key cryptography and similar algorithms are demolished by the development of quantum computers, leaving only symmetric cryptography (with greater key sizes) still usable and applicable, and even then only on a small scale [61]. Table 1. Summary of the widely deployed classical cryptographic systems and their security levels against the best pre-quantum and post-quantum attacks known [61].

Several security specialists and scholars agree that lattice-based cryptography is the path forward to deliver quantum-resistant encryption and, compared to the other post-quantum cryptography strategies, is the most vigorous, as in Table 2. Lattice-based cryptography uses high-dimensional algebraic constructs known as lattices [73,74], which are not easily defeated by quantum computing schemes. A lattice is an infinite arrangement of dots, and the most vital lattice-based computational problem is the Shortest-Vector Problem (SVP) [75,76], which requires finding the non-zero lattice point closest to a fixed central point in the space, called the origin. This is easy to solve in a two-dimensional grid, but, as the number of dimensions increases, even a quantum machine cannot solve the problem effectively. The fact that lattice-based cryptography provides fast, quantum-safe, fundamental primitives, and enables the construction of primitives previously thought impossible, makes it the front-runner candidate for IoT applications [77].
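To make the dimension dependence concrete, a two-dimensional SVP instance can be solved by brute-force enumeration in a few lines of Python (illustrative only; the search space grows exponentially with the dimension, which is exactly what the hardness of SVP rests on):

```python
import math
from itertools import product

def shortest_vector_2d(b1, b2, bound=10):
    """Brute-force SVP in 2D: enumerate small integer combinations x*b1 + y*b2."""
    best, best_len = None, math.inf
    for x, y in product(range(-bound, bound + 1), repeat=2):
        if (x, y) == (0, 0):
            continue                        # SVP asks for a NON-zero vector
        v = (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
        length = math.hypot(v[0], v[1])
        if length < best_len:
            best, best_len = v, length
    return best, best_len

# A skewed basis with determinant -1: it generates the whole grid Z^2,
# so the shortest non-zero lattice vector has length exactly 1.
v, l = shortest_vector_2d((5, 3), (7, 4))
```

Note that the long basis vectors (5, 3) and (7, 4) give no visual hint that the lattice they span contains a unit vector; discovering it requires the search itself.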

Foundations of Lattice-Based Cryptography
Lattice cryptography employs high-dimensional geometric structures, as seen in Figure 3, to conceal or mask the original details, generating a complexity that is deemed difficult to overcome, even with fault-tolerant quantum computers, without the original key. A lattice is an infinite grid of dots, most easily pictured in a 2-dimensional setting. LBC's security statement gives much greater faith in the long-lasting transfer of stable data in post-quantum cryptosystems that are directly based on hard lattice problems, for two reasons. First, certain lattice-theory problems are proven to be NP-hard [79]. NP-hardness, in computational complexity theory, characterizes a class of problems that are, informally, at least as hard as the hardest problems in NP [80]. Secondly, there is a worst-case to average-case reduction for the security of many lattice problems. This reduces the security proof requirement of a cryptosystem to a proof of average-case hardness. In designing cryptosystems that satisfy the worst-case requirements [43], this provides greater flexibility and stability.

A Simple Lattice Model
A full-rank lattice basis B, as in Equation (1), is defined as a set of n linearly independent vectors in a vector-space of dimension n.
A lattice L(B) is defined as the set of all integral combinations of the basis B of linearly independent vectors across a vector space of dimension n [78]. If lattices are to be used in cryptography, a succinct way to represent them is needed, as in Equation (2); for this, the 'basis of a lattice' is used. A basis is a small collection of vectors that can be used to reproduce any point in the grid that forms the lattice [81]. A good basis is one in which a given problem is easy to solve without complications; a basis is termed bad when solving a particular lattice problem with it is generally no easier than with a random basis, i.e., NP-hard.

Computational Complexities in Lattice
As discussed previously, cryptosystems can be proven secure by assuming the worst-case hardness of certain lattice problems. The most well-known computational problems on lattices [82] are as follows: Shortest Vector Problem (SVP): Given a basis B of a lattice L = L(B), find a shortest non-zero lattice vector, i.e., v ∈ L such that ‖v‖ = λ(L) [83].
Closest Vector Problem (CVP): Given a lattice basis B and a target vector t (not necessarily in the lattice), find the lattice point v ∈ L(B) closest to t in the vector space [84].
Shortest Independent Vectors Problem (SIVP): Given a lattice basis B ∈ Z^{n×n}, find n linearly independent lattice vectors S = [s_1, ..., s_n], where s_i ∈ L(B) for all i, minimizing the quantity ‖S‖ = max_i ‖s_i‖ [75].
Bounded Distance Decoding Problem (BDDP): Given a basis B of an n-dimensional lattice L = L(B) and a target point t ∈ R^n with the guarantee that dist(t, L) < λ_1(L)/(2γ(n)), find the unique lattice vector v ∈ L closest to t, i.e., ‖t − v‖ = dist(t, L) [85].
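The role of basis quality in these problems can be sketched with Babai's rounding procedure for CVP in dimension 2 (a simplified, hypothetical helper; practical attacks and decryption routines use the nearest-plane variant in much higher dimensions):

```python
def babai_round(basis, target):
    """Babai's rounding for CVP in dimension 2: write the target in the
    basis coordinates, round the coefficients, and map back to the lattice."""
    (a, b), (c, d) = basis                  # rows are the basis vectors
    det = a * d - b * c                     # 2x2 solve via Cramer's rule
    x = (d * target[0] - c * target[1]) / det
    y = (-b * target[0] + a * target[1]) / det
    xi, yi = round(x), round(y)
    return (xi * a + yi * c, xi * b + yi * d)

# With a good (orthogonal) basis of Z^2, rounding finds the closest point:
near = babai_round(((1, 0), (0, 1)), (2.6, 3.2))    # -> (3, 3)
# With a bad (skewed) basis of the SAME lattice, rounding lands far away:
far = babai_round(((1, 0), (100, 1)), (0.5, 0.4))
```

Both calls operate on the same lattice Z², yet only the short, nearly orthogonal basis makes rounding succeed; this asymmetry is exactly what a private (good) versus public (bad) basis exploits.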

Lattice-Based Cryptosystems
In the following section, we address the all-important lattice cryptosystems in depth, examining their security and their realistic real-time implementation. It has been shown that a post-quantum stable cryptosystem can be built from the hidden hyper-plane problem (HHP), with its security proof resting on the worst-case hardness of a one-way trapdoor function [86]. Although HHP admits a worst-case/average-case reduction, large key sizes are required for an acceptable security standard [87], due to the colossal cipher-text expansion. Therefore, this cryptosystem was never meant to replace current cryptosystems in an optimal and realistic way. We shall outline the basics of the Ajtai-Dwork cryptosystem [88], the Learning with Errors (LWE) cryptosystem [89], and NTRU [90]. As a first step, we summarize key generation, encryption, and decryption.

Key Generation
• Create a good basis R.
• Transform the good basis R into a bad basis Q through a uni-modular transformation.
• Publish the bad basis Q as the public basis and keep the good basis R as the private basis.

Encryption
• Choose any lattice vector w using the public basis Q and add a customized plain-text vector p to it.
• Send this new vector c = w + p as the cipher-text.

Decryption
• Using the private basis, compute the lattice vector w closest to the cipher-text c.
• Subtract this lattice vector w from the cipher-text to obtain the plain-text p = c − w.
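The three steps above can be sketched end-to-end in Python with a toy two-dimensional round trip. All parameters are purely illustrative: real schemes use hundreds of dimensions, and here the "good" private basis is simply the identity, so finding the closest lattice point reduces to coordinate rounding:

```python
import random

# Toy GGH-style round trip in dimension 2 (all parameters illustrative only).
good = ((1, 0), (0, 1))        # private basis R: short and orthogonal
bad = ((3, 1), (2, 1))         # public basis Q: skewed, same lattice (det 1)

def lattice_vec(basis, x, y):
    """Integer combination x*b1 + y*b2 of the two basis vectors."""
    return (x * basis[0][0] + y * basis[1][0],
            x * basis[0][1] + y * basis[1][1])

# Encryption: a random lattice vector w (public basis) plus a small error p.
p = (0.3, -0.4)                # plain-text vector, each entry below 1/2
w = lattice_vec(bad, random.randint(-5, 5), random.randint(-5, 5))
c = (w[0] + p[0], w[1] + p[1])

# Decryption: with the good basis (here the identity), rounding each
# coordinate finds the closest lattice point; subtracting it recovers p.
closest = (round(c[0]), round(c[1]))
recovered = (c[0] - closest[0], c[1] - closest[1])
```

Because each entry of p is smaller than 1/2, the closest lattice point to c is exactly w, so the recovered vector equals p up to floating-point error.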

Security Evaluation
Lattice-based cryptography holds a great deal of promise for the most realistic, stable post-quantum cryptosystem, given the worst-case/average-case reduction shown by Ajtai and Dwork [91], along with certain lattice problems that are proven to be NP-hard [92]. While several lattice-based cryptosystems improve simplicity, scalability, and robustness, their computational complexity is much higher than that of classical cryptosystems and multivariate cryptosystems. Indeed, it would almost seem as if lattice-based cryptographic research is a race towards both quantum-unbreakable security and performance, whereas multivariate-based research races towards security alone. With the introduction of q-ary lattices and the ideal lattice classes, this efficiency-versus-security gap is closing rapidly.

Ajtai-Dwork Cryptosystem
In the following section, we present the state-of-the-art Ajtai-Dwork cryptosystem (Algorithm 1), examine its security, and, at the end of the section, discuss its practical real-time implementation [93]. Decryption: the receiver evaluates ⟨s, y_i⟩; by linearity, if the result r ≈ 0, the cipher-text is decrypted as 0, otherwise as 1.

Security Evaluation
Ajtai and Dwork evaluated the security of this cryptosystem through two independent results [91]:
• whoever can distinguish between encryptions of 0 and 1 can also solve the HHP with the same data; this means that breaking the semantic security of their cryptosystem is at least as hard as solving HHP (a search-to-decision reduction);
• starting from any algorithm that solves HHP, it is possible to construct one that efficiently solves uSVP_γ in the worst case, for some γ = poly(n).
Combining these results, Ajtai and Dwork obtained a worst-case to average-case reduction, which means that breaking the cryptosystem is at least as hard as solving uSVP_γ [94].

Complexity and Implementation
Despite being a groundbreaking result from a theoretical point of view, this initial version of the cryptosystem is very inefficient when actually deployed within the hard bounds stated in the previous sections. In 1998, Nguyen and Stern demonstrated a heuristic attack [95] that works efficiently for limited parameters and recovers the private key from the public one. In this way, the researchers showed that, to prevent cryptanalytic attacks, the dimension n of the vector space should be several hundred, concluding that the Ajtai-Dwork cryptosystem is only of theoretical value without significant improvements. In subsequent work [Ajt05], Ajtai proposed a more efficient implementation of the cryptosystem, characterized by public keys and cipher-texts of sizes O(n²) and O(n), respectively. However, to date, no average-case to worst-case reduction is known for it and, although it is very similar to a lattice-based protocol, it is based on a Dirichlet-type problem that does not seem to be connected to any known lattice problems [96].

Learning-With Errors Cryptosystem
In this paragraph, we present the LWE cryptosystem (Algorithm 2), examine its security, and discuss its practical real-time implementation [97,98].

Algorithm 2 Learning-with-Errors (LWE) Cryptosystem
• Parameters: positive integers n, q, m; α ∈ R such that 0 < α < 1; and χ = D_Z, a discrete distribution over Z;
• Private Key: s ∈ Z_q^n chosen uniformly at random;
• Public Key: select m vectors a_1, ..., a_m ∈ Z_q^n independently according to the uniform distribution. In addition, draw e_1, ..., e_m ∈ Z from χ and output the public key {a_i, b_i}_{i=1}^m, with b_i = ⟨a_i, s⟩ + e_i mod q;
• Encryption: let µ ∈ {0, 1} be the bit to encode and choose a random subset S ⊂ [m]; to encrypt µ, one sends (a, b) = (∑_{i∈S} a_i, ∑_{i∈S} b_i + µ⌊q/2⌋);
• Decryption: if b − ⟨a, s⟩ is closer to 0 than to ⌊q/2⌋ mod q, output 0; otherwise, decrypt as 1.
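A toy version of Algorithm 2 can be written directly from these steps. The parameters below are far too small to be secure and are chosen only so that the accumulated error stays below q/4:

```python
import random

# Toy LWE round trip with insecure, illustrative parameters.
n, m, q = 8, 20, 97

s = [random.randrange(q) for _ in range(n)]            # private key
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]      # small errors from chi
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

def encrypt(bit):
    """Sum a random subset of LWE samples and add bit * floor(q/2)."""
    S = random.sample(range(m), m // 2)
    a = [sum(A[i][j] for i in S) % q for j in range(n)]
    c = (sum(b[i] for i in S) + bit * (q // 2)) % q
    return a, c

def decrypt(a, c):
    """Decide whether c - <a, s> is closer to 0 or to q/2 modulo q."""
    d = (c - sum(a[j] * s[j] for j in range(n))) % q
    return 0 if min(d, q - d) < q // 4 else 1
```

Decryption works because c − ⟨a, s⟩ equals the subset sum of the errors (plus µ⌊q/2⌋), and with at most ten errors of magnitude 1 this sum never crosses the q/4 decision boundary.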

Security Evaluation
By analyzing the encryption and decryption of the LWE cryptosystem, we may notice that the choice of parameters is responsible for the correctness of the cryptographic protocol. For example, if µ = 0, we need χ and q to be such that |b − ⟨a, s⟩| = |∑_{i∈S} e_i| < q/4; otherwise, the bit would be decrypted as 1. This condition can be obtained by requiring q to be significantly larger than the error distribution χ and m. The following set of parameters guarantees that the cryptosystem is both secure and correct [98]: q prime between n² and 2n², with n in the order of hundreds; m = (1 + ε)(n + 1) log q for an arbitrary ε > 0; and, finally, χ = D_{Z,αq} for α(n) = 1/(√n log² n).

Complexity and Implementation
The choice of parameters is the prime concern for the implementation of LWE. The secret and public key sizes are, respectively, O(n) and O(mn log q) = Õ(n²). Furthermore, it is possible to reduce the public key size by exploiting the fact that the set of vectors a_1, ..., a_m can be shared by all users and distributed as part of the encryption and decryption software, thus reducing the public key to b_1, ..., b_m.

NTRU Encryption Scheme
This cryptosystem was the first protocol based on polynomial rings, and in particular on ideal lattices [99]. As far as performance is concerned, both in terms of run time and key size, NTRU is remarkably efficient. Combined with its presumed protection from quantum attacks, these characteristics are the reasons why NTRU is commonly considered as an alternative to RSA and ECC. In the following (Algorithm 3), we describe the original cryptosystem as it was presented and, later on, briefly discuss subsequent works highlighting an evident trade-off between performance and security.

Algorithm 3 NTRU Encryption Scheme
• Parameters: n a power of 2, f(X) = X^n + 1, and q an odd, sufficiently large modulus; we define R = Z[X]/(f(X)) and R_q = R/qR;
• Private Key: s, g ∈ R short polynomials (i.e., with small coefficients), such that s is invertible mod q and mod 2;
• Public Key: h = 2g·s⁻¹ ∈ R_q;
• Encryption: choose a short e ∈ R such that e mod 2 encodes the desired bit, choose r ∈ R_q at random, and compute the cipher-text c = h·r + e ∈ R_q accordingly;
• Decryption: multiply the cipher-text by the secret key to get c·s = 2g·r + e·s ∈ R_q, lift it to R as 2g·r + e·s (possible if g, r, e, s are short enough compared to q), and reduce it mod 2, obtaining e·s mod 2 and, therefore, the initial bit.
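The ring arithmetic underlying Algorithm 3 is ordinary polynomial multiplication with a wrap-around rule. A minimal sketch of multiplication in R_q = Z_q[X]/(X^n + 1), where the reduction X^n ≡ −1 is what gives the lattice its ideal structure:

```python
def ring_mul(f, g, q):
    """Schoolbook multiplication in Z_q[X]/(X^n + 1): coefficients that
    spill past degree n - 1 wrap around with a sign flip (X^n = -1)."""
    n = len(f)
    out = [0] * n
    for i in range(n):
        for j in range(n):
            k = i + j
            if k < n:
                out[k] = (out[k] + f[i] * g[j]) % q
            else:
                out[k - n] = (out[k - n] - f[i] * g[j]) % q
    return out

# (1 + X) * X^3 = X^3 + X^4 = -1 + X^3 in Z_97[X]/(X^4 + 1):
assert ring_mul([1, 1, 0, 0], [0, 0, 0, 1], 97) == [96, 0, 0, 1]
```

Every product in the scheme (h·r, c·s, and so on) is an instance of this operation; production implementations replace the quadratic schoolbook loop with the number-theoretic transform.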

Security Evaluation
As already noted, no implementation of NTRU has come with a worst-case to average-case reduction or any more general security proof. Unfortunately, the provably secure variants are less efficient in real-time implementation than the original scheme for an acceptable degree of security, and this depicts the trade-off between security level and efficiency, which unfortunately appears to slow the rapid development of lattice-based cryptography.

Complexity and Implementation
Today, 'NTRUEncrypt' is a standard public-key cryptosystem (IEEE Std. 1363.1), successfully commercialized and available under a free open-source license initiative. We may note that both public and private keys require O(n log q) bits to be encoded to reach the required security level from a PQC perspective.

Lattice Reduction Algorithms
The strategies outlined in the previous section for attacking the LWE and NTRU problems are substantially based on the concept of lattice reduction: creating a sufficiently orthogonal basis from a given description of a lattice. Slide reduction [100] is the lattice reduction algorithm that achieves the best theoretical performance; experimentally, however, the best operating algorithm is considered to be BKZ (Block Korkine-Zolotarev) [101]. Given a basis of one of the lattices in vector space as described above, when running BKZ we need to select the block size required to retrieve the shortest vector (the block size is the dimension of the sub-lattices on which exact SVP is solved). This is done following the analysis introduced in Reference [102] for the LWE and NTRU primal attacks, and the analysis done in Reference [103] for the LWE dual attack.
Internally, BKZ uses an oracle that solves the Shortest Vector Problem on smaller lattices (an SVP oracle). Several SVP algorithms can be used to instantiate this oracle, with current generations of sieving [104] and enumeration [105] being the two most powerful. Because we consider security in post-quantum cryptography, we also need to consider quantum algorithms, which mostly means accounting for possible Grover [106] speed-ups for these algorithms as of writing [107].
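BKZ itself is intricate, but its block-size-2 special case is essentially the classical LLL algorithm, which conveys the core idea of lattice reduction. The following is a textbook float-arithmetic sketch, not a production implementation (real tools need careful numerics and far better performance):

```python
def gram_schmidt(B):
    """Gram-Schmidt orthogonalization; returns GS vectors and mu coefficients."""
    Bs, mu = [], []
    for i, b in enumerate(B):
        mu.append([sum(x * y for x, y in zip(b, Bs[j])) /
                   sum(x * x for x in Bs[j]) for j in range(i)])
        v = list(b)
        for j in range(i):
            v = [x - mu[i][j] * y for x, y in zip(v, Bs[j])]
        Bs.append(v)
    return Bs, mu

def lll(B, delta=0.75):
    """Textbook LLL reduction of an integer basis (float arithmetic sketch)."""
    B = [list(b) for b in B]
    k = 1
    while k < len(B):
        for j in range(k - 1, -1, -1):       # size-reduce b_k against b_j
            _, mu = gram_schmidt(B)
            r = round(mu[k][j])
            if r:
                B[k] = [x - r * y for x, y in zip(B[k], B[j])]
        Bs, mu = gram_schmidt(B)
        if (sum(x * x for x in Bs[k]) >=
                (delta - mu[k][k - 1] ** 2) * sum(x * x for x in Bs[k - 1])):
            k += 1                           # Lovasz condition holds: advance
        else:
            B[k], B[k - 1] = B[k - 1], B[k]  # swap and step back
            k = max(k - 1, 1)
    return B
```

On the skewed basis (201, 37), (1648, 297), for instance, this yields a much shorter basis of the same lattice, e.g. (1, 32) and (40, 1). BKZ generalizes the pairwise swap step to exact SVP solving on blocks of the chosen size.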

Lattice Cryptography Against Quantum Attacks
In this section, we summarize the fact that LBC algorithms are secure against the known quantum attacks, i.e., that SVP is NP-hard [108,109]. We shall show that the problems of approximating the shortest and closest vectors in a lattice to within a factor of √n lie in NP ∩ coNP [110]. Various results are available in the literature to assess the security standard of LBC post-quantum cryptographic primitives [110][111][112]. Consider factoring: the associated language is C = {(n, c) : n has a non-trivial factor ≤ c}. This language lies in NP ∩ coNP [113], so, if C were NP-complete, it would follow that NP = coNP, which is widely believed to be false [114]. Thus, although one might assume under some conditions that C is NP-complete, there is, to date, no proof in cryptographic theory that resolves the question of whether P = NP [115,116].

Extended Security Proof
Lattices have been investigated extensively in mathematics, and many different problems relate directly to lattices, such as integer programming [117], factoring polynomials with rational coefficients [118], integer relation finding [119], integer factoring, and Diophantine approximation [120,121]. Recent research on lattices has gained a lot of attention in the computer science community because lattice problems were shown by Ajtai [43] to possess a particularly desirable property for cryptography: worst-case to average-case reducibility. As discussed previously in Section 2, the two problems Shortest Vector Problem (SVP) and Closest Vector Problem (CVP) have been widely studied [122][123][124]. The most important parameter of interest here is the approximation factor β: given a basis v_1, ..., v_n of a lattice, find the shortest non-zero lattice point in the Euclidean norm in the case of SVP; given a basis v_1, ..., v_n of a lattice and a target vector v ∈ R^n, find the lattice point closest to v in the Euclidean norm for CVP. The problem GapSVP_β consists of distinguishing between instances of SVP in which the length of the shortest vector is at most 1 and those in which it is larger than β, where β can be a constant or a fixed function of the lattice dimension n; for GapCVP_β, given a basis and an extra vector v ∈ R^n, decide whether the distance of v from the lattice is at most 1 or larger than β. The unlikelihood of the NP-hardness of approximating SVP and CVP to within polynomial factors has also been evaluated in [125]. Here, we formulate the approximation problems associated with the shortest vector problem and the closest vector problem as promise problems (i.e., generalizations of a decision problem where the input is promised to belong to a particular subset of all possible inputs):

Definition 1. (approximate SVP):
The promise problem GapSVP_γ (where γ ≥ 1 may be a constant or a function of the dimension) is defined as follows. Instances are pairs (B, d), where B ∈ Z^(n×k) is a lattice basis and d is a positive number:
• (B, d) is a YES instance if λ_1(L(B)) ≤ d, i.e., ‖Bz‖ ≤ d for some z ∈ Z^k \ {0};
• (B, d) is a NO instance if λ_1(L(B)) > γd, i.e., ‖Bz‖ > γd for all z ∈ Z^k \ {0}.

Definition 2. (approximate CVP):
The promise problem GapCVP_γ (where γ ≥ 1 may be a constant or a function of the dimension) is defined as follows. Instances are triples (B, y, d), where B ∈ Z^(n×k) is a lattice basis, y ∈ Z^n is a vector, and d is a positive number:
• (B, y, d) is a YES instance if dist(y, L(B)) ≤ d, i.e., ‖Bz − y‖ ≤ d for some z ∈ Z^k;
• (B, y, d) is a NO instance if dist(y, L(B)) > γd, i.e., ‖Bz − y‖ > γd for all z ∈ Z^k.

Definition 3. (approximate CVP'):
The promise problem GapCVP'_γ (where γ ≥ 1 may be a constant or a function of the dimension) is defined as follows. Instances are triples (B, y, d), where B ∈ Z^(n×k) is a full-rank matrix, y ∈ Z^n is a vector, and d is a positive number:
• (B, y, d) is a YES instance if ‖Bz − y‖ ≤ d for some z ∈ Z^k;
• (B, y, d) is a NO instance if ‖Bz − wy‖ > γd for all z ∈ Z^k and all w ∈ Z \ {0}.
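The objectives in the definitions above can be illustrated in a tiny dimension. The following sketch (a toy 2-D basis of our own choosing; real instances are high-dimensional and intractable by enumeration) solves SVP and CVP by brute force, just to make the Euclidean-norm formulations concrete:

```python
# Illustrative brute-force solver for SVP and CVP on a tiny 2-D lattice.
# Real instances are high-dimensional; enumeration only works for toy cases.

import itertools
import math

def lattice_points(basis, bound):
    """Enumerate integer combinations z1*b1 + z2*b2 with |zi| <= bound."""
    b1, b2 = basis
    for z1, z2 in itertools.product(range(-bound, bound + 1), repeat=2):
        yield (z1 * b1[0] + z2 * b2[0], z1 * b1[1] + z2 * b2[1])

def svp(basis, bound=5):
    """Shortest non-zero lattice vector (Euclidean norm), by enumeration."""
    pts = [p for p in lattice_points(basis, bound) if p != (0, 0)]
    return min(pts, key=lambda p: math.hypot(*p))

def cvp(basis, target, bound=5):
    """Lattice point closest to the target vector, by enumeration."""
    tx, ty = target
    return min(lattice_points(basis, bound),
               key=lambda p: math.hypot(p[0] - tx, p[1] - ty))

basis = [(201, 37), (1011, 186)]   # a skewed ("bad") basis of our choosing
print(svp(basis))                  # (6, 1): a short vector hidden in the basis
print(cvp(basis, (100, 100)))      # (201, 37): nearest lattice point
```

Note how the short vector (6, 1) is not visible in the basis itself; recovering it from a skewed basis is exactly what makes SVP hard in high dimension.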
Therefore, it can be shown [125] that GapSVP_γ, GapCVP_γ, and GapCVP'_γ are NP-hard for any constant factor γ ≥ 1. For LW-LBC implementations of cryptographic primitives, it is well documented that the security level relies on the hardness of the above-mentioned lattice problems [83,126]. The situation parallels cryptographic constructions based on factoring, where the assumption is that it is hard to factor numbers chosen from a certain distribution; in LBC, the corresponding average-case assumptions are backed by worst-case lattice hardness, which is why these schemes are considered quantum-resistant.
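The promise (gap) structure itself can also be made concrete. The following sketch (our own toy example, reusing brute-force enumeration that only works in dimension 2) classifies GapSVP_γ instances as YES, NO, or outside the promise:

```python
# Sketch of the GapSVP promise problem in a toy dimension. The brute-force
# first-minimum helper is illustrative only; real instances are intractable.

import itertools
import math

def lambda1(basis, bound=6):
    """First minimum of a 2-D lattice, by brute-force enumeration."""
    best = math.inf
    b1, b2 = basis
    for z1, z2 in itertools.product(range(-bound, bound + 1), repeat=2):
        if (z1, z2) == (0, 0):
            continue
        best = min(best, math.hypot(z1 * b1[0] + z2 * b2[0],
                                    z1 * b1[1] + z2 * b2[1]))
    return best

def gap_svp(basis, d, gamma):
    """Classify a GapSVP_gamma instance (B, d)."""
    lam = lambda1(basis)
    if lam <= d:
        return "YES"
    if lam > gamma * d:
        return "NO"
    return "outside promise"   # the gap is what makes this a promise problem

basis = [(201, 37), (1011, 186)]     # lambda_1 = ||(6, 1)|| ≈ 6.08
print(gap_svp(basis, 7, 2))          # YES:  6.08 <= 7
print(gap_svp(basis, 1, 2))          # NO:   6.08 > 2
print(gap_svp(basis, 5, 2))          # outside promise: 5 < 6.08 <= 10
```

An algorithm solving GapSVP_γ may answer arbitrarily on inputs outside the promise, which is precisely what the hardness results in [125] account for.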

Lightweight Lattice-Based Cryptography for IoT Devices
The emergence of new edge computing platforms, such as cloud computing, software-defined networks, and the Internet-of-Things (IoT), calls for the adoption of an increasing number of security frameworks, which in turn require a variety of cryptographic primitives; however, security is just one vector in the IoT world [127]. It is also necessary to implement secure frameworks that consume less on-board processing, memory, and power [128]. This presents enormous difficulties in designing and executing new cryptographic primitives in a single embodiment, as divergent priorities and constraints apply across computing platforms. This motivates the development of programmable IoT hardware capable of effectively executing not only individual cryptographic algorithms [129] but complete protocols, with the subsequent task of design agility, e.g., developing computing devices that approach the performance of Application-Specific Integrated Circuits (ASICs) while keeping some level of programmability [130,131].
Recently, many researchers have been investigating Lightweight Lattice-Based Cryptography (LW-LBC) [128,132], where performance is measured and benchmarked in terms of low power footprint, small area, lightweight bandwidth requirements, and good throughput. The main characteristics of post-quantum LBC that make it well suited for the IoT world are: (a) these schemes offer security proofs based on NP-hard problems with worst-case to average-case reductions; (b) the LBC implementations are noteworthy for their efficiency, in addition to being quantum-age stable, largely due to their inherent linear-algebra-based matrix/vector operations on integers; and (c) beyond the simple classical cryptographic primitives required in a quantum era (encryption, signatures, key exchange), LBC constructions offer expanded features, such as identity-based encryption (IBE) [133], attribute-based encryption (ABE) [11], and fully homomorphic encryption (FHE) [134]. Figure 4 depicts the communication bandwidth, calculated as the data bytes of various LBC algorithms with sk, pk, and signature variants, as comprehensively analyzed in Reference [128]; the number at the end of each algorithm name is the security level achieved according to the NIST standards. These security levels are defined as: (a) Level 1: at least as hard to break as AES-128 (exhaustive key search); (b) Level 2: at least as hard to break as SHA-256 (collision search); (c) Level 3: at least as hard to break as AES-192 (exhaustive key search); (d) Level 4: at least as hard to break as SHA-384 (collision search); and (e) Level 5: at least as hard to break as AES-256 (exhaustive key search). It is also worth mentioning that this security matrix is highly dependent on the hardware/computational resources of the IoT-Edge nodes in the network.
The analysis shows that the Dilithium algorithms consume high bandwidth without achieving a high security level, whereas the Falcon algorithms consume less bandwidth while achieving a high security level; the latter are therefore well suited for lightweight implementation of LBC in IoT devices. Figure 5 depicts the communication bandwidths of LBC algorithms implemented with public-key encryption (PKE) or key encapsulation mechanism (KEM) schemes [128,135,136]. The results show that the Saber and ThreeBears variants both consume less bandwidth at diverse NIST security levels and can be considered suitable candidates for lightweight implementation of LBC in IoT networks.

Hardware Implementation of Lightweight Lattice-Based Cryptography
In this section, we discuss the hardware implementation of LW-LBC on different computational platforms [137]. Many lattice systems originally require large matrices to be stored over integer rings and are very inefficient in both run-time and storage. Replacing matrices with polynomials over ideals of integer rings reduces both; hence, in very efficient constructions, general lattices are replaced by ideal lattices [136,137]. For IoT devices (based on communication technologies such as IEEE 802.11ah, 802.15.4, low-power Wi-Fi, BLE, LoRaWAN, Sigfox, NB-IoT, etc.) that inherently have reduced computational resources, limited on-board memory, and small form-factor battery banks (on hardware platforms such as the Raspberry Pi, BeagleBone, etc.), it is recommended that, instead of storing huge matrices of space O(n²), where n is larger than 128, just O(n log n) elements be stored. In addition, the Fast Fourier Transform can be used to multiply elements of ideal lattices in time O(n log n) on a serial architecture, or O(log n) on a parallel architecture, rather than the O(n²) schoolbook computation. This way, the available hardware resources can be utilized to implement LW-LBC cost-effectively in an IoT network.
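The storage saving claimed above can be checked directly: in the ring Z_q[x]/(x^n + 1) used by Ring-LWE, the n×n multiplication matrix of a ring element is negacyclic, so its n coefficients determine it completely. A toy sketch (the parameters n = 8, q = 257 and the sample polynomials are ours, far below deployed sizes):

```python
# Sketch (toy parameters of our choosing): in Z_q[x]/(x^n + 1) the n x n
# matrix of multiplication by a ring element is negacyclic, so storing the
# n coefficients suffices. We check the O(n) representation against the
# O(n^2) matrix representation.

n, q = 8, 257   # illustrative; deployed Ring-LWE schemes use n >= 256

def negacyclic_matrix(a):
    """Full n x n matrix of multiplication by a(x) mod (x^n + 1, q)."""
    M = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            k, wrap = (i + j) % n, (i + j) // n   # x^n wraps to -1
            M[k][j] = (M[k][j] + (-1) ** wrap * a[i]) % q
    return M

def negacyclic_mul(a, b):
    """Same product from just n coefficients (schoolbook convolution)."""
    c = [0] * n
    for i in range(n):
        for j in range(n):
            k, wrap = (i + j) % n, (i + j) // n
            c[k] = (c[k] + (-1) ** wrap * a[i] * b[j]) % q
    return c

a = [3, 1, 4, 1, 5, 9, 2, 6]
b = [2, 7, 1, 8, 2, 8, 1, 8]
M = negacyclic_matrix(a)
mv = [sum(M[i][j] * b[j] for j in range(n)) % q for i in range(n)]
assert mv == negacyclic_mul(a, b)   # n^2 storage vs n storage, same result
```

The matrix route stores n² entries while the coefficient route stores n, which is the size reduction by a factor of n exploited by Ring-LWE.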
The fundamental modules of a lattice-based cryptosystem that guide the actual hardware implementation are the multipliers and samplers. The primary performance bottlenecks are polynomial multiplication for ideal lattices and matrix multiplication for regular lattices, whereas discrete Gaussian sampling is used to draw the noise that covers the hidden information. The literature offers different algorithms for the sampler and multiplier, allowing researchers to match a particular end-user application [138]. For lightweight arithmetic implementations of LBC, matrix multiplication algorithms are adopted for standard LWE schemes, while the number theoretic transform (NTT) is an efficient alternative for polynomial multiplication in Ring-LWE [139]. On the other hand, the dynamics of large-scale IoT hardware deployment are different. Standard LWE-based systems display a comparatively high memory footprint when deployed, due to the large key sizes (hundreds of kilobytes per public key), which makes rapid deployment impossible [140]. The adoption of structured ring architectures, such as Ring-LWE, provides a crucial size reduction by a factor of n compared to standard LWE, rendering Ring-LWE an outstanding candidate for resource-restricted IoT devices.
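The NTT mentioned above can be sketched end to end for the negacyclic ring. The parameters below (n = 8, q = 257, ψ = 249) are toy values of our choosing with q ≡ 1 (mod 2n); this is an illustrative sketch, not constant-time and not taken from any specific scheme:

```python
# Toy NTT-based multiplication in Z_q[x]/(x^n + 1), the Ring-LWE ring.
# Illustrative parameters: n = 8, q = 257 (q ≡ 1 mod 2n), psi = 249 a
# primitive 2n-th root of unity mod q. A sketch only, not hardened code.

n, q = 8, 257
psi = 249                   # 3^16 mod 257; satisfies psi^8 ≡ -1 (mod q)
omega = psi * psi % q       # primitive n-th root of unity

def inv(x):
    """Modular inverse via Fermat's little theorem (q is prime)."""
    return pow(x, q - 2, q)

def ntt(a, w):
    """Recursive Cooley-Tukey transform with root w: O(n log n) butterflies."""
    m = len(a)
    if m == 1:
        return a[:]
    even, odd = ntt(a[0::2], w * w % q), ntt(a[1::2], w * w % q)
    out, t = [0] * m, 1
    for i in range(m // 2):
        out[i] = (even[i] + t * odd[i]) % q
        out[i + m // 2] = (even[i] - t * odd[i]) % q
        t = t * w % q
    return out

def mul_ntt(a, b):
    """a * b mod (x^n + 1, q) via the psi-twisted (negacyclic) NTT."""
    at = [a[i] * pow(psi, i, q) % q for i in range(n)]   # twist by psi^i
    bt = [b[i] * pow(psi, i, q) % q for i in range(n)]
    ct = [x * y % q for x, y in zip(ntt(at, omega), ntt(bt, omega))]
    c = ntt(ct, inv(omega))                              # inverse transform
    return [c[i] * inv(n) % q * inv(pow(psi, i, q)) % q for i in range(n)]

# Multiplying by x wraps the top coefficient around with a sign flip (x^8 ≡ -1):
print(mul_ntt([0, 1, 0, 0, 0, 0, 0, 0], [1, 2, 3, 4, 5, 6, 7, 8]))
# → [249, 1, 2, 3, 4, 5, 6, 7]
```

The ψ-twist folds the reduction modulo x^n + 1 into the transform itself, which is why hardware implementations favor this form: one forward transform per operand, a pointwise product, and one inverse transform, all in O(n log n).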
As discussed in more depth in the coming paragraphs, high-performance Intel/AMD processors, famously equipped with Advanced Vector Extensions (AVX), and ARM/AVR micro-controllers are common software execution platforms [140]. Recently, practical software implementations of standard-lattice encryption schemes and key exchanges have been reported [141]. Other hardware platforms, such as field-programmable gate arrays (FPGAs) and application-specific integrated circuits (ASICs), have also been used to implement LBC. FPGAs provide flexibility and customization but not agility [142], whereas ASICs are less power hungry and more compact, at the cost of design flexibility.
In this section, we summarize the practical hardware implementation of LBC by comparing memory usage (bytes), computational time (ms), and clock cycle counts on an ARM Cortex-M platform at 168 MHz [128]. Table 3 depicts the hardware complexity of implementing LBC-based KEMs [143]. The statistics show that, for a limited memory footprint, Saber stands out both in its resource-constrained profile and in its throughput performance, while it also achieves level-5 security according to the NIST guidelines. Therefore, Saber is recommended as a lightweight LBC algorithm well suited for post-quantum IoT networks. Table 4 depicts the hardware complexity of implementing LBC via signature schemes [144,145]. The data analyzed in Reference [128] show that signature-based schemes are computationally exhaustive compared to KEM schemes. Nevertheless, Dilithium performs well compared to Falcon and qTESLA. We can conclude that, for signature implementation, Dilithium can be used in post-quantum IoT networks where level-5 security is not the prime focus but the acceptable range of security is between levels 1 and 3.
An ideal post-quantum cryptosystem supports primitives such as pseudorandom generators, pseudorandom functions, and digital signatures, and enables the identification of the best parameters for each. As discussed in this section, the performance of diverse PQ algorithms depends on the acceptable security level, and compromising on the security level can expose IoT networks to side-channel attacks. Computational cycles, time, and stack usage (bytes) are the key parameters researchers have to take into account while designing dense IoT networks. In lattice schemes, the storage (memory) problem arises when immense matrix operations are performed over an integer ring; it is, therefore, appropriate to replace matrix operations with polynomial multiplication of elements using the Fast Fourier Transform (FFT). Although LW-LBC is much faster than classical LBC algorithms, these algorithms still need extensive research in machine-to-machine (M2M) and industrial IoT environments with dense sensor deployments in operational technology.

Conclusions
In this survey, we discussed the practicality of post-quantum cryptography in resource-constrained devices, such as the Internet-of-Things. We compared the performance of diversified post-quantum key exchange schemes by analyzing memory usage, computational time, and clock cycle counts on hardware platforms. The potential arrival of quantum computation pushes for the realization and implementation of quantum-resistant cryptographic algorithms, among which lattice-based cryptography (LBC) seems a very promising alternative for IoT networks. Versatile processing platforms, i.e., FPGAs, ASICs, and the Raspberry Pi, enable low-power edge devices to run today's most demanding post-quantum encryption systems. For lightweight implementation of LBC, researchers are adopting advanced hardware designs based on the number theoretic transform (NTT) for post-quantum realization. The updated NTT separates data from vectors and allocates portions through dedicated memory with smaller footprints, ensuring reduced energy consumption while maintaining the desired throughput and level of security. The scalability and flexibility that can be used to optimize efficiency and security in lightweight implementations make lattice cryptography the leading candidate for post-quantum IoT security.
Funding: This research received no external funding.

Institutional Review Board Statement: Not Applicable.
Informed Consent Statement: Not Applicable.

Data Availability Statement: The data presented in this study are available on request from the author.