Article

An Adaptive Cuckoo Search-Based Optimization Model for Addressing Cyber-Physical Security Problems

1 Department of Computer Science, Faculty of Computers and Informatics, Zagazig University, Zagazig 44519, Egypt
2 Prince Mohammad Bin Fahd University, Al Khobar 31952, Saudi Arabia
3 School of Engineering & Information Technology, UNSW, Canberra, ACT 2620, Australia
* Authors to whom correspondence should be addressed.
Mathematics 2021, 9(10), 1140; https://doi.org/10.3390/math9101140
Submission received: 10 March 2021 / Revised: 5 May 2021 / Accepted: 7 May 2021 / Published: 18 May 2021
(This article belongs to the Section Mathematics and Computer Science)

Abstract

One of the key challenges in cyber-physical systems (CPS) is the dynamic fitting of data sources under multivariate or mixture distribution models to determine abnormalities. The equations of these models have been statistically characterized as nonlinear and non-Gaussian, where the data have high variations between normal and suspicious distributions. To address the nonlinear equations of these distributions, a cuckoo search algorithm is employed. In this paper, the cuckoo search algorithm is effectively improved with a novel strategy, known as a convergence speed strategy, which accelerates convergence toward the optimal solution and achieves better outcomes in a small number of iterations when solving systems of nonlinear equations. The proposed algorithm, named the improved cuckoo search algorithm (ICSA), accelerates the convergence speed by improving the fitness values of function evaluations compared to the existing algorithms. To assess the efficacy of ICSA, 34 common nonlinear equations that fit the nature of cybersecurity models are adopted to show whether ICSA can reach better outcomes with high convergence speed. ICSA has been compared with several well-known, well-established optimization algorithms, such as the slime mould optimizer, salp swarm, cuckoo search, marine predators, bat, and flower pollination algorithms. Experimental outcomes have revealed that ICSA is superior to the others in terms of convergence speed and final accuracy, which makes it a promising alternative to the existing algorithms.

1. Introduction

With cyber-physical systems (CPS) becoming the norm, cyber defense systems such as intrusion detection and threat intelligence, which deal with data sources under the constraints of nonnormality and nonlinearity, should be designed to handle these constraints and produce accurate outcomes [1,2]. These models have been developed using nonlinear equation systems (NESs) [3], which need to be solved accurately in a reasonable time [4]. Several numerical methods have therefore been proposed for solving NESs, including Newton-type methods [5] and iterative and recursive methods [6]. However, most of those methods cannot estimate the roots of NESs with a complex nature, owing to their sensitivity to the initial guess of the solutions, which significantly affects the obtained outcomes and the stability of those methods [4]. A promising way to overcome those drawbacks and to estimate the optimal roots is to use evolutionary and meta-heuristic algorithms, which have gained significant attention over the last decades due to their superiority in terms of local-minima avoidance, convergence speed, and reaching the optimal solution in a reasonable time.
Evolutionary algorithms (EAs) and swarm algorithms (SAs) have achieved notable success on real-world optimization problems [7,8,9,10,11,12,13,14,15,16,17,18], particularly convex and discontinuous nonlinear optimization problems [11,19,20]. Therefore, they have been widely used in the literature for solving NESs. Unfortunately, the existing algorithms still suffer from entrapment in local minima and slow convergence to the optimal root. This causes two problems when solving NESs: (1) a large number of function evaluations may be consumed before reaching the optimal root, and (2) the algorithms may be unable to find the optimal root even with an increasing number of function evaluations, due to their weak ability to explore as much of the search space as possible while avoiding getting stuck in local minima. In cybersecurity, the data distributions of intrusion detection and threat models often demand nonlinear and non-Gaussian systems that can discriminate small variations between normal and suspicious behaviors [21]. In this paper, the cuckoo search algorithm (CSA) is improved in an effective way that helps it avoid those two problems while solving NESs; the resulting algorithm is named the improved CSA (ICSA). ICSA was extensively validated using 34 well-known NES cases and compared with some recently published, well-established optimization algorithms, namely the slime mould algorithm (SMA, 2020) [22], marine predators algorithm (MPA, 2020) [23], bat algorithm (BA, 2012) [24], salp swarm algorithm (SSA, 2017) [25], standard cuckoo search algorithm (CSA, 2009) [26], and flower pollination algorithm (FPA, 2012) [27], under various statistical analyses that can flexibly fit the nonlinear distributions of CPS-driven data sources and efficiently enhance the discovery of anomalous events. The experiments show that our improved algorithm achieves significant performance on most test cases with respect to the convergence speed and final accuracy in comparison to the abovementioned algorithms. The main contributions of this research are as follows:
(a)
Improving the classical CSA using an effective strategy called the convergence improvement strategy (CIS) to produce a new variant able to accurately tackle NESs. This variant was named ICSA.
(b)
The experiments conducted on 34 well-known NES cases to assess the performance of this variant, together with comparisons against six well-established optimization algorithms, show the efficacy of this variant in terms of convergence speed and final accuracy for most test cases.
The remainder of this paper is organized as follows: Section 2 presents the literature review, Section 3 overviews the standard CSA, Section 4 extensively describes our proposed work, and Section 5 shows our experimental outcomes and some discussions. Finally, Section 6 shows some conclusions devised from our proposed work and discusses our future work.

2. Literature Review

This section is divided into two parts: the first defines the problem formulation of the NES, and the second reviews the EAs and SAs proposed in the literature to tackle NESs.

2.1. Problem Description

Generally, nonlinear equation systems are mathematically formulated as follows:
$$S(x) = \begin{cases} f_1(x_1, x_2, x_3, \dots, x_d) = 0 \\ f_2(x_1, x_2, x_3, \dots, x_d) = 0 \\ f_3(x_1, x_2, x_3, \dots, x_d) = 0 \\ \vdots \\ f_n(x_1, x_2, x_3, \dots, x_d) = 0 \end{cases} \quad (1)$$
where $d$ denotes the number of decision variables of each equation, $n$ refers to the number of equations, and $x$ is a $d$-dimensional vector representing a candidate solution to the NES, where each dimension of this solution must lie within its search boundary: lower bound (Lb) and upper bound (Ub).
As formulated in Equation (1), $x$ denotes the decision variables, i.e., the attributes/features in a cyber-physical problem, specifically machine learning-based intrusion detection. When these attributes were statistically evaluated using the Kolmogorov–Smirnov (K–S) test, the outcomes revealed that they follow nonlinear and non-Gaussian distributions. This indicates that the models must employ nonlinear equations to perfectly fit the small variations between normal and anomalous behaviors [1].
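To illustrate this kind of check, the short sketch below (ours, not from the original study; the feature column is synthetic) applies SciPy's one-sample Kolmogorov–Smirnov test against a normal distribution fitted to the sample:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for a feature column of an intrusion detection dataset.
feature = rng.lognormal(mean=0.0, sigma=1.0, size=1000)  # skewed, non-Gaussian

# One-sample K-S test against a normal distribution fitted to the sample.
mu, sigma = feature.mean(), feature.std(ddof=1)
stat, p_value = stats.kstest(feature, "norm", args=(mu, sigma))

# A small p-value rejects normality, motivating a nonlinear/non-Gaussian model.
print(f"K-S statistic = {stat:.4f}, p-value = {p_value:.4g}")
```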
To solve for the nonlinear attributes/decision variables of NESs in a machine learning-based intrusion detection problem, note that Equation (1) comprises $n$ equations, while optimization algorithms usually minimize only a single objective. Therefore, Equation (1), which defines the NES, was transformed into Equation (2) to become a minimization problem that can be solved using an optimization algorithm:
$$f(x) = \sum_{i=1}^{n} f_i^2(x) \quad (2)$$
This equation is considered the objective function that needs to be minimized using optimization techniques to find the optimal roots and clear boundaries between the nonlinear attributes of normal and suspicious events.
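As a concrete illustration, the sketch below (ours) wraps a two-equation NES, here an F1-style system from Table 1, into the scalar objective of Equation (2):

```python
import numpy as np

def nes_objective(residuals):
    """Turn a NES, given as a residual function, into the scalar
    objective of Equation (2): f(x) = sum_i f_i(x)^2."""
    def objective(x):
        f = np.asarray(residuals(x), dtype=float)
        return float(np.sum(f ** 2))
    return objective

# Illustrative two-equation system in the style of F1 (Table 1):
#   f1(x) = x1 - sin(5*pi*x2),  f2(x) = x1 - x2
def example_system(x):
    return [x[0] - np.sin(5 * np.pi * x[1]), x[0] - x[1]]

obj = nes_objective(example_system)
print(obj(np.array([0.0, 0.0])))  # 0.0 at an exact root
```

Any root of the original system is a global minimizer of this objective with value zero, which is why the transformed problem can be handed to a general-purpose optimizer.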

2.2. Swarm and Evolutionary Algorithms

In [28], the social emotion optimization algorithm (SEOA) was integrated with the metropolis rule, as an attempt to escape local minima, and proposed for solving NESs. This hybrid algorithm, abbreviated as MSEOA, was compared with the particle swarm optimization (PSO) algorithm and the standard SEOA on four nonlinear equations to determine the best solver; in the experiments, MSEOA was found to be the most effective. Further, Wu Z. and L. Kang [29] proposed a parallel elite-subspace evolutionary algorithm (PESEA) to solve NESs in a reasonable time. PESEA was validated using five nonlinear equations to assess its speed and accuracy in estimating their optimal roots; based on the conducted experiments, PESEA is both fast and accurate.
To solve NESs, the authors of [30] suggested a hybrid approach that combines the capability of chaos maps to widely explore the search space with a quasi-Newton method known for its high convergence speed. The authors of [31] integrated an evolutionary algorithm with additional strategies: they combined the k-means clustering method with niching to guide the optimization process toward the multiple roots within the search space while avoiding getting stuck in local minima, and they proposed using various crowding factors to decrease the replacement error when finding the multiple roots of NESs. This algorithm was called one-step k-means clustering-based differential evolution (KSDE). Following its development, 30 problems were used to validate its performance, and the algorithm was compared with some state-of-the-art methods to show its superiority.
Rizk-Allah [32] proposed a new approach, namely Q-SCA, to solve NESs based on modifying the sine-cosine algorithm (SCA). In Q-SCA, the SCA's search ability is dynamically adjusted to search around the current location or the best-so-far solution, improving its exploitation capability as an attempt to accelerate the local convergence rate. Q-SCA also uses a quantum local search (QLS) to improve the obtained solutions and balance the algorithm's exploration and exploitation capabilities. This approach was investigated on 12 NESs and 2 electrical applications and compared with several algorithms to show its stability and accuracy in achieving better outcomes; the experimental outcomes showed its superiority over the standard algorithm. The authors of [33,34,35] adapted various genetic algorithms (GAs) to solve NESs.
The grasshopper optimization algorithm (GOA) [36] has been hybridized with the GA to produce a new hybrid algorithm, known as the hybrid GOA with GA, for solving NESs; this hybrid combines the merits of both GA and GOA to escape from local minima and accelerate the convergence speed. Moreover, the grey wolf optimizer (GWO) has been integrated with differential evolution to tackle NESs; this algorithm was named GWO-DE. In [37], an improved differential evolution integrated with a restart strategy, namely DE-R, was proposed for NESs. DE-R uses a new mutation operator and a restart technique to promote the exploration ability and avoid getting stuck in local minima, and it was compared with some recently developed algorithms over a set of nonlinear equation systems and real-world problems to show its effectiveness.
Ultimately, several continuous evolutionary and swarm intelligence algorithms have been proposed that might be applied to tackle this problem in the future in the hope of finding better outcomes; some of those algorithms are natural evolution strategies [38], particle swarm optimization in the estimation of distribution algorithms (EDAs) framework [39], the EDAs [40], and the covariance matrix adaptation evolution strategy [41].

3. Standard Algorithm: Cuckoo Search Algorithm

Xin-She Yang [26] proposed a new metaheuristic algorithm, namely the cuckoo search algorithm (CSA), for solving optimization problems. Recently, CSA was employed for selecting the most relevant nonlinear attributes and discovering suspicious observations [42]. This research is motivated to develop a new variant of CSA that can efficiently deal with nonlinear functions and effectively find clear bounds between legitimate and suspicious behaviors when implementing classification methods. CSA is inspired by the obligate brood parasitism of some cuckoo birds, which lay their eggs in the nests of other host birds. Sometimes, when the host birds discover that the eggs in their nests do not belong to them, the foreign eggs are either flung out of the nest or the nest is abandoned altogether. In general, the CSA is based on three rules:
(1)
Each cuckoo lays one egg at a time and puts its egg in a randomly chosen nest;
(2)
The best nests with eggs having high quality will be used in the next generation;
(3)
The number of available host nests is fixed, and a host bird can discover a foreign egg with a probability $p_a$ that varies between 0 and 1.
CSA balances a global random walk and a local random walk to promote its searchability and reach better outcomes. Mathematically, the global random walk is formulated as
$$x_i^{t+1} = x_i^t + \alpha \otimes L(s, \lambda) \quad (3)$$
where $t$ denotes the current iteration, $x_i^t$ is the current position of the $i$th cuckoo, $x_i^{t+1}$ indicates its next position, $L(s, \lambda)$ is the Lévy distribution used to determine the step size of the random walk, $s$ is the step size, and $\alpha$ is a positive scaling factor. The local random walk is defined as follows:
$$x_i^{t+1} = x_i^t + \alpha s \otimes H(p_a - \varepsilon) \otimes (x_j^t - x_k^t) \quad (4)$$
where $\otimes$ indicates the entry-wise multiplication operator, $H$ is a Heaviside function, $\varepsilon$ is a random number generated from a normal distribution, and $x_j^t$ and $x_k^t$ are two positions chosen randomly from the current population. $t_{max}$ indicates the maximum number of iterations. The steps of CSA are shown in Algorithm 1, and a brief sketch of the two walks follows the algorithm.
Algorithm 1 The steps of CSA
  • Create an initial population of N solutions.
  • Initialize α , p a , and t = 0 ;
  • Evaluate the fitness for each solution and determine the best-so-far solution x * .
  • while (t < t m a x )
  • Create a new population using Equation (3) and insert better ones into the current population.
  • t = t + 1;
  • Create a new population using Equation (4) and insert the better ones into the current population.
  • t = t + 1;
  • end while
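A minimal NumPy sketch of the two walks follows (our reading of Equations (3) and (4); Mantegna's method is assumed for drawing the Lévy-distributed steps, and the parameter values are illustrative):

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def levy_step(d, lam=1.5):
    # Mantegna's method (assumed here) for a Levy-distributed step.
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
             / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = rng.normal(0.0, sigma, d)
    v = rng.normal(0.0, 1.0, d)
    return u / np.abs(v) ** (1 / lam)

def global_walk(X, alpha=0.01):
    """Equation (3): move every nest by an alpha-scaled Levy flight."""
    steps = np.array([levy_step(X.shape[1]) for _ in range(len(X))])
    return X + alpha * steps

def local_walk(X, pa=0.25):
    """Equation (4), with alpha taken as 1 for brevity: step toward the
    difference of two random nests, gated entry-wise by H(pa - eps)."""
    n, d = X.shape
    H = (pa - rng.random((n, d)) > 0).astype(float)   # Heaviside gate
    s = rng.random((n, d))                            # step sizes
    return X + s * H * (X[rng.permutation(n)] - X[rng.permutation(n)])
```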

4. Proposed Algorithm

In this section, the steps of the proposed algorithm, known as the improved cuckoo search algorithm (ICSA), are described in detail; those steps are initialization, evaluation, and the ICSA update loop.

4.1. Initialization

At the outset of the optimization process, a group of N solutions, each with d dimensions, is created and randomly initialized within the search space of the problem according to the following equation:
$$\forall i \in \{1, 2, \dots, N\}: \quad x_i = L + r \otimes (U - L) \quad (5)$$
where $U$ and $L$ are two vectors containing the upper and lower bounds of the various problem dimensions, and $r$ is a vector of $d$ elements assigned randomly between 0 and 1. After completing the initialization step, the initial solutions are evaluated using Equation (2) to determine the quality of each one, and the one with the highest quality is extracted to help later in improving the quality of the new populations.
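For instance, Equation (5) translates into a few lines of NumPy (the bounds below are placeholders):

```python
import numpy as np

def initialize_population(N, lower, upper, rng=None):
    """Equation (5): x_i = L + r * (U - L), with r ~ U(0, 1) per dimension."""
    rng = rng if rng is not None else np.random.default_rng()
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    r = rng.random((N, lower.size))
    return lower + r * (upper - lower)

# Placeholder bounds for a 2-dimensional problem.
X = initialize_population(N=30, lower=[-1.0, -1.0], upper=[1.0, 1.0])
```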

4.2. Convergence Improvement Strategy (CIS)

A new strategy, called the convergence improvement strategy (CIS), is proposed for improving the performance of the meta-heuristic algorithm to achieve better convergence, improve final accuracy, and enhance the ability to select the most significant attributes for CPS problems. This strategy is two-fold. The first aspect searches around the best-so-far solution for a better solution using Equation (6), saving optimization time when the near-optimal solution lies near this best-so-far solution; however, the best-so-far solution may also be a trap that drifts the algorithm into local minima, reducing the possibility of reaching better outcomes. Therefore, the second aspect, formulated mathematically in Equation (7), is used to avoid falling into local minima by multiplying the current position by a vector $v_c$ generated randomly from the uniform distribution with lower endpoint $-1 \times r_1$ and upper endpoint $r_1$, where $r_1$ is a value created randomly between 0 and 1.
$$x_i^{t+1} = x^* + \alpha \otimes (x_i^t - x^*) \quad (6)$$
$$x_i^{t+1} = v_c \otimes x_i^t \quad (7)$$
The choice between Equations (6) and (7) is determined by a probability, namely γ, which is picked empirically by the researcher in light of the observed outcomes; in our experiments, this probability was set to 0.1 after extensive experimentation.
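The following minimal sketch (ours, not the authors' implementation) illustrates how the CIS operator can be realized; α and γ mirror the values reported in our experiments:

```python
import numpy as np

rng = np.random.default_rng(2)

def cis_update(x, x_best, alpha=0.5, gamma=0.1):
    """Convergence improvement strategy (CIS), as we read it.

    With probability gamma, exploit around the best-so-far solution
    (Equation (6)); otherwise, perturb the current position with a
    random vector v_c drawn uniformly from [-r1, r1] (Equation (7))."""
    if rng.random() < gamma:
        return x_best + alpha * (x - x_best)   # Equation (6)
    r1 = rng.random()
    v_c = rng.uniform(-r1, r1, x.size)         # Equation (7)
    return v_c * x
```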

4.3. Improved Cuckoo Search Algorithm (ICSA)

To improve the global random walk of CSA, the CIS is called after executing the global random walk, with a probability pr, to accelerate the convergence speed toward the best-so-far solution using the first aspect and to avoid getting stuck in local minima using the second aspect. Algorithm 2 elaborates the steps of ICSA after integrating the CIS, and a compact sketch follows the algorithm. Before starting the optimization process, N solutions are randomly distributed within the search space to cover it as much as possible, and the main parameters of ICSA are initialized. Those solutions are then updated by the global random walk integrated with the CIS, with the probability pr set to 0.5 as explained in the experiments section, to promote searchability and reach better outcomes, as described in Lines 6–16 of Algorithm 2. In Line 19, the current solution is updated using the local random walk as an attempt to avoid getting stuck in local minima. This optimization process runs until the termination condition is satisfied (reaching the maximum iteration t_max).
Algorithm 2 The steps of ICSA
  • Create an initial population of N solutions.
  • Initialize α , p a , γ, and t = 0 ;
  • Evaluate the fitness for each solution and determine the best-so-far solution x * .
  • while (t < t m a x )
  • nX: Create a new population using Equation (3)
  • For (i = 1: N)
  • r : create a random number between 0 and 1.
  • if (r > pr)
  • r 1 : create a random number between 0 and 1.
  • if ( r 1 < γ )
  • Update the current solution n X i using Equation (6)
  • Else
  • Update the current solution n X i using Equation (7)
  • End if
  • End if
  • End for
  • Evaluate each solution in the new population and insert better ones into the current population.
  • t = t + 1;
  • Create a new population using Equation (4) and insert the better ones into the current population.
  • t = t + 1;
  • end while
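Putting the pieces together, the sketch below is one self-contained reading of Algorithm 2 (illustrative only, not the authors' MATLAB implementation); `objective` is any Equation (2)-style function, and greedy replacement keeps the better of the old and new nests:

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def levy(d, lam=1.5):
    # Mantegna's method (assumed) for Levy-distributed steps.
    sigma = (math.gamma(1 + lam) * math.sin(math.pi * lam / 2)
             / (math.gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    return rng.normal(0, sigma, d) / np.abs(rng.normal(0, 1, d)) ** (1 / lam)

def icsa(objective, lower, upper, N=30, t_max=500,
         alpha=0.5, pa=0.25, gamma=0.1, pr=0.5):
    """Illustrative rendering of Algorithm 2 (ICSA)."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    d = lower.size
    X = lower + rng.random((N, d)) * (upper - lower)             # Equation (5)
    fit = np.array([objective(x) for x in X])
    best = X[fit.argmin()].copy()

    def greedy(nX):
        # Keep the better of the current and candidate nests.
        nX = np.clip(nX, lower, upper)
        nfit = np.array([objective(x) for x in nX])
        better = nfit < fit
        X[better], fit[better] = nX[better], nfit[better]

    for _ in range(t_max):
        # Global random walk (Equation (3)), then the CIS per Algorithm 2.
        nX = X + alpha * np.array([levy(d) for _ in range(N)])
        for i in range(N):
            if rng.random() > pr:
                if rng.random() < gamma:
                    nX[i] = best + alpha * (nX[i] - best)        # Equation (6)
                else:
                    r1 = rng.random()
                    nX[i] = rng.uniform(-r1, r1, d) * nX[i]      # Equation (7)
        greedy(nX)

        # Local random walk (Equation (4)): abandon a fraction pa of nests.
        gate = rng.random((N, d)) < pa
        step = rng.random((N, d)) * gate * (X[rng.permutation(N)] - X[rng.permutation(N)])
        greedy(X + step)
        best = X[fit.argmin()].copy()

    return best, float(fit.min())
```

For example, with the objective wrapper sketched in Section 2.1, the call `icsa(obj, [-1.0, -1.0], [1.0, 1.0])` would attempt to drive the F1-style residuals to zero.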

5. Outcomes and Discussion

This section validates the performance of the proposed algorithm, ICSA, to examine its efficacy and demonstrate its superiority over some well-established optimization algorithms under various statistical analyses. The best, average (Avg), worst, and standard deviation (SD) of the fitness values were obtained over 30 independent trials, and the Wilcoxon rank-sum test was used to determine significance. The compared algorithms included the slime mould algorithm (SMA, 2020) [22], marine predators algorithm (MPA, 2020) [23], bat algorithm (BA, 2012) [24], salp swarm algorithm (SSA, 2017) [25], standard cuckoo search algorithm (CSA, 2009) [26], and flower pollination algorithm (FPA, 2012) [27]. The algorithms were implemented in MATLAB R2019a with the parameters cited in their papers, under the same operating conditions as the proposed algorithm; those conditions are the maximum number of iterations, the population size, and the number of independent runs, set to 500, 30, and 30, respectively. A computer with 32 GB of RAM, an Intel(R) Core(TM) i7-4700MQ CPU @ 2.40 GHz, and a 64-bit operating system (Windows 10) was used to conduct all the experiments.
To validate the performance of our proposed algorithm, 34 test cases of nonlinear equation systems widely used in the literature were adopted. Most of these equations are widely used in the design of cybersecurity models, such as intrusion detection and threat models, to differentiate between small variations of normal and abnormal activities in CPSs. The characteristics of these functions, namely the number of dimensions (D), the search space for each dimension, the formulas, and their references, are presented in Table 1.
To adjust the main effective parameters of the proposed algorithm, namely α, γ, and pr, extensive experiments were performed with various values for each parameter on F12, and their outcomes over 30 independent trials are depicted in Figure 1. Inspecting this figure shows that the near-optimal values for α, γ, and pr were 0.5, 0.1, and 0.5, respectively. The value of pr was set to 0.5 instead of 0.6 because the algorithm was better able to minimize the objective value at this setting.
Table 2 reports the best, worst, and Avg objective values, in addition to the SD, obtained after running each algorithm 30 independent times on test functions F1–F28. On one side, ICSA achieved the best value of the Best metric for 26 of the 28 test cases, reaching the lowest possible value of 0 for 19 of those 26; this shows the superiority of our proposed algorithm in minimizing the objective values in comparison with the other algorithms. For the Avg, Worst, and SD measures, ICSA was best for 21 test cases, which indicates that the proposed algorithm is not fully stable, since its outcomes were relatively diversified across the independent runs. This is our main limitation and needs to be addressed in future work.
Furthermore, the proposed algorithm was compared with the others regarding the convergence speed, to see which algorithm converged to the optimal solution most quickly. This can be used to select the most relevant features or to fit normal and abnormal observations under multivariate distributions. The convergence curves obtained by the various algorithms for 21 test cases selected from F1–F28 are depicted in Figures 2–22. These figures show that the proposed algorithm reached a lower objective value faster than the others.
The Wilcoxon rank-sum test [55] was used to assess the significance of the differences between the outcomes obtained by the proposed algorithm and each compared algorithm. Each algorithm was executed 30 independent times, and the outcomes were compared at a 5% significance level; the results of this test are presented in Table 3. Inspecting this table shows that the proposed algorithm reached a p-value of less than 0.05 for 22 test cases, supporting the alternative hypothesis that there is a difference between the outcomes of ICSA and each compared algorithm.
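Such a per-function comparison can be reproduced with SciPy's implementation of the rank-sum test; in the sketch below (ours), the two run vectors are synthetic placeholders standing in for 30-run objective values:

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(4)
# Synthetic placeholders for the final objective values of 30 independent runs.
icsa_runs = rng.exponential(scale=1e-12, size=30)
other_runs = rng.exponential(scale=1e-8, size=30)

stat, p_value = ranksums(icsa_runs, other_runs)
h = int(p_value < 0.05)  # h = 1: significant difference at the 5% level
print(f"p-value = {p_value:.3g}, h = {h}")
```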
Additionally, the outcomes of the algorithms on test cases F29–F34 are shown in Table 4, which shows the superiority of ICSA for F29, F30, F31, and F32 in terms of the best, Avg, worst, and SD values; for the other two test cases, only the best objective value was better. The convergence curves obtained by the various algorithms for the same test cases are presented in Figure 23 and Figure 24, which show that our proposed algorithm moved toward the optimal solution faster; hence, the number of function evaluations required for reaching the optimal solution is significantly decreased compared to the other algorithms used in our comparison.
Last but not least, the algorithms in our experiments were also compared in terms of the CPU time consumed until completing the optimization process for each test case. Each algorithm was executed for 30 independent runs, the time consumed within those runs on all test cases was measured, and the average consumption per test case is presented in Figure 25. This figure shows that SSA ranked first in terms of CPU time, while BA, FPA, MPA, CSA, and ICSA came in second, third, fourth, fifth, and sixth, respectively. Although ICSA occupied the sixth rank in terms of consumed time, its final accuracy and convergence speed make it a strong alternative for tackling NESs, as it can reach better outcomes with fewer function evaluations; hence, the overall consumed time is minimized.
Ultimately, ICSA and the standard algorithm were compared with each other using boxplots to analyze the efficacy of our improvement strategy. The proposed algorithm and the standard one were independently executed 30 times, and the objective values obtained for 15 test cases are shown in Figures 26–30. These figures show that ICSA was better for all test cases except F4 and F12, depicted in Figure 27a and Figure 29c, where CSA achieved better outcomes. As a result, our improvement strategy had a significant, positive effect on the performance of the standard algorithm, achieving better outcomes in fewer iterations and enhancing the ability to find small variances between legitimate and suspicious observations in the CPS domain, thereby improving the performance of machine learning-based intrusion detection techniques.
From the above, it is concluded that our modification to the standard CSA significantly improved its performance when solving nonlinear equation systems. This improvement is due to the ability of the integrated strategy to avoid falling into local optima and to accelerate convergence toward the optimal solution. However, ICSA could not outperform some optimization algorithms in terms of computational cost, and its stability remains a limitation; both will be addressed in future work by integrating the CIS with one of several continuous evolutionary and swarm intelligence algorithms, such as natural evolution strategies [38], particle swarm optimization in the estimation of distribution algorithms (EDAs) framework [39], the EDAs [40], and the covariance matrix adaptation evolution strategy [41], which have not yet been applied to tackle NESs.

6. Conclusions and Future Work

This paper has presented a new algorithm with strong merits to promote searchability for solving systems of nonlinear equations with a low number of function evaluations and fast convergence to the near-optimal solution. This is one of the challenges in the cyber-physical domain, especially finding small variations between the normal and abnormal behaviors of nonlinear attributes. The algorithm is based on integrating the cuckoo search algorithm with a novel strategy to produce a new variant, named the improved cuckoo search algorithm (ICSA), with high convergence speed and final accuracy within a small number of function evaluations. To assess the performance of ICSA, 34 well-known nonlinear equation systems were used to evaluate its effectiveness in reaching the optimal solution within 15,000 function evaluations (the population size multiplied by the maximum number of iterations). ICSA was also extensively compared with the standard cuckoo search algorithm and five well-established algorithms—the slime mould optimizer, marine predators algorithm, salp swarm algorithm, bat algorithm, and flower pollination algorithm—to affirm its superiority. Experimental findings affirmed that ICSA performed better for 32 out of 34 test cases in terms of the best objective value, while for the Avg, worst, and SD values it performed better for 25 test cases; improving the stability of the algorithm across all runs so that it fulfills the same outcomes remains one of our main limitations to be addressed in future work. Additionally, the convergence curves and the Wilcoxon rank-sum test were used to confirm the convergence speed and significance of our proposed algorithm, which affirmed that ICSA was better than the compared algorithms. In the future, we will integrate the proposed algorithm into a dynamic wrapper feature selection algorithm that will assist in finding clear boundaries between legitimate and anomalous nonlinear attributes, improving the identification of anomalous events when applying classification algorithms.

Author Contributions

Conceptualization, M.A.-B. and R.M.; methodology, M.A.-B. and R.M.; software, M.A.-B. and R.M.; validation, N.M. (Nazeeruddin Mohammad), K.S. and N.M. (Nour Moustafa); formal analysis, M.A.-B. and R.M.; investigation, N.M. (Nazeeruddin Mohammad), K.S. and N.M. (Nour Moustafa); resources, M.A.-B. and R.M.; data curation, M.A.-B., R.M. and K.S.; writing—original draft preparation, M.A.-B. and R.M.; writing—review and editing, M.A.-B., R.M., N.M. (Nazeeruddin Mohammad), K.S. and N.M. (Nour Moustafa); visualization, M.A.-B., R.M., N.M. (Nazeeruddin Mohammad), K.S. and N.M. (Nour Moustafa); supervision, M.A.-B., N.M. (Nazeeruddin Mohammad), N.M. (Nour Moustafa); project administration, M.A.-B., N.M. (Nazeeruddin Mohammad), N.M. (Nour Moustafa); funding acquisition, N.M. (Nour Moustafa). All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to thank the PMU Cybersecurity Center and UNSW Canberra for supporting this research.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank the PMU Cybersecurity Center and UNSW Canberra for supporting this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Moustafa, N.; Slay, J. The evaluation of Network Anomaly Detection Systems: Statistical analysis of the UNSW-NB15 data set and the comparison with the KDD99 data set. Inf. Secur. J. A Glob. Perspect. 2016, 25, 18–31. [Google Scholar] [CrossRef]
  2. Moustafa, N.; Slay, J.; Creech, G. Novel Geometric Area Analysis Technique for Anomaly Detection Using Trapezoidal Area Estimation on Large-Scale Networks. IEEE Trans. Big Data 2019, 5, 481–494. [Google Scholar] [CrossRef]
  3. Facchinei, F.; Kanzow, C. Generalized Nash Equilibrium Problems. Ann. Oper. Res. 2009, 175, 177–211. [Google Scholar] [CrossRef]
  4. Liao, Z.; Gong, W.; Wang, L. Memetic niching-based evolutionary algorithms for solving nonlinear equation system. Expert Syst. Appl. 2020, 149, 113261. [Google Scholar] [CrossRef]
  5. Darvishi, M.; Barati, A. A third-order Newton-type method to solve systems of nonlinear equations. Appl. Math. Comput. 2007, 187, 630–635. [Google Scholar] [CrossRef]
  6. Knoll, D.; Keyes, D. Jacobian-free Newton–Krylov methods: A survey of approaches and applications. J. Comput. Phys. 2004, 193, 357–397. [Google Scholar] [CrossRef] [Green Version]
  7. Abdel-Basset, M.; Chang, V.; Mohamed, R. A novel equilibrium optimization algorithm for multi-thresholding image segmentation problems. Neural Comput. Appl. 2020, 1–34. [Google Scholar] [CrossRef]
  8. Abdel-Basset, M.; Chang, V.; Mohamed, R. HSMA_WOA: A hybrid novel Slime mould algorithm with whale optimization algorithm for tackling the image segmentation problem of chest X-ray images. Appl. Soft Comput. 2020, 95, 106642. [Google Scholar] [CrossRef]
  9. Abdel-Basset, M.; El-Shahat, D.; Chakrabortty, R.K.; Ryan, M. Parameter estimation of photovoltaic models using an improved marine predators algorithm. Energy Convers. Manag. 2021, 227, 113491. [Google Scholar] [CrossRef]
  10. Abdel-Basset, M.; Mohamed, R.; Abouhawwash, M. Balanced multi-objective optimization algorithm using improvement based reference points approach. Swarm Evol. Comput. 2021, 60, 100791. [Google Scholar] [CrossRef]
  11. Allaoui, M.; Ahiod, B.; El Yafrani, M. A hybrid crow search algorithm for solving the DNA fragment assembly problem. Expert Syst. Appl. 2018, 102, 44–56. [Google Scholar] [CrossRef]
  12. Abdel-Basset, M.; Mohamed, R.; Abouhawwash, M.; Chakrabortty, R.; Ryan, M. A Simple and Effective Approach for Tackling the Permutation Flow Shop Scheduling Problem. Mathematics 2021, 9, 270. [Google Scholar] [CrossRef]
  13. Abdel-Basset, M.; Mohamed, R.; Chakrabortty, R.K.; Sallam, K.; Ryan, M.J. An efficient teaching-learning-based optimization algorithm for parameters identification of photovoltaic models: Analysis and validations. Energy Convers. Manag. 2021, 227, 113614. [Google Scholar] [CrossRef]
  14. Abdel-Basset, M.; Mohamed, R.; Elhoseny, M.; Bashir, A.K.; Jolfaei, A.; Kumar, N. Energy-aware marine predators algorithm for task scheduling in IoT-based fog computing applications. IEEE Trans. Ind. Inform. 2020, 17, 5068–5076. [Google Scholar] [CrossRef]
  15. Abdel-Basset, M.; Mohamed, R.; Elhoseny, M.; Chakrabortty, R.K.; Ryan, M. A Hybrid COVID-19 Detection Model Using an Improved Marine Predators Algorithm and a Ranking-Based Diversity Reduction Strategy. IEEE Access 2020, 8, 79521–79540. [Google Scholar] [CrossRef]
  16. Abdel-Basset, M.; Mohamed, R.; Elhoseny, M.; Chakrabortty, R.K.; Ryan, M.J. An efficient heap-based optimization algorithm for parameters identification of proton exchange membrane fuel cells model: Analysis and case studies. Int. J. Hydrogen Energy 2021, 46, 11908–11925. [Google Scholar] [CrossRef]
  17. Abdel-Basset, M.; Mohamed, R.; Chakrabortty, R.K.; Ryan, M.J.; Mirjalili, S. An efficient binary slime mould algorithm integrated with a novel attacking-feeding strategy for feature selection. Comput. Ind. Eng. 2021, 153, 107078. [Google Scholar] [CrossRef]
  18. Abdel-Basset, M.; Mohamed, R.; Abouhawwash, M.; Chakrabortty, R.K.; Ryan, M.J. EA-MSCA: An effective energy-aware multi-objective modified sine-cosine algorithm for real-time task scheduling in multiprocessor systems: Methods and analysis. Expert Syst. Appl. 2021, 173, 114699. [Google Scholar] [CrossRef]
  19. Abdel-Basset, M.; Mohamed, R.; Mirjalili, S.; Chakrabortty, R.K.; Ryan, M.J. Solar photovoltaic parameter estimation using an improved equilibrium optimizer. Sol. Energy 2020, 209, 694–708. [Google Scholar] [CrossRef]
  20. Civicioglu, P.; Besdok, E. Bernstain-search differential evolution algorithm for numerical function optimization. Expert Syst. Appl. 2019, 138, 112831. [Google Scholar] [CrossRef]
  21. Keshk, M.; Sitnikova, E.; Moustafa, N.; Hu, J.; Khalil, I. An Integrated Framework for Privacy-Preserving Based Anomaly Detection for Cyber-Physical Systems. IEEE Trans. Sustain. Comput. 2019, 6, 66–79. [Google Scholar] [CrossRef]
  22. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  23. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar] [CrossRef]
  24. Yang, X.-S.; He, X. Bat algorithm: Literature review and applications. Int. J. Bio Inspired Comput. 2013, 5, 141–149. [Google Scholar] [CrossRef] [Green Version]
  25. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  26. Yang, X.-S.; Deb, S. Cuckoo search via Lévy flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009. [Google Scholar]
  27. Yang, X.-S. Flower pollination algorithm for global optimization. In Proceedings of the International Conference on Unconventional Computing and Natural Computation, Orléans, France, 3–7 September 2012. [Google Scholar]
  28. Wu, J.; Cui, Z.; Liu, J. Using hybrid social emotional optimization algorithm with metropolis rule to solve nonlinear equations. In Proceedings of the IEEE 10th International Conference on Cognitive Informatics and Cognitive Computing (ICCI-CC’11), Banff, AB, Canada, 18–20 August 2011. [Google Scholar]
  29. Wu, Z.; Kang, L. A fast and elitist parallel evolutionary algorithm for solving systems of non-linear equations. In Proceedings of the 2003 Congress on Evolutionary Computation, 2003. CEC’03, Canberra, ACT, Australia, 8–12 December 2003. [Google Scholar]
  30. Luo, Y.-Z.; Tang, G.-J.; Zhou, L.-N. Hybrid approach for solving systems of nonlinear equations using chaos optimization and quasi-Newton method. Appl. Soft Comput. 2008, 8, 1068–1073. [Google Scholar] [CrossRef]
  31. Wu, J.; Gong, W.; Wang, L. A clustering-based differential evolution with different crowding factors for nonlinear equations system. Appl. Soft Comput. 2021, 98, 106733. [Google Scholar] [CrossRef]
  32. Rizk-Allah, R.M. A quantum-based sine cosine algorithm for solving general systems of nonlinear equations. Artif. Intell. Rev. 2021, 1–52. [Google Scholar] [CrossRef]
  33. Mangla, C.; Ahmad, M.; Uddin, M. Optimization of complex nonlinear systems using genetic algorithm. Int. J. Inf. Technol. 2020, 1–13. [Google Scholar] [CrossRef]
  34. Hassan, O.F.; Jamal, A.; Abdel-Khalek, S. Genetic algorithm and numerical methods for solving linear and nonlinear system of equations: A comparative study. J. Intell. Fuzzy Syst. 2020, 38, 2867–2872. [Google Scholar] [CrossRef]
  35. Jaiswal, S.; Kumar, C.S.; Seepana, M.M.; Babu, G.U.B. Design of Fractional Order PID Controller Using Genetic Algorithm Optimization Technique for Nonlinear System. Chem. Prod. Process. Model. 2020, 15. [Google Scholar] [CrossRef]
  36. El-Shorbagy, M.A.; El-Refaey, A.M. Hybridization of Grasshopper Optimization Algorithm with Genetic Algorithm for Solving System of Non-Linear Equations. IEEE Access 2020, 8, 220944–220961. [Google Scholar] [CrossRef]
  37. Wetweerapong, J.; Puphasuk, P. An improved differential evolution algorithm with a restart technique to solve systems of nonlinear equations. Int. J. Optim. Control. Theor. Appl. (IJOCTA) 2020, 10, 118–136. [Google Scholar] [CrossRef] [Green Version]
  38. Wierstra, D.; Schaul, T.; Peters, J.; Schmidhuber, J. Natural evolution strategies. In Proceedings of the 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), Hong Kong, China, 1–6 June 2008. [Google Scholar]
  39. Santucci, V.; Milani, A. Particle swarm optimization in the EDAs framework. In Soft Computing in Industrial Applications; Springer: Berlin/Heidelberg, Germany, 2011; pp. 87–96. [Google Scholar]
  40. Larrañaga, P.; Lozano, J.A. Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2001; Volume 2. [Google Scholar]
  41. Hansen, N. The CMA evolution strategy: A comparing review. Towards New Evol. Comput. 2006, 192, 75–102. [Google Scholar]
  42. Sarvari, S.; Sani, N.F.M.; Hanapi, Z.M.; Abdullah, M.T. An efficient anomaly intrusion detection method with feature selection and evolutionary neural network. IEEE Access 2020, 8, 70651–70663. [Google Scholar] [CrossRef]
  43. Song, W.; Wang, Y.; Li, H.-X.; Cai, Z. Locating multiple optimal solutions of nonlinear equation systems based on multiobjective optimization. IEEE Trans. Evol. Comput. 2015, 19, 414–431. [Google Scholar] [CrossRef]
  44. Grosan, C.; Abraham, A. A New Approach for Solving Nonlinear Equations Systems. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2008, 38, 698–714. [Google Scholar] [CrossRef]
  45. Pourjafari, E.; Mojallali, H. Solving nonlinear equations systems with a new approach based on invasive weed optimization algorithm and clustering. Swarm Evol. Comput. 2012, 4, 33–43. [Google Scholar] [CrossRef]
  46. Sacco, W.; Henderson, N. Finding all solutions of nonlinear systems using a hybrid metaheuristic with Fuzzy Clustering Means. Appl. Soft Comput. 2011, 11, 5424–5432. [Google Scholar] [CrossRef]
  47. Hirsch, M.J.; Pardalos, P.M.; Resende, M.G. Solving systems of nonlinear equations with continuous GRASP. Nonlinear Anal. Real World Appl. 2009, 10, 2000–2006. [Google Scholar] [CrossRef]
  48. Sharma, J.R.; Arora, H. On efficient weighted-Newton methods for solving systems of nonlinear equations. Appl. Math. Comput. 2013, 222, 497–506. [Google Scholar] [CrossRef]
  49. Ingber, L.; Petraglia, A.; Petraglia, M.R. Adaptive simulated annealing. In Stochastic Global Optimization and Its Applications with Fuzzy Adaptive Simulated Annealing; Springer: Berlin/Heidelberg, Germany, 2012; pp. 33–62. [Google Scholar]
  50. Morgan, A.; Shapiro, V. Box-bisection for solving second-degree systems and the problem of clustering. ACM Trans. Math. Softw. 1987, 13, 152–167. [Google Scholar] [CrossRef]
  51. Grau-Sánchez, M.; Grau, À.; Noguera, M. Frozen divided difference scheme for solving systems of nonlinear equations. J. Comput. Appl. Math. 2011, 235, 1739–1743. [Google Scholar] [CrossRef]
  52. Hueso, J.L.; Martínez, E.; Torregrosa, J.R. Modified Newton’s method for systems of nonlinear equations with singular Jacobian. J. Comput. Appl. Math. 2009, 224, 77–83. [Google Scholar] [CrossRef] [Green Version]
  53. Waziri, M.; Leong, W.J.; Hassan, M.A.; Monsi, M. An efficient solver for systems of nonlinear equations with singular Jacobian via diagonal updating. Appl. Math. Sci. 2010, 4, 3403–3412. [Google Scholar]
  54. Turgut, O.E.; Turgut, M.S.; Coban, M.T. Chaotic quantum behaved particle swarm optimization algorithm for solving nonlinear system of equations. Comput. Math. Appl. 2014, 68, 508–530. [Google Scholar] [CrossRef]
  55. Kasuya, E. Mann-Whitney U-test when variances are unequal. Anim. Behav. 2001, 61, 1247–1249. [Google Scholar] [CrossRef] [Green Version]
Figure 1. The parameter tuning for the proposed algorithm.
Figure 2. Convergence curve for F1.
Figure 3. Convergence curve for F2.
Figure 4. Convergence curve for F3.
Figure 5. Convergence curve for F4.
Figure 6. Convergence curve for F5.
Figure 7. Convergence curve for F7.
Figure 8. Convergence curve for F8.
Figure 9. Convergence curve for F9.
Figure 10. Convergence curve for F10.
Figure 11. Convergence curve for F11.
Figure 12. Convergence curve for F12.
Figure 13. Convergence curve for F13.
Figure 14. Convergence curve for F14.
Figure 15. Convergence curve for F15.
Figure 16. Convergence curve for F16.
Figure 17. Convergence curve for F17.
Figure 18. Convergence curve for F18.
Figure 19. Convergence curve for F19.
Figure 20. Convergence curve for F20.
Figure 21. Convergence curve for F21.
Figure 22. Convergence curve for F28.
Figure 23. Convergence curves of the algorithms for various test cases: (a) F29; (b) F30; (c) F31.
Figure 24. Convergence curves of the algorithms for various test cases: (a) F32; (b) F33; (c) F34.
Figure 25. Comparison among algorithms in terms of CPU time.
Figure 26. Boxplots of CSA and ICSA for various test cases: (a) F1; (b) F2; (c) F3.
Figure 27. Boxplots of CSA and ICSA for various test cases: (a) F4; (b) F5; (c) F6.
Figure 28. Boxplots of CSA and ICSA for various test cases: (a) F7; (b) F8; (c) F9.
Figure 29. Boxplots of CSA and ICSA for various test cases: (a) F10; (b) F11; (c) F12.
Figure 30. Boxplots of CSA and ICSA for various test cases: (a) F13; (b) F14; (c) F15.
Table 1. Descriptions of the nonlinear equation systems used in our experiments.

| Function | Formulas | D | Search space | Ref. |
|---|---|---|---|---|
| F1 | $x_1 - \sin(5\pi x_2) = 0$; $x_1 - x_2 = 0$ | 2 | $x_i \in [-1, 1],\ i = 1, 2$ | [43] |
| F2 | $x_1 - \cos(4\pi x_2) = 0$; $x_1^2 + x_2^2 - 1 = 0$ | 2 | $x_i \in [-10, 10],\ i = 1, 2$ | [43] |
| F3 | $x_1 - 0.25428722 - 0.18324757\,x_4 x_3 x_9 = 0$; $x_2 - 0.37842197 - 0.16275449\,x_1 x_{10} x_6 = 0$; $x_3 - 0.27162577 - 0.16955071\,x_1 x_2 x_{10} = 0$; $x_4 - 0.19807914 - 0.15585316\,x_7 x_1 x_6 = 0$; $x_5 - 0.44166728 - 0.19950920\,x_7 x_6 x_3 = 0$; $x_6 - 0.14654113 - 0.18922793\,x_8 x_5 x_{10} = 0$; $x_7 - 0.42937161 - 0.21180486\,x_2 x_5 x_8 = 0$; $x_8 - 0.07056438 - 0.17081208\,x_1 x_7 x_6 = 0$; $x_9 - 0.34504906 - 0.19612740\,x_{10} x_6 x_8 = 0$; $x_{10} - 0.42651102 - 0.21466544\,x_4 x_8 x_1 = 0$ | 10 | $x_i \in [-10, 10],\ i = 1, \dots, 10$ | [44] |
| F4 | $3.0 x_1 - x_3^2 = 0$; $x_3 \sin(\pi x_2) - x_3 x_4 = 0$; $-x_2 x_3 \exp(1.0 - x_1 x_3) + 0.2707 = 0$; $2 x_1^2 x_3 - x_2^4 - x_3 x_2 = 0$ | 4 | $x_i \in [0, 5],\ i = 1, \dots, 4$ | [45] |
| F5 | $4 x_1^3 + 4 x_1 x_2 + 2 x_2^2 - 42 x_1 - 14 = 0$; $4 x_2^3 + 2 x_1^2 + 4 x_1 x_2 - 16 x_2 - 22 = 0$ | 2 | $x_i \in [-20, 20],\ i = 1, 2$ | [46] |
| F6 | $\sin(x_1)\cos(x_2) - 2\cos(x_1)\sin(x_2) = 0$; $\cos(x_1)\sin(x_2) - 2\sin(x_1)\cos(x_2) = 0$ | 2 | $x_i \in [0, \pi],\ i = 1, 2$ | [47] |
| F7 | $x_1^2 + x_2^2 - 1.0 = 0$; $x_3^2 + x_4^2 - 1.0 = 0$; $x_5^2 + x_6^2 - 1.0 = 0$; $x_7^2 + x_8^2 - 1.0 = 0$; $4.731 \cdot 10^{-3} x_1 x_3 - 0.3578 x_2 x_3 - 0.1238 x_1 + x_7 - 1.637 \cdot 10^{-3} x_2 - 0.9338 x_4 - 0.3571 = 0$; $0.2238 x_1 x_3 + 0.7623 x_2 x_3 + 0.2638 x_1 - x_7 - 0.07745 x_2 - 0.6734 x_4 - 0.6022 = 0$; $x_6 x_8 + 0.3578 x_1 + 4.731 \cdot 10^{-3} x_2 = 0$; $-0.7623 x_1 + 0.2238 x_2 + 0.3461 = 0$ | 8 | $x_i \in [-1, 1],\ i = 1, \dots, 8$ | [4] |
| F8 | $x_i - \cos\big(2 x_i - \sum_{j=1}^{D} x_j\big) = 0,\ i = 1, \dots, D$ | 3 | $x_i \in [-20, 20]$ | [48] |
| F9 | $x_1^2 - x_2^2 = 0$; $x_1 + \sin(\frac{\pi}{2} x_2) = 0$ | 2 | $x_1 \in [0, 1]$, $x_2 \in [-10, 0]$ | [45] |
| F10 | $x_1^2 + x_2^2 + x_1 + x_2 - 8 = 0$; $x_1 |x_2| + x_1 + |x_2| - 5 = 0$ | 2 | $x_1, x_2 \in [-30, 30]$ | [49] |
| F11 | $x_1^2 - |x_2| + 1 + \frac{1}{9}|x_1 - 1| = 0$; $x_2^2 + 5 x_1^2 - 7 + \frac{1}{9}|x_2| = 0$ | 2 | $x_1 \in [-1, 1]$, $x_2 \in [-10, 10]$ | [49] |
| F12 | $\sum_{i=1}^{D} x_i^2 - 1 = 0$; $|x_1 - x_2| + \sum_{i=3}^{D} x_i^2 = 0$ | 20 | $x_i \in [-1, 1]$ | [43] |
| F13 | $2 x_1 + x_2 + x_3 + x_4 + x_5 - 6.0 = 0$; $x_1 + 2 x_2 + x_3 + x_4 + x_5 - 6.0 = 0$; $x_1 + x_2 + 2 x_3 + x_4 + x_5 - 6.0 = 0$; $x_1 + x_2 + x_3 + 2 x_4 + x_5 - 6.0 = 0$; $x_1 x_2 x_3 x_4 x_5 - 1.0 = 0$ | 5 | $x_i \in [-2, 2]$ | [50] |
| F14 | $x_1^2 - x_1 - x_2^2 - x_2 + x_3^2 = 0$; $\sin(x_2 \exp(x_1)) = 0$; $x_3 - \log|x_2| = 0$ | 3 | $x_1 \in [0, 2]$, $x_2 \in [-10, 10]$, $x_3 \in [-1, 1]$ | [51] |
| F15 | $\cos(x_2) - \sin(x_1) = 0$; $x_3^{x_1} - \frac{1}{x_2} = 0$; $\exp(x_1) - x_3^2 = 0$ | 3 | $x_i \in [0, 5],\ i = 1, 2, 3$ | [4] |
| F16 | $(x_1 - 1)^4 \exp(x_2) = 0$; $(x_2 - 2)^5 (x_1 x_2 - 1) = 0$; $(x_3 + 4)^6 = 0$ | 3 | $x_i \in [-5, 5],\ i = 1, 2, 3$ | [52] |
| F17 | $\exp(x_1^2) - 8 x_1 = 0$; $x_1 + x_2 - 1 = 0$; $(x_3 - 1)^3 = 0$ | 3 | $x_i \in [-5, 5],\ i = 1, 2, 3$ | [52] |
| F18 | $x_1^3 - x_1 x_2 x_3 = 0$; $x_2^2 - x_1 x_3 = 0$; $10 x_1 x_2 x_3 - x_1 - 0.1 = 0$ | 3 | $x_i \in [-5, 5],\ i = 1, 2, 3$ | [53] |
| F19 | $\sin(x_1^3) - 3 x_1 x_2^2 - 1 = 0$; $\cos(3 x_1^2 x_2) - |x_2^3| + 1 = 0$ | 2 | $x_1, x_2 \in [-2, 2]$ | [4] |
| F20 | $4 x_1^3 - 3 x_1 - \cos(x_2) = 0$; $\sin(x_1^2) - |x_2| = 0$ | 2 | $x_1, x_2 \in [-2, 2]$ | [4] |
| F21 | $\exp(x_1^2 + x_2^2) - 3 = 0$; $|x_2| + x_1 + x_2^2 - \sin(3|x_2| + x_1) = 0$ | 2 | $x_1, x_2 \in [-2, 2]$ | [4] |
| F22 | $-3.84 x_1^2 + 3.84 x_1 - x_2 = 0$; $-3.84 x_2^2 + 3.84 x_2 - x_3 = 0$; $-3.84 x_3^2 + 3.84 x_3 - x_1 = 0$ | 3 | $x_1 \in [0, 10]$, $x_2 \in [0, 10]$, $x_3 \in [0, 1]$ | [4] |
| F23 | $x_1^4 + x_2^4 - x_1 x_2^3 - 6 = 0$; $|1 - x_1^2 - x_2^2| - 0.6787 = 0$ | 2 | $x_1, x_2 \in [-20, 20]$ | [4] |
| F24 | $0.5 x_1^2 + 0.5 x_2^2 + x_1 + x_2 - 8 = 0$; $|x_1| x_2 + x_1 + |x_2| x_1 - 5 = 0$ | 2 | $x_1, x_2 \in [-5, 5]$ | [4] |
| F25 | $4 \sin(4 x_1) - x_2 = 0$; $x_1^2 + x_2^2 - 15 = 0$ | 2 | $x_1, x_2 \in [-20, 20]$ | [4] |
| F26 | $\cos(2 x_1) - \cos(2 x_2) - 0.4 = 0$; $2(x_2 - x_1) + \sin(2 x_2) - \sin(2 x_1) - 1.2 = 0$ | 2 | $x_1, x_2 \in [-15, 15]$ | [4] |
| F27 | $x_1 + 0.5 x_2^2 - 5 = 0$; $x_1 + 5 \sin(\frac{\pi x_2}{2}) = 0$ | 2 | $x_1, x_2 \in [-5, 5]$ | [4] |
| F28 | $x_1^2 + x_2^2 - 1 = 0$; $20 x_1^2 x_2 + 2 x_2^5 + 1 = 0$ | 2 | $x_1, x_2 \in [-5, 5]$ | [4] |
| F29 | $x_1^{x_2} + x_2^{x_1} - 5 x_1 x_2 x_3 - 85 = 0$; $x_1^3 - x_2^{x_3} - x_3^{x_2} - 60 = 0$; $x_1^{x_3} - x_3^{x_1} - x_2 - 2 = 0$ | 3 | $x_1 \in [3, 5]$, $x_2 \in [2, 4]$, $x_3 \in [0.5, 2]$ | [54] |
| F30 | $x_1^3 - 3 x_1 x_2^2 - 1 = 0$; $3 x_1^2 x_2 - x_2^3 + 1 = 0$ | 2 | $x_1, x_2 \in [-10, 10]$ | [54] |
| F31 | $x_1^2 + x_3^2 - 1 = 0$; $x_2^2 + x_4^2 - 1 = 0$; $x_5 x_3^3 + x_6 x_4^3 = 0$; $x_5 x_1^3 + x_6 x_2^3 = 0$; $x_5 x_1 x_3^2 + x_6 x_2 x_4^2 = 0$; $x_5 x_3 x_1^2 + x_6 x_4 x_2^2 = 0$ | 6 | $x_i \in [-10, 10]$ | [54] |
| F32 | $0.5 \sin(x_1 x_2) - \frac{0.25 x_2}{\pi} - 0.5 x_1 = 0$; $\left(1 - \frac{0.25}{\pi}\right)(\exp(2 x_1) - e) + \frac{e x_2}{\pi} - 2 e x_1 = 0$ | 2 | $x_1 \in [0.25, 1]$, $x_2 \in [1.5, 2\pi]$ | [54] |
| F33 | $3 x_1 - \cos(x_2 x_3) - 0.5 = 0$; $x_1^2 - 625 x_2^2 - 0.25 = 0$; $\exp(-x_1 x_2) + 20 x_3 + \frac{10\pi - 3}{3} = 0$ | 3 | $x_i \in [-10, 10]$ | [52] |
| F34 | $x_1 + 0.25 x_2^2 x_4 x_6 + 0.75 = 0$; $x_2 + 0.405 \exp(1 + x_1 x_2) - 1.405 = 0$; $x_3 - 0.5 x_4 x_6 + 1.5 = 0$; $x_4 - 0.605 \exp(1 - x_3^2) - 0.395 = 0$; $x_5 - 0.5 x_2 x_6 + 1.5 = 0$; $x_6 - x_1 x_5 = 0$ | 6 | $x_i \in [-2, 2]$ | [54] |
Table 2. Comparison among algorithms on test cases F1–F28.

[Table 2 reports, for each of F1–F28, the Best, Avg, Worst, and SD of the objective values obtained by ICSA, BA, FPA, CSA, MPA, SSA, and SMA over 30 independent runs; bold values indicate the best outcomes.]
Table 3. Comparison under the Wilcoxon rank-sum test.

[Table 3 reports, for each test case F1–F28, the p-value and hypothesis flag h of the Wilcoxon rank-sum test comparing ICSA against BA, FPA, CSA, MPA, SSA, and SMA; h = 1 indicates a significant difference at the 5% level.]
Table 4. Comparison among algorithms based on the objective values for test cases F29–F34.

[Table 4 reports, for each of F29–F34, the Best, Avg, Worst, and SD of the objective values obtained by ICSA, BA, FPA, CSA, MPA, SSA, and SMA over 30 independent runs.]
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

