Article

Soil Erosion Prediction Based on Moth-Flame Optimizer-Evolved Kernel Extreme Learning Machine

1 College of Computer Science and Technology, Jilin University, Changchun 130012, China
2 Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
3 Chengdu Kestrel Artificial Intelligence Institute, Chengdu 611730, China
4 College of Computer Science and Artificial Intelligence, Wenzhou University, Wenzhou 325035, China
5 Department of Computer Science, Birzeit University, P.O. Box 14, West Bank, Birzeit 627, Palestine
6 Department of Information Technology, College of Computers and Information Technology, P.O. Box 11099, Taif University, Taif 21944, Saudi Arabia
* Authors to whom correspondence should be addressed.
Electronics 2021, 10(17), 2115; https://doi.org/10.3390/electronics10172115
Submission received: 3 August 2021 / Revised: 24 August 2021 / Accepted: 26 August 2021 / Published: 31 August 2021
(This article belongs to the Collection Electronics for Agriculture)

Abstract

Soil erosion control is a complex, integrated management process. It is built on unified planning that adjusts the land use structure and reasonably configures engineering, vegetation, and farming measures to form a complete erosion control system, while respecting the laws of soil erosion, economic and social development, and ecological and environmental security. Accurate prediction and quantitative forecasting of soil erosion are critical reference indicators for comprehensive erosion control. This paper applies a new swarm intelligence optimization algorithm, an enhanced moth-flame optimizer with a sine–cosine mechanism (SMFO), to the soil erosion classification and prediction problem. The sine–cosine strategy improves the exploration and exploitation capability of the optimizer, which is used to tune the penalty parameter and the kernel parameter of the kernel extreme learning machine (KELM) for the rainfall-induced soil erosion classification problem, yielding more-accurate classification and prediction results. A dataset from Son La province, Vietnam, was used for model evaluation and testing, and the experimental results show that the SMFO-KELM method predicts accurately, with significant advantages in classification accuracy (ACC), Matthews correlation coefficient (MCC), sensitivity, and specificity. Compared with other optimizer-based models, the adopted method is better suited to the accurate classification of soil erosion, and can provide new solutions for natural soil supply capacity analysis, integrated erosion management, and environmental sustainability assessment.

1. Introduction

With over 98.8% of the world’s human food coming from the land and less than 1.2% from marine and aquatic ecosystems, protecting arable land and maintaining soil fertility is vital to human well-being [1,2]. Soil erosion is one of the most critical threats to the world’s food production [3,4,5]. Globally, about ten million hectares of arable land are lost each year to soil erosion, leaving less arable land available for world food production [6]. The loss of arable land is a serious problem; according to reports of the World Health Organization and the Food and Agriculture Organization of the United Nations (FAO), two-thirds of the global population is still undernourished [7]. Soil erosion is the process by which soil particles and surface appurtenances are eroded, moved, and deposited through the interaction of a number of factors [8]. Soil erosion can be divided into hydraulic erosion, cultivation erosion, wind erosion, freeze–thaw erosion, and gravity erosion [9,10]. It leads to numerous serious hazards, directly causing soil acidification, soil sanding, soil consolidation, and water pollution.
Moreover, soil erosion diminishes the productivity of terrestrial ecosystems; it increases water runoff, which decreases water infiltration and the soil’s water storage capacity. In addition, during erosion, organic matter and essential plant nutrients are lost and the depth of the soil nutrient layer shrinks; all these changes depress plant growth and decrease valuable biota and the overall biodiversity of the soil [11,12,13]. These factors interact to create soil erosion, a major environmental problem shared globally. China is one of the countries with the most serious soil erosion, with 4.92 million km2 of eroded soil nationwide, accounting for 51% of the total land area, including 1.79 million km2 of hydraulic erosion, accounting for 36% of the total soil erosion area [11]. Hydraulic erosion is the most significant erosion threat in the country. Its main cause is rainfall: raindrops hit the soil and loosen the soil particles, and once the slope exceeds about 2%, the soil starts to move downhill; the impact of erosion is felt on all slopes, with more topsoil being carried away as water moves downslope into valleys and streams [14]. The accurate prediction of soil erosion is key to soil conservation and management efforts, and is an important guide to quantitative soil erosion forecasting, soil and water planning, and other integrated soil erosion management.
There are many causes of soil erosion, including geological features, climate, soil, vegetation, and hydrology. Many researchers have studied soil erosion, and several models are currently available, such as the universal soil loss equation (USLE) model [15], the Revised Universal Soil Loss Equation (RUSLE) model [16], the AGricultural Non-Point Source Pollution (AGNPS) model [17], the Annualized Agricultural Non-Point Source Pollutant (AnnAGNPS) model [18], the water erosion prediction project (WEPP) model [19], and the artificial neural network (ANN) model [20,21,22,23,24]. Traditional models offer limited computational accuracy and robustness, and do not meet real-world needs. Research into soil erosion has accordingly moved from simple models to machine learning models, and in recent years many new machine learning classification methods have been applied to soil erosion prediction. Dinh et al. [25,26] proposed the following two methods: the first predicts soil erodibility based on a combination of multivariate adaptive regression splines and the social spider algorithm; the other predicts soil erodibility based on adaptive differential evolution and support vector classification. Fathizad et al. [27] put forward a random forest model with a set of covariates to model the spatial and temporal dynamics of soil quality in the central Iranian desert, where the coefficient of determination between the soil quality index and the covariates was 0.69. Chen et al. [28] compared the predictive performance of the boosted linear model (BLM), boosted regression tree, boosted generalized linear model, and deep boosting models for piping erosion susceptibility mapping in the Zarandieh watershed, located in the Markazi province of Iran. Chowdhuri [29] produced gully erosion susceptibility maps using machine learning algorithms, including the boosted regression tree, Bayesian additive regression tree, support vector regression, and an SVR-Bee ensemble. Lee et al. [30] used decision tree, K-nearest neighbors, random forest, gradient boosting, extreme gradient boosting, and deep neural network models to estimate the rainfall erosivity factor. Nguyen et al. [31] applied multivariate adaptive regression splines and random forest, together with boosting methods including Cubist and gradient boosting machines.
Although machine learning research has been carried out in various fields for many years, its application to the soil erosion classification problem has remained limited to relatively simple classifiers, such as the support vector machine (SVM) [32], k-nearest neighbor (KNN) [33], random forest (RF) [34], gradient boosting (GB) [35], and extreme gradient boosting (EGB) [36]. KELM is a classification and prediction model proposed by G. Huang et al. [37] in 2011. It introduces the kernel idea into the extreme learning machine, which guarantees a generalization performance similar to that of SVM, and often better. It is a derivative of the extreme learning machine (ELM) [38], a class of machine learning methods built on feedforward neural networks for supervised and unsupervised learning problems. Shan et al. [39] suggested a hybrid artificial fish particle swarm optimizer and KELM for a type-II diabetes prediction model. Cai et al. [40] proposed label rectification learning through KELM. Zhang et al. [41] presented a prospective bankruptcy prediction model based on LSEOFOA and KELM. Yu et al. [42] proposed a KELM model optimized by an improved butterfly optimization algorithm, which offered a timely, accurate, and dependable basis for identifying rolling bearing conditions in real production applications. Shan et al. [43] applied WEMFO to train KELM; the resulting optimized WEMFO-KELM model was used to solve six clinical disease classification problems.
As can be observed from the above studies, machine learning performance is constrained by parameter settings. Swarm intelligence optimization algorithms are good candidates for optimizing the parameters of individual machine learning classifiers to obtain higher classification accuracy. A swarm intelligence optimization algorithm simulates the behavior of groups of insects, animals, birds, or fish that cooperatively search for food [44,45,46,47,48,49]. Each group member constantly changes the direction of its search by learning from its own experience and the experience of other members [50,51,52,53,54,55,56,57,58,59]. Any algorithm or distributed problem-solving strategy inspired by insect groups or other mechanisms of animal social behavior is part of swarm intelligence [60,61,62,63]. These optimizers can be classified according to many criteria [64,65]. Classical swarm optimization algorithms include the firefly algorithm (FA) [66], Runge Kutta method (RUN) (https://aliasgharheidari.com/RUN.html (accessed on 25 August 2021)) [67], gravitational search algorithm (GSA) [68], whale optimizer (WOA) [69,70], moth-flame optimizer (MFO) [69], bat algorithm (BA) [71], Harris hawks optimizer (HHO) (https://aliasgharheidari.com/HHO.html (accessed on 25 August 2021)) [72], fruit fly optimization algorithm (FOA) [73,74,75], slime mould algorithm (SMA) (https://aliasgharheidari.com/SMA.html (accessed on 25 August 2021)) [76], hunger games search (HGS) (https://aliasgharheidari.com/HGS.html (accessed on 25 August 2021)) [77], differential evolution (DE) [78], continuous ant colony optimization (ACOR) [79], multi-verse optimizer (MVO), particle swarm optimizer (PSO) [80,81], simulated annealing algorithm (SA) [82,83], sine cosine algorithm (SCA) [69], and grasshopper optimization algorithm (GOA) [69,84]. However, we should note that the originality of some of these methods, such as GWO, BA, and FA, is limited and has been criticized in several papers [44,85,86]. Meanwhile, many corresponding improved algorithms exist, such as the enhanced comprehensive learning particle swarm optimizer (GCLPSO) [87], random spare ant colony optimization (RCACO) [88], enhanced whale optimizer with associative learning (BMWOA) [89], enhanced GWO with a new hierarchical structure (IGWO) [48], hybridized grey wolf optimization (HGWO) [90], boosted GWO (OBLGWO) [91], and ant colony optimizer with random spare strategy and chaotic intensification strategy (CCACO) [88]. MFO is a novel meta-heuristic algorithm for solving optimization problems. Its main inspiration is the navigation method used by moths in nature, namely lateral orientation: moths fly at night by maintaining a fixed angle relative to the moon, a very efficient mechanism for travelling long distances in a straight line. The MFO algorithm is widely used in many engineering and optimization problems, but its principles and structure are relatively simple [92,93]; it suffers from poor exploration and is prone to falling into local or deceptive optima (LO) during successive iterations. Many researchers have recently worked on adding improvement mechanisms to the MFO algorithm to address these problems. This paper uses a novel SMFO [94] algorithm that introduces a sine–cosine strategy into MFO, further improving its exploration capability. The exploratory and exploitative behavior of the method and its convergence pattern are significantly improved, as validated on engineering optimization problems.
The framework of this paper is schematically shown in Figure 1 below. This paper uses the SMFO-KELM approach, choosing a dataset of 236 samples from the Son La province of Vietnam and constructing a prediction and validation model using ten explanatory factors as features. Firstly, we use five-fold cross-validation to optimize the parameter settings of the KELM; secondly, we use ten-fold cross-validation to classify soil erosion predictions; and finally, we compare six original algorithm models, such as BA-KELM, and four improved algorithm models, such as CLOFOA-KELM. The experimental results show that the adopted SMFO-KELM method obtains much better soil erosion classification prediction results.
In summary, the main contributions of this paper are as follows:
1. A new and improved swarm intelligence optimization algorithm, SMFO, combined with the machine learning model KELM, is proposed.
2. SMFO is applied for the first time to optimize and determine the key parameters of KELM.
3. SMFO-KELM is applied for the first time to a soil erosion classification prediction model.
4. The classification prediction results of SMFO-KELM for soil erosion are significantly better than those of other algorithms in the following four aspects: accuracy, Matthews correlation coefficient, sensitivity, and specificity.
The chapters of this paper are structured as follows: Section 2 presents the materials and methods, mainly including the methods SMFO, KELM, and the dataset. Section 3 presents the experimental results and evaluation indicators. Section 4 is the discussion and outlook section.

2. Materials and Methods

2.1. Dataset

This paper uses a soil dataset from two limited experimental areas eroded by heavy rainfall in Son La, a province in north-western Vietnam, collected over the three years from 2009 to 2011 [26] and shown in Appendix A Table A1. The area has a tropical monsoon climate with higher levels of soil erosion from heavy rainfall than other latitudes, so the chosen experimental site offers clearly contrasting data. Plots of 4 m × 18 m were selected as unit plots, and there are 24 such plots; the shape of the plots was chosen randomly without restriction. To ensure the accuracy of data acquisition, the explanatory factor data were acquired using water pipes set beneath the surface runoff plots, OC measurements with the carbonate component removed followed by a C/N analyzer, transect methods for coverage, and interpolation techniques for residues, respectively. Cultivation methods, fertilizer application, and soil conservation measures all followed the traditional farming practices of local farmers. Based on multiple soil erosion prediction models as a comprehensive reference and the experimentally obtained data, the following ten explanatory variables were identified as explanatory factors for soil erosion classification, as shown in Table 1 below.
In Table 1, EI30 is the extended peak rate of detachment and runoff over 30 min. The storm energy E is given by the following formula:
$$E = 1099\left[1 - 0.72\exp(-1.27\,i)\right]$$
where $i$ is the maximum 30-min intensity. The product of $E$ and $I_{30}$ gives the dynamic rainfall energy (EI), a combination of the total and peak intensities of each storm. This number represents the combined particle detachment and transport capability.
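For concreteness, a minimal sketch of this computation is given below (the function names and the example intensity value are illustrative, not from the source):

```python
import numpy as np

def storm_energy(i30):
    """Storm energy E = 1099 [1 - 0.72 exp(-1.27 i)], where i30 is the
    maximum 30-min rainfall intensity of a storm."""
    return 1099.0 * (1.0 - 0.72 * np.exp(-1.27 * i30))

def rainfall_erosivity(i30):
    """EI: product of the storm energy E and the 30-min peak intensity I30."""
    return storm_energy(i30) * i30

# Illustrative example: a storm with a 30-min maximum intensity of 2.0
print(rainfall_erosivity(2.0))
```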
Slope degree is the degree of slope in the terrain, i.e., the length and gradient of the slope. It is a key factor in soil erosion: the steeper and longer the slope, the more runoff accumulates and the greater the probability of soil erosion. These data were collected using a Nikon Forestry (550) inclinometer to measure the slope of the plots.
Soil erodibility is also affected by permeability, structure, organic material, and pH value. Two simple soil characteristics, OC (organic carbon) and pH, were used as interpretive parameters for soil erodibility. OC was obtained with a C/N analyzer (minus HCl), and pH was measured with a glass electrode using a water-to-soil ratio of 2.5:1.
Bulk density, topsoil porosity, topsoil texture (silt fraction, clay fraction, sand fraction), and soil cover rate are important influencing factors normally used by traditional models [95].
In the model prediction of soil erodibility, we ultimately aim to classify the samples (each associated with a vector of explanatory variable values) into the following two categories: “erosion” or “non-erosion”. To ensure the accuracy of the experimental classification, we used the same soil-loss criterion as in [96]: samples that lost more than 3 tons of soil per hectare are defined as ‘erosive’ and the rest as ‘non-erosive’. The experimental data consist of 236 samples, 50% erosive and 50% non-erosive.
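A minimal sketch of this labelling rule follows (the 3 t/ha threshold is from the text; the array values and names are illustrative):

```python
import numpy as np

# Illustrative per-plot soil losses in tons per hectare.
soil_loss_t_per_ha = np.array([0.5, 4.2, 12.06, 1.8])

# Samples losing more than 3 t/ha are labelled erosive, the rest non-erosive.
labels = np.where(soil_loss_t_per_ha > 3.0, "erosive", "non-erosive")
print(labels)  # ['non-erosive' 'erosive' 'erosive' 'non-erosive']
```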

2.2. KELM

The kernel extreme learning machine (KELM) combines the extreme learning machine (ELM) algorithm with a kernel function: the kernel function replaces the feature mapping of the hidden layer in ELM, forming a kernel-based ELM algorithm. Because of the kernel function, the data features are mapped to a higher dimension and can therefore be separated more precisely.
ELM is a fast learning algorithm for single hidden-layer neural networks that randomly initializes the input weights and biases and analytically obtains the corresponding output weights. For a single hidden-layer neural network, assume there are $N$ arbitrary samples $(X_i, t_i)$, where $X_i = [x_{i1}, x_{i2}, \ldots, x_{in}]^T \in \mathbb{R}^n$ and $t_i = [t_{i1}, t_{i2}, \ldots, t_{im}]^T \in \mathbb{R}^m$. A single hidden-layer neural network with $L$ hidden nodes can be represented as follows:
$$\sum_{i=1}^{L} \beta_i\, g(W_i \cdot X_j + b_i) = o_j, \quad j = 1, 2, \ldots, N$$
where $g(x)$ is the activation function, $W_i = [w_{i,1}, w_{i,2}, \ldots, w_{i,n}]^T$ is the input weight vector, $\beta_i$ is the output weight, and $b_i$ is the bias of the $i$-th hidden-layer unit. $W_i \cdot X_j$ denotes the inner product of $W_i$ and $X_j$.
The goal of single hidden-layer neural network learning is to minimize the output error, expressed as follows:
$$\sum_{j=1}^{N} \left\| o_j - t_j \right\| = 0$$
i.e., there exist $\beta_i$, $W_i$, and $b_i$ such that the following applies:
$$\sum_{i=1}^{L} \beta_i\, g(W_i \cdot X_j + b_i) = t_j, \quad j = 1, 2, \ldots, N$$
This can be written in matrix form as follows:
$$H\beta = T$$
where $H$ is the hidden-layer output matrix, $\beta$ is the output weight matrix, and $T$ is the desired output.
$$H(W_1, \ldots, W_L, b_1, \ldots, b_L, X_1, \ldots, X_N) = \begin{bmatrix} g(W_1 \cdot X_1 + b_1) & \cdots & g(W_L \cdot X_1 + b_L) \\ \vdots & \ddots & \vdots \\ g(W_1 \cdot X_N + b_1) & \cdots & g(W_L \cdot X_N + b_L) \end{bmatrix}_{N \times L}$$
where $\beta = [\beta_1, \beta_2, \ldots, \beta_L]^T_{L \times m}$ and $T = [t_1, t_2, \ldots, t_N]^T_{N \times m}$.
In order to train the single hidden-layer neural network, we want to obtain $\hat{W}_i$, $\hat{b}_i$, and $\hat{\beta}_i$ such that the following holds:
$$\left\| H(\hat{W}_i, \hat{b}_i)\,\hat{\beta}_i - T \right\| = \min_{W, b, \beta} \left\| H(W_i, b_i)\,\beta_i - T \right\|$$
where $i = 1, 2, \ldots, L$, which is equivalent to minimizing the loss function:
$$E = \sum_{j=1}^{N} \left( \sum_{i=1}^{L} \beta_i\, g(W_i \cdot X_j + b_i) - t_j \right)^2$$
Some traditional gradient descent-based algorithms can solve such problems, but basic gradient-based learning algorithms require all parameters to be adjusted during the iterative process. In the ELM algorithm, once the input weights $W_i$ and the hidden-layer biases $b_i$ are determined randomly, the hidden-layer output matrix $H$ is uniquely determined. Training a single hidden-layer neural network is thus transformed into solving the linear system $H\beta = T$, and the output weights $\beta$ can be determined as follows:
$$\hat{\beta} = H^{+} T$$
where $H^{+}$ is the Moore–Penrose generalized inverse of the matrix $H$; in particular, $H^{+} = H^T (H H^T)^{-1}$. Moreover, it can be shown that the norm of the resulting solution $\hat{\beta}$ is minimal and unique. As a result, ELM achieves a powerful generalization performance while significantly increasing learning speed.
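To make the training procedure concrete, here is a minimal NumPy sketch of the basic ELM fit via the Moore–Penrose pseudo-inverse (a sketch under the definitions above, with a sigmoid activation chosen for illustration; it is not the authors' code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elm_fit(X, T, L, seed=0):
    """Train a basic ELM. X: (N, n) inputs, T: (N, m) targets, L: hidden nodes."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((L, X.shape[1]))  # random input weights W_i
    b = rng.standard_normal(L)                # random hidden biases b_i
    H = sigmoid(X @ W.T + b)                  # hidden-layer output matrix H
    beta = np.linalg.pinv(H) @ T              # output weights: beta = H^+ T
    return W, b, beta

def elm_predict(X, W, b, beta):
    return sigmoid(X @ W.T + b) @ beta
```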
Kernel-based ELM was proposed in order to improve the generalization ability of ELM beyond that of the least-squares-based ELM. A positive constant $C$ is introduced: the term $I/C$ is added to the diagonal of $H H^T$ when calculating the output weight $\beta$, giving the following:
$$\beta = H^T \left( \frac{I}{C} + H H^T \right)^{-1} T$$
where the coefficient $C$ is the penalty parameter and $I$ is the identity matrix.
Hence, the output function is defined below:
$$f(x) = h(x)\, H^T \left( \frac{I}{C} + H H^T \right)^{-1} T$$
A kernel matrix of the ELM is obtained by the following:
$$\Omega_{ELM} = H H^T: \quad \Omega_{ELM}(i, j) = h(x_i) \cdot h(x_j) = K(x_i, x_j)$$
where $K(x_i, x_j)$ is a kernel function. The output function then becomes the following:
$$f(x) = \begin{bmatrix} K(x, x_1) \\ \vdots \\ K(x, x_N) \end{bmatrix}^T \left( \frac{I}{C} + \Omega_{ELM} \right)^{-1} T$$
The kernel implementation of the ELM, called KELM, has better stability and generalization capabilities than the basic ELM. The structure of the KELM model is schematically shown in Figure 2, where the kernel function acts as an alternative feature mapping function achieving the same mapping from the input to the feature space. Hence, the neural network’s output is independent of the feature mapping of the hidden layer and depends only on the kernel function, which is explicitly provided. Neither the feature mapping of the hidden layer nor the dimensionality of the feature space needs to be pre-defined.
In this paper, we use the Gaussian kernel function as the kernel function of KELM, with the following formula:
$$K(u, v) = \exp\left( -\gamma \left\| u - v \right\|^2 \right)$$
The penalty parameter $C$ and the kernel parameter $\gamma$ are two critical parameters of the KELM model. The penalty parameter $C$ balances the minimization of the fitting error against model complexity, and the kernel parameter $\gamma$ determines the non-linear mapping from the input space to a specific high-dimensional hidden-layer feature space. In general, these two parameters can be tuned with appropriate optimization algorithms to further improve the performance of KELM.
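Putting Equations (10)–(14) together, a minimal sketch of KELM training and prediction with the Gaussian kernel might look as follows (class and variable names are illustrative; this is not the authors' MATLAB implementation):

```python
import numpy as np

def gaussian_kernel(A, B, gamma):
    """K(u, v) = exp(-gamma * ||u - v||^2) for all row pairs of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

class KELM:
    def __init__(self, C=1.0, gamma=1.0):
        self.C, self.gamma = C, gamma

    def fit(self, X, T):
        self.X = X
        omega = gaussian_kernel(X, X, self.gamma)   # Omega_ELM = H H^T
        n = X.shape[0]
        # alpha = (I/C + Omega)^(-1) T, so that f(x) = K(x, X) alpha
        self.alpha = np.linalg.solve(np.eye(n) / self.C + omega, T)
        return self

    def predict(self, Xq):
        return gaussian_kernel(Xq, self.X, self.gamma) @ self.alpha
```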
KELM is widely used to solve problems in areas such as parameter optimization and model prediction because of its significant advantages in learning speed and generalization ability.

2.3. SMFO

2.3.1. MFO

The basic MFO was proposed in 2015 [47] as a swarm intelligence optimization algorithm based on the spiral flight path of moths.
MFO models the lateral orientation navigation mechanism of moths found in nature. At night, moths fly using the distant moon as a reference, which can be considered parallel light, and they adjust their flight direction according to the angle between themselves and the light. When the light source is a nearby artificial flame, flying at a fixed angle to it causes the distance between the moth and the flame to change continuously, eventually producing a flight path that spirals in towards the flame [97]. The MFO algorithm has strong parallel optimization capabilities and good overall properties for non-convex functions; since non-convex functions have a large number of local optima, the MFO algorithm can explore the search space extensively and find regions with a greater probability of containing the global optimum [98,99,100].
By definition, moths and flames are the two important components of the MFO algorithm, as can be observed from Figure 3 above. The moths fly in a $d$-dimensional hyperplane ($d = 1, 2, 3, \ldots$); the search agents store their positions in the matrix $M$, and the fitness value of each moth is stored in the array $OM$. The flames are the best positions that the moths have reached so far; they are stored in the matrix $F$, and their fitness values are stored in the array $OF$. Every moth updates its position with respect to its flame according to the following equation:
$$M_i = S(M_i, F_j)$$
where $M_i$ indicates the $i$-th moth, $F_j$ is the $j$-th flame after sorting, and $S$ is the spiral function. This spiral function should fulfill the conditions below:
(1) The vector position of the initial point of the $S$ function must be given before the MFO algorithm can perform the corresponding calculation.
(2) Before the end of each iteration of the MFO algorithm, the $S$ function should preserve the location of the optimal solution found in that iteration.
(3) The function’s range lies between the upper bound vector ub and the lower bound vector lb. Considering these points, the equation is defined as follows:
$$S(M_i, F_j) = D_i \cdot e^{bt} \cdot \cos(2\pi t) + F_j$$
where $b$ is a constant defining the shape of the logarithmic spiral, $t$ is a random value in the range $[-1, 1]$, and $D_i$ is the distance of the $i$-th moth to the $j$-th flame, calculated as follows:
$$D_i = \left| F_j - M_i \right|$$
The parameter $t$ determines the step size of the moth’s next movement. Equation (16) has a limitation: it only defines how a moth flies towards a flame, which makes the MFO algorithm prone to falling into local optima. To avoid this problem, an adaptive update of the flames is required; the number of flames is gradually reduced, which also reduces computation time and improves operating efficiency. The flame update formula is shown in Equation (18) as follows:
$$\mathrm{flame\_no} = \mathrm{round}\left( N - k\,\frac{N - 1}{T} \right)$$
where $k$ is the current iteration number, $N$ is the maximum number of flames, and $T$ indicates the maximum number of iterations. When the end-of-iteration condition is satisfied, the best moth is returned as the best obtained solution.
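A minimal sketch of the spiral position update (Equations (16) and (17)) and the flame-count reduction (Equation (18)) is shown below; the shape constant b = 1 is a common default stated here as an assumption:

```python
import numpy as np

def spiral_update(moth, flame, b=1.0, rng=None):
    """One MFO move: fly towards the flame along a logarithmic spiral."""
    rng = rng or np.random.default_rng()
    D = np.abs(flame - moth)                 # D_i = |F_j - M_i|
    t = rng.uniform(-1.0, 1.0, moth.shape)   # t in [-1, 1]
    return D * np.exp(b * t) * np.cos(2 * np.pi * t) + flame

def flame_number(k, N, T):
    """Adaptively shrink the number of flames over iterations (Eq. (18))."""
    return int(round(N - k * (N - 1) / T))
```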

2.3.2. SMFO

SMFO [94] improves the global exploration capability of MFO by incorporating the sine–cosine algorithm (SCA), which increases the diversity of initial solutions and frees solutions from local optima. At the same time, the adjustment parameters of the sine–cosine strategy increase the accuracy of the optimal solution.
The core of the sine–cosine strategy (SCA) is to modify the position of the initial state through changes in the value of a mathematical function [101,102,103], as shown in Figure 4. The update of individual positions in the population relies on changes in the values of the sine and cosine functions to randomly update the position of each individual in each iteration, using multi-parameter adjustment to ensure that the population remains diverse in the early stages and that individuals converge towards the optimum in the later stages. During each iteration, the state of each individual is updated using the following formula:
$$X_i^{t+1} = \begin{cases} X_i^t + r_1 \times \sin(r_2) \times \left| r_3 P_i^t - X_i^t \right|, & r_4 < 0.5 \\ X_i^t + r_1 \times \cos(r_2) \times \left| r_3 P_i^t - X_i^t \right|, & r_4 \geq 0.5 \end{cases}$$
where $X_i^t$ is the position of the current solution in the $i$-th dimension at the $t$-th iteration, $P_i^t$ is the position of the current optimal solution (the destination) in the $i$-th dimension at the $t$-th iteration, and $|\cdot|$ denotes the absolute value.
The parameter $r_1$ defines whether the next position is searched between the solution and the destination or beyond it; it enhances the global exploration capability of the MFO algorithm. The parameter $r_2$ determines the step of the next position update. $r_3$ is a random weight that decides the impact of the destination on the current solution. $r_4$ is the random probability of switching between the sine and cosine functions. The recurring pattern of the sine and cosine functions makes one solution relocate around another. To ensure exploitation of the space identified between the two solutions, Equation (20) is introduced as follows:
$$r_1 = a - t\,\frac{a}{T}$$
where $t$ is the current iteration number, $T$ is the maximum number of iterations, and $a$ is a constant, generally set to 2. This formula adaptively adjusts the parameter so that the exploration gradually converges towards the global optimum.
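The following minimal sketch shows the sine–cosine update of Equations (19) and (20) that SMFO injects into MFO (a = 2 as in the text; the sampling ranges for r2, r3, and r4 follow the standard SCA and are stated here as assumptions):

```python
import numpy as np

def sca_update(X, P, t, T, a=2.0, rng=None):
    """Sine-cosine position update of solution X towards the destination P."""
    rng = rng or np.random.default_rng()
    r1 = a - t * a / T                        # Eq. (20): decreasing amplitude
    r2 = rng.uniform(0, 2 * np.pi, X.shape)   # direction/step angle
    r3 = rng.uniform(0, 2, X.shape)           # random weight on the destination
    r4 = rng.uniform(0, 1, X.shape)           # sine/cosine switch probability
    sin_move = X + r1 * np.sin(r2) * np.abs(r3 * P - X)
    cos_move = X + r1 * np.cos(r2) * np.abs(r3 * P - X)
    return np.where(r4 < 0.5, sin_move, cos_move)
```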

2.4. SMFO-KELM for Soil Erosion Prediction Method

The performance of machine learning models is often closely related to their hyperparameters. Initially, the “optimal” hyperparameters were usually found by manual trial and error; however, this approach is inefficient, so swarm intelligence optimization has been proposed to find optimal hyperparameters. From the experiments comparing SMFO with other swarm intelligence optimization algorithms mentioned above, it can be observed that SMFO is significantly better than similar algorithms in terms of exploration and exploitation capability, with competitive convergence and balance, and it has clear advantages in engineering optimization problems [94]. Here, SMFO optimizes the penalty parameter and kernel parameter of the kernel extreme learning machine to make more-accurate predictions for the soil erosion classification problem. Figure 5 shows the flowchart of the proposed SMFO-KELM soil erosion classification prediction model, which comprises two main processes: model optimization and classification evaluation. To obtain reliable and unbiased results, the classification evaluation uses ten-fold cross-validation to evaluate the classifier’s performance, with nine folds used for training and one for testing, while the optimization of the two classifier parameters uses five-fold cross-validation on the training data. This experimental scheme helps to obtain unbiased estimates of generalization accuracy and reliable results. The final evaluation is carried out with accuracy (ACC), the Matthews correlation coefficient (MCC), sensitivity, and specificity. Due to random sampling, a single 10-fold cross-validation is not representative of classification accuracy; therefore, for all methods, the results of ten 10-fold cross-validation runs are averaged as the final evaluation result.
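A minimal sketch of the fitness function used in the parameter-optimization stage is given below: each candidate (C, γ) pair proposed by SMFO is scored by its inner five-fold cross-validated accuracy. The KELM class from the sketch in Section 2.2 is reused; the use of scikit-learn for the folds and the ±1 label encoding are assumptions, since the paper's own code is MATLAB:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

def kelm_fitness(params, X, y):
    """Inner 5-fold CV accuracy of a KELM candidate.
    params = (log2 C, log2 gamma); y is assumed to use labels in {-1, +1}."""
    C, gamma = 2.0 ** params[0], 2.0 ** params[1]
    accs = []
    for tr, va in StratifiedKFold(5, shuffle=True, random_state=0).split(X, y):
        model = KELM(C=C, gamma=gamma).fit(X[tr], y[tr])
        pred = np.sign(model.predict(X[va]))
        accs.append((pred == y[va]).mean())
    return np.mean(accs)  # SMFO maximizes this value
```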
The 236 soil erosion binary classification samples from Son La province, Vietnam, were used to evaluate the SMFO-KELM model, with the ten explanatory factors (EI30, slope degree, OC topsoil, pH topsoil, bulk density, topsoil porosity, topsoil texture (silt fraction), topsoil texture (clay fraction), topsoil texture (sand fraction), and soil cover rate) as classification features; in the hyperparameter selection stage, SMFO optimizes the penalty parameter C and the kernel parameter γ of the KELM.

2.5. Experimental Environment

To ensure the fairness and validity of the experiments [62,64,104,105], all the algorithms involved in the comparison were run under the same experimental conditions. The population size was set to 20, the maximum number of evaluations (MaxFEs) was uniformly set to 100, and all algorithms were tested 30 times independently to reduce the influence of randomness. The search spaces of the two hyperparameters in KELM were set to $C \in \{2^{-15}, \ldots, 2^{15}\}$ and $\gamma \in \{2^{-15}, \ldots, 2^{15}\}$. All experimental results were evaluated using box plots with the following four metrics: ACC, MCC, sensitivity, and specificity.
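These bounds are conveniently encoded in exponent space, so that the optimizer searches log2 C and log2 γ uniformly; a minimal sketch of the population initialization under that assumption:

```python
import numpy as np

# Both hyperparameters are searched as base-2 exponents in [-15, 15].
lb = np.array([-15.0, -15.0])  # lower bounds for (log2 C, log2 gamma)
ub = np.array([15.0, 15.0])    # upper bounds
rng = np.random.default_rng(0)
population = lb + (ub - lb) * rng.random((20, 2))  # population size 20
```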
All experiments were performed on a computer with a 3.40 GHz Intel® Core i7 processor and 16 GB RAM; the code was written in MATLAB R2018b.

2.6. Measures for Performance Evaluation

For a binary classification problem, the ground-truth values are only positive and negative, and the predicted results likewise take only two values. If an instance is positive and is predicted to be positive, it is a true positive (TP); if it is negative and is predicted to be positive, it is a false positive (FP); if it is negative and is predicted to be negative, it is a true negative (TN); and if it is positive and is predicted to be negative, it is a false negative (FN). The most widely used metrics based on these counts are ACC, MCC, sensitivity, and specificity [106,107], which are used to assess the quality of the binary classification and evaluate the proposed method’s performance.
$$ACC = \frac{TP + TN}{TP + FP + FN + TN} \times 100\%$$
$$MCC = \frac{TP \times TN - FP \times FN}{\sqrt{(TP + FP)(TP + FN)(TN + FP)(TN + FN)}} \times 100\%$$
$$Sensitivity = \frac{TP}{TP + FN} \times 100\%$$
$$Specificity = \frac{TN}{FP + TN} \times 100\%$$
The MCC is essentially a correlation coefficient between the actual classification and the predicted classification. It ranges from +1, indicating a perfect prediction, through 0, indicating a prediction no better than random, to −1, indicating total disagreement between the predicted and actual classifications. Sensitivity (also known as the true positive rate) is the proportion of actually positive samples that are judged positive, and specificity (also known as the true negative rate) is the proportion of actually negative samples that are judged negative.
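A minimal sketch computing the four metrics of Equations (21)–(24) from confusion-matrix counts (function name illustrative):

```python
import numpy as np

def binary_metrics(tp, tn, fp, fn):
    """ACC, MCC, sensitivity, and specificity from confusion-matrix counts."""
    acc = (tp + tn) / (tp + fp + fn + tn)
    denom = np.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (fp + tn)  # true negative rate
    return acc, mcc, sensitivity, specificity
```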

3. Results

Figure 6 shows a comparison between SMFO-KELM and six classifiers built on classical original optimization algorithms (MFO-KELM, BA-KELM, GSA-KELM, MVO-KELM, GOA-KELM, and WOA-KELM) on the binary classification dataset of 236 samples with 10 features from Son La province. The boxplots show that SMFO-KELM obtained the best performance in terms of ACC, MCC, sensitivity, and specificity. BA-KELM ranked last on all four metrics. Furthermore, MVO-KELM produced low values for ACC, MCC, and specificity, though not for sensitivity, which suggests that MVO-KELM may be less effective in predicting soil erosion than the other comparative methods. WOA-KELM produced better means than BA-KELM, MFO-KELM, GSA-KELM, GOA-KELM, and MVO-KELM in terms of ACC, MCC, and specificity.
Figure 7 shows the performance of SMFO-KELM against four other advanced, improved algorithmic classifiers on the four metrics. It can be observed very clearly that SMFO-KELM performs better than the other competitors on all four metrics. After SMFO-KELM, HGWO-KELM outperformed the remaining competitors in ACC, MCC, and sensitivity, followed by OBLGWO-KELM and IGWO-KELM; CLOFOA-KELM achieved average values of ACC, MCC, sensitivity, and specificity lower than those of the other classifiers. This shows that the CLOFOA-KELM classifier is not a good choice for soil erosion classification problems.
It is clear from Figure 6, Figure 7 and Figure 8 that the SMFO-KELM classifier generally outperforms the other competing models, both the classical optimized and the advanced algorithmic classifiers, since the adopted SMFO optimizer has the highest optimization power. All the experimental results can be viewed in Appendix A Table A2. The KELM improved with HGWO does not achieve champion classification performance; nevertheless, it is the second-best optimizer in the comparison. Given the features of the proposed model, the efficacy of the proposed SMFO method can be further investigated on more complex problems, such as social recommendation and QoS-aware service composition [108,109,110], energy storage planning and scheduling [111], image editing [112,113,114], service ecosystems [115,116], epidemic prevention and control [117,118], active surveillance [119], large-scale network analysis [120], pedestrian dead reckoning [121], and the evaluation of human lower limb motions [122].

4. Conclusions

The use of swarm intelligence optimization algorithms to optimize machine learning parameters is becoming more widespread in the study of classification problems, and such hybrid algorithms perform better than the original machine learning models. This paper proposes a robust and accurate machine learning method, SMFO-KELM, that effectively solves the soil erosion prediction problem. The model’s main idea is to apply a new and improved MFO algorithm, SMFO, to optimize the penalty parameter C and the kernel parameter γ of the KELM classifier, thereby improving its generalization capability. The improved SMFO is obtained by integrating the sine–cosine mechanism into the original MFO. This approach provides higher consistency in global optimization, improves the balance between exploration and exploitation, and increases the convergence speed.
From the results of the experiments in this paper, it can be concluded that, for the discrete soil data classification problem, the SMFO-KELM model is significantly superior to the MFO-KELM model. It can be observed that the sine–cosine strategy used by SMFO has a positive effect on the parameter optimization of the kernel extreme learning machine; comparing this with other algorithms, such as the BA-KELM, CLOFOA-KELM, IGWO-KELM, and OBLGWO-KELM models, it can be observed that SMFO-KELM outperforms the other classifier models on four commonly used performance metrics when solving soil erosion classification problems. Therefore, the usability of SMFO-KELM has been extended, and the proposed method can be considered a valuable early warning tool for soil erosion prediction systems, helping land management agencies to make scientifically accurate decisions.
In addition, soil erosion prediction models can be combined with other optimization algorithms, and the adopted SMFO can also be used for parameter tuning of other machine learning models, such as KNN, support vector machines, and convolutional neural networks, and can be applied to pest and disease image segmentation and feature selection problems. Other potential applications, such as fertilizer effect function optimization, reservoir regulation optimization, and combined irrigation and groundwater optimal allocation, are also exciting topics for green sustainability in agricultural engineering. More agricultural engineering optimization problems will continue to be investigated in the future.

Author Contributions

Conceptualization, C.C. and H.C.; methodology, H.C., M.M.; software, C.C.; validation, H.C., C.C. and H.T.; formal analysis, X.W., C.W.; investigation, C.C.; resources, H.C.; data curation, C.C., H.T.; writing—original draft preparation, C.C., C.W.; writing—review and editing, C.C.; visualization, C.C., M.M.; supervision, X.W.; project administration, C.C.; funding acquisition, H.C. and X.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research is supported by the National Natural Science Foundation of China (U19A2061, U1809209, 62076185), the Science and Technology Development Project of Jilin Province (20190301024NY), the Jilin Provincial Industrial Innovation Special Fund Project (2018C039-3), and Taif University Researchers Supporting Project Number (TURSP-2020/125), Taif University, Taif, Saudi Arabia. We thank Ali Asghar Heidari for his efforts during the preparation of this research.

Institutional Review Board Statement

The study did not involve humans or animals.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data used in this study can be obtained from the published paper.

Acknowledgments

We acknowledge the comments of the editor and anonymous reviewers that enhanced this research significantly.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Soil erosion data.
No.  a1  a2  a3  a4  a5  a6  a7  a8  a9  a10  Type
12044.0127.930.895.831.4755.4733.4730.6935.8442.091
2975.5629.5715.81.4655.2631.3530.4338.2125.911
32044.0128.171.26.591.3751.7534.9327.5737.560.191
4975.5628.171.26.591.3751.7534.9327.5737.58.981
52044.0129.61.466.681.4153.1233.1920.2946.5154.961
61138.3727.930.895.831.4755.4733.4730.6935.8429.741
7975.5628.171.336.381.4855.834.9924.9540.0518.011
82044.0130.21.285.851.452.6936.1927.9535.8651.121
92044.0129.5715.81.4655.2631.3530.4338.2144.351
101138.3729.61.466.681.4153.1233.1920.2946.5123.591
11975.5627.930.895.831.4755.4733.4730.6935.849.651
12976.9228.171.26.591.3751.7534.9327.5737.55.991
13975.5634.771.426.721.3751.8732.7324.3342.9329.91
14975.5629.81.537.061.5859.4836.3118.6145.0830.561
153008.9328.632.395.21.4454.4934.7328.8936.3753.91
161270.4628.632.325.21.4454.4934.7328.8936.3710.541
171472.526.272.415.231.3249.9232.4131.0936.494.851
182044.0134.771.426.721.3751.8732.7324.3342.9347.031
191138.3729.5715.81.4655.2631.3530.4338.2131.981
201270.4628.472.085.361.2547.0931.4734.5933.9312.331
212044.0128.171.336.381.4855.834.9924.9540.0559.981
22975.5630.21.285.851.452.6936.1927.9535.8626.911
23975.5629.61.466.681.4153.1233.1920.2946.517.651
243008.9326.272.185.231.3249.9232.4131.0936.4960.881
251138.3728.171.26.591.3751.7534.9327.5737.527.691
262044.0129.81.537.061.5859.4836.3118.6145.0865.481
273008.9328.032.275.541.349.0733.9330.1535.9244.961
281270.4626.272.415.231.3249.9232.4131.0936.4913.131
29975.5634.170.995.891.3450.6234.3135.3330.3522.071
301138.3734.771.426.721.3751.8732.7324.3342.9344.291
31685.0928.171.26.591.3751.7534.9327.5737.514.971
321138.3728.171.336.381.4855.834.9924.9540.0521.11
333008.9328.472.355.361.2547.0931.4734.5933.9358.071
34685.0929.61.466.681.4153.1233.1920.2946.5112.751
35263.0428.171.26.591.3751.7534.9327.5737.56.741
36550.2228.472.085.361.2547.0931.4734.5933.9324.661
372044.0134.170.995.891.3450.6234.3135.3330.3565.791
381472.528.632.325.21.4454.4934.7328.8936.373.691
39973.5828.171.26.591.3751.7534.9327.5737.59.731
40192.826.272.415.231.3249.9232.4131.0936.4918.751
412503.729.61.466.681.4153.1233.1920.2946.5186.71
42685.0928.171.336.381.4855.834.9924.9540.05191
431138.3730.21.285.851.452.6936.1927.9535.8635.061
448827.930.895.831.4755.4733.4730.6935.8412.061
45550.2228.632.325.21.4454.4934.7328.8936.3721.081
462044.0133.731.066.951.5759.3837.7119.5142.7761.21
47976.9227.930.895.831.4755.4733.4730.6935.846.431
481138.3734.170.995.891.3450.6234.3135.3330.3526.561
491138.3729.81.537.061.5859.4836.3118.6145.0827.731
502503.730.21.285.851.452.6936.1927.9535.8682.311
51973.5828.171.336.381.4855.834.9924.9540.0518.131
52973.5829.61.466.681.4153.1233.1920.2946.518.291
53976.9228.171.336.381.4855.834.9924.9540.0517.521
542503.727.930.895.831.4755.4733.4730.6935.8486.51
553008.9328.371.955.151.2346.3433.0732.0934.8466.181
56550.2226.272.415.231.3249.9232.4131.0936.4926.251
57976.9229.61.466.681.4153.1233.1920.2946.515.11
582503.729.5715.81.4655.2631.3530.4338.2182.91
59263.0429.61.466.681.4153.1233.1920.2946.515.741
60306.1528.632.325.21.4454.4934.7328.8936.3711.061
61180.8228.632.325.21.4454.4934.7328.8936.3716.561
621969.3928.632.395.21.4454.4934.7328.8936.3787.251
63685.0929.81.537.061.5859.4836.3118.6145.0829.651
643008.9324.832.25.971.3751.7934.3534.2531.459.031
65976.9229.5715.81.4655.2631.3530.4338.2124.941
66263.0427.930.895.831.4755.4733.4730.6935.847.231
67973.5827.930.895.831.4755.4733.4730.6935.8410.451
68152.9927.572.075.131.4855.8733.2932.1734.5470.781
692503.728.171.26.591.3751.7534.9327.5737.589.31
7089.5229.61.466.681.4153.1233.1920.2946.516.371
713008.93281.955.131.3751.5333.0731.5135.4264.441
72192.828.632.325.21.4454.4934.7328.8936.3715.061
73973.5829.5715.81.4655.2631.3530.4338.2126.161
743008.9327.92.345.51.3450.6935.8129.2334.9561.51
75685.0927.930.895.831.4755.4733.4730.6935.8416.081
76184.9328.632.325.21.4454.4934.7328.8936.3718.071
7789.5227.930.895.831.4755.4733.4730.6935.848.041
78973.5830.21.285.851.452.6936.1927.9535.8627.241
79263.0428.171.336.381.4855.834.9924.9540.0517.641
802503.734.771.426.721.3751.8732.7324.3342.93871
81685.0930.21.285.851.452.6936.1927.9535.8629.521
82976.9230.21.285.851.452.6936.1927.9535.8625.611
83973.5829.81.537.061.5859.4836.3118.6145.0830.441
843008.9327.572.075.131.4855.8733.2932.1734.5458.331
85550.2224.832.665.971.3751.7934.3534.2531.438.421
86973.5834.170.995.891.3450.6234.3135.3330.3522.251
871270.4628.032.385.541.349.0733.9330.1535.9230.711
88682.4628.632.325.21.4454.4934.7328.8936.3752.71
89685.0929.5715.81.4655.2631.3530.4338.2127.861
9085.7428.632.325.21.4454.4934.7328.8936.3713.551
911472.528.472.085.361.2547.0931.4734.5933.938.731
92685.0934.771.426.721.3751.8732.7324.3342.9334.511
9382.9927.572.085.131.4855.8733.2932.1734.5441.511
9475.0928.632.325.21.4454.4934.7328.8936.372.631
95976.9234.170.995.891.3450.6234.3135.3330.3521.361
968829.61.466.681.4153.1233.1920.2946.519.561
97550.2228.032.385.541.349.0733.9330.1535.9238.421
983008.9326.332.795.911.3751.7534.6934.2931.0166.211
99184.9326.272.415.231.3249.9232.4131.0936.4922.51
10089.5228.171.26.591.3751.7534.9327.5737.57.481
101192.828.032.385.541.349.0733.9330.1535.9234.011
102550.22282.145.131.3751.5333.0731.5135.4229.341
10382.9928.472.155.361.2547.0931.4734.5933.9330.781
104973.5834.771.426.721.3751.8732.7324.3342.9330.481
1058830.21.285.851.452.6936.1927.9535.8627.891
1068828.171.26.591.3751.7534.9327.5737.511.231
107270.9828.472.355.361.2547.0931.4734.5933.9354.141
1088829.5715.81.4655.2631.3530.4338.2126.641
109862.9628.632.395.21.4454.4934.7328.8936.3741.051
110388.729.61.466.681.4153.1233.1920.2946.5186.911
111263.0430.21.285.851.452.6936.1927.9535.8625.931
11289.5234.771.426.721.3751.8732.7324.3342.9328.751
1138834.771.426.721.3751.8732.7324.3342.9331.631
114685.0934.170.995.891.3450.6234.3135.3330.3523.511
115263.0429.5715.81.4655.2631.3530.4338.2125.191
116862.9628.472.355.361.2547.0931.4734.5933.9346.591
11785.7426.272.415.231.3249.9232.4131.0936.4916.881
118388.727.930.895.831.4755.4733.4730.6935.8486.431
119136.4433.731.066.951.5759.3837.7119.5142.7759.122
1200.1333.731.066.951.5759.3837.7119.5142.7766.792
1210.0726.272.395.231.3249.9232.4131.0936.4992.232
12251.2229.61.466.681.4153.1233.1920.2946.5132.172
123152.9932.332.645.61.4153.1133.2135.4731.3278.92
1240.0126.52.35.391.3852.0532.6529.3737.9876.612
125248.3728.032.385.541.349.0733.9330.1535.9267.522
1260.0726.52.35.391.3852.0532.6529.3737.9889.22
1270.0129.81.135.71.3450.5831.9938.3529.6678.392
1280.0627.92.345.51.3450.6935.8129.2334.9578.292
1291.5928.032.255.541.349.0733.9330.1535.9242.492
1300.0226.272.415.231.3249.9232.4131.0936.4912.472
1312.0929.81.135.71.3450.5831.9938.3529.6678.272
1326.8228.632.395.21.4454.4934.7328.8936.3788.052
1330.0127.572.085.131.4855.8733.2932.1734.5454.612
1344.6129.81.135.71.3450.5831.9938.3529.6674.212
13541.9227.92.345.51.3450.6935.8129.2334.9594.482
1360.1334.170.995.891.3450.6234.3135.3330.3568.432
1370.5327.92.365.51.3450.6935.8129.2334.9522.472
1380.1328.171.26.591.3751.7534.9327.5737.513.562
1390.0128.472.085.361.2547.0931.4734.5933.931.762
140299.7833.731.066.951.5759.3837.7119.5142.7787.632
1410.0129.81.537.061.5859.4836.3118.6145.0860.712
1426.7926.332.365.911.3751.7534.6934.2931.0168.942
14346.3434.771.426.721.3751.8732.7324.3342.9329.892
1440.0126.52.335.391.3852.0532.6529.3737.9817.342
1450.2728.472.155.361.2547.0931.4734.5933.9350.562
14610.4232.332.295.61.4153.1133.2135.4731.3212.612
1470.2728.632.395.21.4454.4934.7328.8936.3779.42
1480.3629.81.537.061.5859.4836.3118.6145.0831.352
1496.6729.5715.81.4655.2631.3530.4338.2145.212
150299.7827.930.895.831.4755.4733.4730.6935.8485.912
1519.5928.032.385.541.349.0733.9330.1535.9233.712
1522.5924.832.25.971.3751.7934.3534.2531.490.722
1536.4229.81.537.061.5859.4836.3118.6145.0869.612
15425.430.271.245.721.4554.8733.4327.2139.3668.612
1553.3828.171.336.381.4855.834.9924.9540.0549.222
15612.4128.632.395.21.4454.4934.7328.8936.3728.422
15789.5230.21.285.851.452.6936.1927.9535.8626.262
158152.9927.92.345.51.3450.6935.8129.2334.9573.492
15915.2226.332.795.911.3751.7534.6934.2931.0194.252
160202.3228.171.336.381.4855.834.9924.9540.0525.352
161115.9430.271.245.721.4554.8733.4327.2139.3664.782
1623.9128.032.255.541.349.0733.9330.1535.9266.792
16361.628.171.336.381.4855.834.9924.9540.0540.052
1640.2428.472.085.361.2547.0931.4734.5933.9328.692
165029.5715.81.4655.2631.3530.4338.2124.72
1660.0728.632.365.21.4454.4934.7328.8936.3780.32
1670.0324.832.25.971.3751.7934.3534.2531.495.272
16821.9428.472.085.361.2547.0931.4734.5933.9363.412
16951.2228.171.336.381.4855.834.9924.9540.0528.542
170550.2228.372.055.151.2346.3433.0732.0934.8440.792
1714.2133.731.066.951.5759.3837.7119.5142.7719.052
17264.1530.271.245.721.4554.8733.4327.2139.3670.532
17354.1734.170.995.891.3450.6234.3135.3330.3559.862
174385.1127.572.115.131.4855.8733.2932.1734.5471.112
1750.01281.955.131.3751.5333.0731.5135.4248.22
1760.0327.572.115.131.4855.8733.2932.1734.5428.772
1770.0129.81.135.71.3450.5831.9938.3529.6680.552
1786.4229.5715.81.4655.2631.3530.4338.2147.992
1790.0128.472.355.361.2547.0931.4734.5933.9386.162
1800.0227.930.895.831.4755.4733.4730.6935.8422.842
181248.3724.832.665.971.3751.7934.3534.2531.468.242
1820.1333.731.066.951.5759.3837.7119.5142.7747.472
1831.9628.632.395.21.4454.4934.7328.8936.3710.462
184295.8226.332.795.911.3751.7534.6934.2931.0189.382
18522.4528.032.275.541.349.0733.9330.1535.9257.022
1860.0126.52.335.391.3852.0532.6529.3737.9845.642
187131.0132.332.645.61.4153.1133.2135.4731.3262.692
18825.8226.332.385.911.3751.7534.6934.2931.0143.642
1890.1328.171.336.381.4855.834.9924.9540.0548.412
1900.0132.332.35.61.4153.1133.2135.4731.3257.922
1910.0327.572.075.131.4855.8733.2932.1734.5481.942
1920.1430.21.285.851.452.6936.1927.9535.8647.92
1930.7226.272.415.231.3249.9232.4131.0936.4924.382
1940.0128.032.275.541.349.0733.9330.1535.9270.922
1950.0132.332.295.61.4153.1133.2135.4731.3280.562
1960.3827.92.185.51.3450.6935.8129.2334.9566.792
1970.0126.332.365.911.3751.7534.6934.2931.0163.382
19857.2329.81.135.71.3450.5831.9938.3529.6682.152
1990.5326.52.335.391.3852.0532.6529.3737.9825.722
2006.7429.61.466.681.4153.1233.1920.2946.5117.212
2011.3626.332.795.911.3751.7534.6934.2931.0141.612
2020.0128.372.055.151.2346.3433.0732.0934.8483.992
2030.0330.271.245.721.4554.8733.4327.2139.3681.042
2040.4232.332.645.61.4153.1133.2135.4731.3252.592
2056.8226.332.795.911.3751.7534.6934.2931.0195.32
206180.8228.032.385.541.349.0733.9330.1535.9235.112
207131.0126.332.795.911.3751.7534.6934.2931.0158.842
2080.0128.372.055.151.2346.3433.0732.0934.8417.442
2090.7928.171.26.591.3751.7534.9327.5737.519.322
210766.6326.52.35.391.3852.0532.6529.3737.9874.622
21134.7424.832.665.971.3751.7934.3534.2531.431.812
21217.5828.632.395.21.4454.4934.7328.8936.3711.522
2133.3228.371.955.151.2346.3433.0732.0934.8467.052
21414.81282.145.131.3751.5333.0731.5135.4233.432
215152.9928.632.395.21.4454.4934.7328.8936.3765.342
2160.0328.171.336.381.4855.834.9924.9540.0563.292
2170.134.170.995.891.3450.6234.3135.3330.3594.62
2186.1629.61.466.681.4153.1233.1920.2946.5121.812
2194.3529.61.466.681.4153.1233.1920.2946.5176.42
2200.0128.171.26.591.3751.7534.9327.5737.514.912
221230.3926.52.35.391.3852.0532.6529.3737.9881.912
2220.0124.832.665.971.3751.7934.3534.2531.459.492
2230.0324.832.25.971.3751.7934.3534.2531.484.782
2240.0326.52.075.391.3852.0532.6529.3737.9896.622
2256.8232.332.645.61.4153.1133.2135.4731.3297.642
22661.634.771.426.721.3751.8732.7324.3342.9336.872
2276.1634.170.995.891.3450.6234.3135.3330.3568.712
2289.830.21.285.851.452.6936.1927.9535.8639.132
2291.7928.032.255.541.349.0733.9330.1535.9244.542
23010.4228.632.325.21.4454.4934.7328.8936.371.052
23115.8728.372.075.151.2346.3433.0732.0934.8465.082
2320.42281.955.131.3751.5333.0731.5135.4248.182
23325.429.81.537.061.5859.4836.3118.6145.0866.822
23421.94282.145.131.3751.5333.0731.5135.4264.132
235682.4626.332.385.911.3751.7534.6934.2931.0161.972
2360.38282.175.131.3751.5333.0731.5135.4239.532
Table A2. SMFO and other comparison algorithms: 10-fold cross-validation results.
Algorithm  Index  Max  Min  Mean  Std  #1  #2  #3  #4  #5  #6  #7  #8  #9  #10
SMFOaccs0.9583330.7916670.8943840.0564610.9565220.9565220.9583330.8750.9166670.8695650.9166670.8695650.8333330.791667
The detailed per-metric results for each model are listed below; for every model and metric, the columns give the maximum, minimum, average, and sample standard deviation over the ten individual results, followed by the ten results themselves.

| Model | Metric | Max | Min | Avg | Std | #1 | #2 | #3 | #4 | #5 | #6 | #7 | #8 | #9 | #10 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SMFO | Sensitivity | 1 | 0.75 | 0.874073 | 0.099287 | 1 | 0.916667 | 0.941176 | 0.8 | 0.909091 | 0.909091 | 1 | 0.75 | 0.764706 | 0.75 |
| SMFO | Specificity | 1 | 0.833333 | 0.923187 | 0.06527 | 0.923077 | 1 | 1 | 0.928571 | 0.923077 | 0.833333 | 0.857143 | 0.933333 | 1 | 0.833333 |
| SMFO | MCC | 0.916667 | 0.585369 | 0.789217 | 0.111741 | 0.916057 | 0.916667 | 0.907485 | 0.741941 | 0.832168 | 0.742424 | 0.845154 | 0.707317 | 0.697589 | 0.585369 |
| MFO | ACC | 0.958333 | 0.695652 | 0.850906 | 0.069384 | 0.695652 | 0.875 | 0.869565 | 0.826087 | 0.916667 | 0.826087 | 0.875 | 0.833333 | 0.958333 | 0.833333 |
| MFO | Sensitivity | 1 | 0.545455 | 0.805058 | 0.129786 | 0.545455 | 0.916667 | 0.8 | 0.833333 | 0.916667 | 0.75 | 0.75 | 0.846154 | 1 | 0.692308 |
| MFO | Specificity | 1 | 0.818182 | 0.900335 | 0.070731 | 0.833333 | 0.833333 | 0.923077 | 0.818182 | 0.916667 | 1 | 0.9375 | 0.818182 | 0.923077 | 1 |
| MFO | MCC | 0.919866 | 0.397276 | 0.706981 | 0.135508 | 0.397276 | 0.752618 | 0.734465 | 0.651515 | 0.833333 | 0.690849 | 0.713024 | 0.664336 | 0.919866 | 0.712525 |
| BA | ACC | 1 | 0.375 | 0.7875 | 0.213497 | 0.791667 | 0.833333 | 1 | 0.913043 | 0.958333 | 0.956522 | 0.608696 | 0.916667 | 0.375 | 0.521739 |
| BA | Sensitivity | 1 | 0.375 | 0.775332 | 0.230311 | 0.769231 | 0.8 | 1 | 0.8 | 1 | 0.909091 | 0.7 | 1 | 0.4 | 0.375 |
| BA | Specificity | 1 | 0.333333 | 0.794077 | 0.228935 | 0.818182 | 0.888889 | 1 | 1 | 0.928571 | 1 | 0.538462 | 0.833333 | 0.333333 | 0.6 |
| BA | MCC | 1 | −0.2582 | 0.572323 | 0.438456 | 0.585369 | 0.669342 | 1 | 0.832666 | 0.91878 | 0.916057 | 0.238462 | 0.845154 | −0.2582 | −0.0244 |
| GSA | ACC | 0.958333 | 0.695652 | 0.855435 | 0.081639 | 0.826087 | 0.791667 | 0.958333 | 0.869565 | 0.958333 | 0.791667 | 0.695652 | 0.913043 | 0.875 | 0.875 |
| GSA | Sensitivity | 1 | 0.6 | 0.842946 | 0.118463 | 0.818182 | 0.866667 | 1 | 0.818182 | 0.888889 | 0.6 | 0.692308 | 0.916667 | 0.9 | 0.928571 |
| GSA | Specificity | 1 | 0.666667 | 0.852056 | 0.10488 | 0.833333 | 0.666667 | 0.909091 | 0.916667 | 1 | 0.928571 | 0.7 | 0.909091 | 0.857143 | 0.8 |
| GSA | MCC | 0.91878 | 0.389324 | 0.705087 | 0.16755 | 0.651515 | 0.547723 | 0.91878 | 0.74048 | 0.912871 | 0.573316 | 0.389324 | 0.825758 | 0.749159 | 0.741941 |
| MVO | ACC | 0.958333 | 0.695652 | 0.842572 | 0.083615 | 0.826087 | 0.75 | 0.958333 | 0.869565 | 0.958333 | 0.791667 | 0.695652 | 0.826087 | 0.875 | 0.875 |
| MVO | Sensitivity | 1 | 0.6 | 0.83628 | 0.118855 | 0.818182 | 0.8 | 1 | 0.818182 | 0.888889 | 0.6 | 0.692308 | 0.916667 | 0.9 | 0.928571 |
| MVO | Specificity | 1 | 0.666667 | 0.833874 | 0.10955 | 0.833333 | 0.666667 | 0.909091 | 0.916667 | 1 | 0.928571 | 0.7 | 0.727273 | 0.857143 | 0.8 |
| MVO | MCC | 0.91878 | 0.389324 | 0.680314 | 0.171967 | 0.651515 | 0.466667 | 0.91878 | 0.74048 | 0.912871 | 0.573316 | 0.389324 | 0.659093 | 0.749159 | 0.741941 |
| WOA | ACC | 1 | 0.75 | 0.860145 | 0.071755 | 0.75 | 1 | 0.913043 | 0.833333 | 0.875 | 0.826087 | 0.875 | 0.833333 | 0.782609 | 0.913043 |
| WOA | Sensitivity | 1 | 0.625 | 0.823689 | 0.128234 | 0.625 | 1 | 0.9 | 0.923077 | 1 | 0.769231 | 0.8 | 0.727273 | 0.692308 | 0.8 |
| WOA | Specificity | 1 | 0.727273 | 0.893593 | 0.100358 | 0.8125 | 1 | 0.923077 | 0.727273 | 0.75 | 0.9 | 1 | 0.923077 | 0.9 | 1 |
| WOA | MCC | 1 | 0.4375 | 0.723757 | 0.153443 | 0.4375 | 1 | 0.823077 | 0.669342 | 0.774597 | 0.664141 | 0.774597 | 0.669342 | 0.592308 | 0.832666 |
| GOA | ACC | 0.958333 | 0.695652 | 0.855435 | 0.081639 | 0.826087 | 0.791667 | 0.958333 | 0.869565 | 0.958333 | 0.791667 | 0.695652 | 0.913043 | 0.875 | 0.875 |
| GOA | Sensitivity | 1 | 0.6 | 0.842946 | 0.118463 | 0.818182 | 0.866667 | 1 | 0.818182 | 0.888889 | 0.6 | 0.692308 | 0.916667 | 0.9 | 0.928571 |
| GOA | Specificity | 1 | 0.666667 | 0.852056 | 0.10488 | 0.833333 | 0.666667 | 0.909091 | 0.916667 | 1 | 0.928571 | 0.7 | 0.909091 | 0.857143 | 0.8 |
| GOA | MCC | 0.91878 | 0.389324 | 0.705087 | 0.16755 | 0.651515 | 0.547723 | 0.91878 | 0.74048 | 0.912871 | 0.573316 | 0.389324 | 0.825758 | 0.749159 | 0.741941 |
| CLOFOA | ACC | 0.958333 | 0.666667 | 0.830797 | 0.080856 | 0.782609 | 0.826087 | 0.666667 | 0.875 | 0.913043 | 0.791667 | 0.869565 | 0.791667 | 0.958333 | 0.833333 |
| CLOFOA | Sensitivity | 1 | 0.714286 | 0.844316 | 0.099605 | 0.769231 | 0.75 | 0.8 | 0.846154 | 1 | 0.785714 | 0.888889 | 0.714286 | 1 | 0.888889 |
| CLOFOA | Specificity | 1 | 0.571429 | 0.8421 | 0.114379 | 0.8 | 1 | 0.571429 | 0.909091 | 0.866667 | 0.8 | 0.857143 | 0.9 | 0.916667 | 0.8 |
| CLOFOA | MCC | 0.919866 | 0.371429 | 0.672348 | 0.153658 | 0.564902 | 0.690849 | 0.371429 | 0.752618 | 0.832666 | 0.579538 | 0.734465 | 0.607808 | 0.919866 | 0.669342 |
| HGWO | ACC | 0.958333 | 0.695652 | 0.855254 | 0.074398 | 0.869565 | 0.833333 | 0.916667 | 0.869565 | 0.958333 | 0.791667 | 0.695652 | 0.826087 | 0.916667 | 0.875 |
| HGWO | Sensitivity | 1 | 0.6 | 0.83283 | 0.139342 | 0.727273 | 0.933333 | 1 | 0.818182 | 0.888889 | 0.6 | 0.615385 | 0.916667 | 0.9 | 0.928571 |
| HGWO | Specificity | 1 | 0.666667 | 0.858593 | 0.113401 | 1 | 0.666667 | 0.818182 | 0.916667 | 1 | 0.928571 | 0.8 | 0.727273 | 0.928571 | 0.8 |
| HGWO | MCC | 0.912871 | 0.415385 | 0.711557 | 0.145478 | 0.76277 | 0.639064 | 0.842075 | 0.74048 | 0.912871 | 0.573316 | 0.415385 | 0.659093 | 0.828571 | 0.741941 |
| IGWO | ACC | 0.958333 | 0.695652 | 0.842572 | 0.083615 | 0.826087 | 0.75 | 0.958333 | 0.869565 | 0.958333 | 0.791667 | 0.695652 | 0.826087 | 0.875 | 0.875 |
| IGWO | Sensitivity | 1 | 0.6 | 0.819497 | 0.134967 | 0.727273 | 0.8 | 1 | 0.818182 | 0.888889 | 0.6 | 0.615385 | 0.916667 | 0.9 | 0.928571 |
| IGWO | Specificity | 1 | 0.666667 | 0.852208 | 0.102594 | 0.916667 | 0.666667 | 0.909091 | 0.916667 | 1 | 0.928571 | 0.8 | 0.727273 | 0.857143 | 0.8 |
| IGWO | MCC | 0.91878 | 0.415385 | 0.683678 | 0.167058 | 0.659093 | 0.466667 | 0.91878 | 0.74048 | 0.912871 | 0.573316 | 0.415385 | 0.659093 | 0.749159 | 0.741941 |
| OBLGWO | ACC | 0.916667 | 0.782609 | 0.847101 | 0.046862 | 0.916667 | 0.869565 | 0.782609 | 0.782609 | 0.791667 | 0.875 | 0.875 | 0.833333 | 0.869565 | 0.875 |
| OBLGWO | Sensitivity | 0.916667 | 0.666667 | 0.807326 | 0.078075 | 0.833333 | 0.875 | 0.666667 | 0.916667 | 0.764706 | 0.785714 | 0.818182 | 0.785714 | 0.727273 | 0.9 |
| OBLGWO | Specificity | 1 | 0.636364 | 0.889754 | 0.108325 | 1 | 0.866667 | 0.857143 | 0.636364 | 0.857143 | 1 | 0.923077 | 0.9 | 1 | 0.857143 |
| OBLGWO | MCC | 0.845154 | 0.536745 | 0.697367 | 0.102198 | 0.845154 | 0.723793 | 0.536745 | 0.580023 | 0.573316 | 0.777429 | 0.749159 | 0.676123 | 0.76277 | 0.749159 |
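
For reference, the four metrics tabulated above are the standard confusion-matrix measures. The sketch below is illustrative only, not the authors' code: it computes the metrics for one set of predictions and reproduces the Max/Min/Avg/Std summary columns (where Std matches the sample standard deviation). The example values are the ten MFO accuracy results from the table, rounded to four decimals.

```python
import math
import statistics

def confusion_counts(y_true, y_pred):
    # Tally the four confusion-matrix cells for binary labels (1 = erosion).
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

def fold_metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    acc = (tp + tn) / len(y_true)
    sens = tp / (tp + fn) if tp + fn else 0.0   # sensitivity (true positive rate)
    spec = tn / (tn + fp) if tn + fp else 0.0   # specificity (true negative rate)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return acc, sens, spec, mcc

def summarize(values):
    # The Max / Min / Avg / Std columns (Std is the sample standard deviation).
    return max(values), min(values), statistics.mean(values), statistics.stdev(values)

# Example: summarizing the ten MFO accuracy values from the table above.
mfo_acc = [0.6957, 0.8750, 0.8696, 0.8261, 0.9167,
           0.8261, 0.8750, 0.8333, 0.9583, 0.8333]
print(summarize(mfo_acc))
```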

Figure 1. The framework of this paper.
Figure 2. The structure diagram of the KELM model.
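
For reference alongside Figure 2: in the standard KELM formulation (stated here from the general KELM literature, not transcribed from this paper's equations), the decision function in which the penalty parameter $C$ and the kernel parameter $\gamma$ appear is

$$
f(\mathbf{x}) = \left[ K(\mathbf{x},\mathbf{x}_1), \ldots, K(\mathbf{x},\mathbf{x}_N) \right] \left( \frac{\mathbf{I}}{C} + \boldsymbol{\Omega} \right)^{-1} \mathbf{T}, \qquad \Omega_{ij} = K(\mathbf{x}_i,\mathbf{x}_j),
$$

where $\mathbf{T}$ stacks the training targets and $K$ is commonly the Gaussian kernel $K(\mathbf{x}_i,\mathbf{x}_j) = \exp(-\gamma \lVert \mathbf{x}_i - \mathbf{x}_j \rVert^2)$. These $C$ and $\gamma$ are exactly the two quantities SMFO tunes.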
Figure 3. The flight path of moths.
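
The spiral flight path sketched in Figure 3 corresponds, in the original MFO, to the logarithmic-spiral update (the standard formulation, restated here for convenience):

$$
S(M_i, F_j) = D_i \cdot e^{bt} \cdot \cos(2\pi t) + F_j, \qquad D_i = \lvert F_j - M_i \rvert,
$$

where $M_i$ is the $i$-th moth, $F_j$ the $j$-th flame, $b$ a constant defining the spiral shape, and $t$ a random number drawn from $[r, 1]$, with $r$ decreasing linearly from $-1$ to $-2$ over the iterations.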
Figure 4. The basic principle of the sine–cosine strategy.
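
The sine–cosine strategy of Figure 4, used to strengthen the optimizer's exploration, follows the standard sine cosine algorithm update (restated from the general SCA literature):

$$
X_i^{t+1} =
\begin{cases}
X_i^{t} + r_1 \sin(r_2)\,\lvert r_3 P_i^{t} - X_i^{t} \rvert, & r_4 < 0.5,\\
X_i^{t} + r_1 \cos(r_2)\,\lvert r_3 P_i^{t} - X_i^{t} \rvert, & r_4 \ge 0.5,
\end{cases}
$$

where $P_i^{t}$ is the best position found so far, $r_1$ decays linearly to shift the search from exploration to exploitation, and $r_2 \in [0, 2\pi]$, $r_3 \in [0, 2]$, $r_4 \in [0, 1]$ are random numbers.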
Figure 5. Flowchart of the soil erosion prediction model based on SMFO-KELM.
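
To make the flowchart of Figure 5 concrete, the sketch below outlines the tuning loop under stated assumptions: a minimal NumPy KELM with an RBF kernel whose penalty parameter C and kernel width gamma are selected by maximizing validation accuracy. Plain random search stands in for SMFO purely to show the interface, and the data is synthetic, not the Son La dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, gamma):
    # Pairwise squared distances, then the Gaussian (RBF) kernel matrix.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_fit(X, y, C, gamma):
    # Standard KELM solution: beta = (I/C + Omega)^-1 * T.
    omega = rbf_kernel(X, X, gamma)
    return np.linalg.solve(np.eye(len(X)) / C + omega, y.astype(float))

def kelm_predict(X_train, beta, X_new, gamma):
    return rbf_kernel(X_new, X_train, gamma) @ beta

def fitness(params, X_tr, y_tr, X_va, y_va):
    # Fitness of one candidate (C, gamma): validation accuracy.
    C, gamma = params
    beta = kelm_fit(X_tr, y_tr, C, gamma)
    pred = (kelm_predict(X_tr, beta, X_va, gamma) > 0.5).astype(int)
    return (pred == y_va).mean()

# Synthetic stand-in data: 10 features as in Table 1, binary erosion label.
X = rng.normal(size=(60, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_tr, y_tr, X_va, y_va = X[:40], y[:40], X[40:], y[40:]

# Random search stands in for the SMFO population loop here.
best = max(((rng.uniform(0.1, 100), rng.uniform(0.01, 10)) for _ in range(50)),
           key=lambda p: fitness(p, X_tr, y_tr, X_va, y_va))
print("selected (C, gamma):", best)
```

In the actual model, each SMFO moth would encode a (C, gamma) pair and the fitness would typically be evaluated by cross-validation rather than a single split.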
Figure 6. Boxplots of ACC, MCC, sensitivity, and specificity for SMFO-KELM and its classical primitive-algorithm competitors on the Son La dataset.
Figure 7. Boxplots of ACC, MCC, sensitivity, and specificity for SMFO-KELM and its advanced-algorithm competitors on the Son La dataset.
Figure 8. Bar chart of ACC, MCC, sensitivity, and specificity for SMFO-KELM and the other competitors on the dataset.
Table 1. Influencing factors of soil erosion.

| Factors | Unit | Variables | Min | Max | Mean | Std. |
|---|---|---|---|---|---|---|
| EI30 | % | X1 | 0.00 | 3008.93 | 134.77 | 385.10 |
| Slope degree | % | X2 | 24.83 | 34.77 | 28.85 | 2.39 |
| OC topsoil | % | X3 | 0.89 | 2.79 | 1.86 | 0.56 |
| pH topsoil | % | X4 | 5.13 | 7.06 | 5.75 | 0.56 |
| Bulk density | g/cm³ | X5 | 1.23 | 1.58 | 1.39 | 0.08 |
| Topsoil porosity | % | X6 | 46.34 | 59.48 | 52.36 | 3.15 |
| Topsoil texture (silt fraction) | % | X7 | 31.35 | 37.71 | 33.82 | 1.48 |
| Topsoil texture (clay fraction) | % | X8 | 18.61 | 38.35 | 30.03 | 4.67 |
| Topsoil texture (sand fraction) | % | X9 | 29.66 | 46.51 | 36.15 | 4.12 |
| Soil cover rate | % | X10 | 1.05 | 99.71 | 53.51 | 25.36 |
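
The Min/Max/Mean/Std columns of Table 1 are plain descriptive statistics over the samples' feature columns X1–X10. A minimal sketch of how they are obtained follows, with hypothetical toy rows, not the real Son La records:

```python
import statistics

# Hypothetical toy rows (X1..X10 per sample); not the real Son La records.
samples = [
    [120.5, 25.1, 1.2, 5.4, 1.30, 50.2, 33.0, 29.5, 35.1, 40.0],
    [890.2, 30.4, 2.1, 6.2, 1.45, 54.8, 34.6, 31.2, 37.9, 75.3],
    [  0.0, 27.8, 1.7, 5.6, 1.38, 48.9, 32.4, 28.7, 34.0, 12.8],
]

for j in range(len(samples[0])):
    col = [row[j] for row in samples]
    print(f"X{j + 1}: min={min(col):.2f}  max={max(col):.2f}  "
          f"mean={statistics.mean(col):.2f}  std={statistics.stdev(col):.2f}")
```

Because the raw ranges differ by several orders of magnitude (compare X1 with X5), the features would usually be rescaled to a common range before KELM training.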