Article

Design of Nonlinear Marine Predator Heuristics for Hammerstein Autoregressive Exogenous System Identification with Key-Term Separation

by Khizer Mehmood 1, Naveed Ishtiaq Chaudhary 2,*, Khalid Mehmood Cheema 3, Zeshan Aslam Khan 1, Muhammad Asif Zahoor Raja 2, Ahmad H. Milyani 4 and Abdulellah Alsulami 4

1 Department of Electrical & Computer Engineering, International Islamic University, Islamabad 44000, Pakistan
2 Future Technology Research Center, National Yunlin University of Science and Technology, 123 University Road, Section 3, Yunlin, Douliou 64002, Taiwan
3 Department of Electronic Engineering, Fatima Jinnah Women University, Rawalpindi 46000, Pakistan
4 Department of Electrical & Computer Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia
* Author to whom correspondence should be addressed.
Mathematics 2023, 11(11), 2512; https://doi.org/10.3390/math11112512
Submission received: 14 April 2023 / Revised: 13 May 2023 / Accepted: 21 May 2023 / Published: 30 May 2023
(This article belongs to the Special Issue Optimisation Algorithms and Their Applications)

Abstract: Swarm-based metaheuristics have shown significant progress in solving different complex optimization problems, including the parameter identification of linear, as well as nonlinear, systems. Nonlinear systems are inherently stiff and difficult to optimize and, thus, require special attention to effectively estimate their parameters. This study investigates the parameter identification of an input nonlinear autoregressive exogenous (IN-ARX) model through the swarm intelligence of the nonlinear marine predators' algorithm (NMPA). A detailed comparative analysis of the NMPA with other recently introduced metaheuristics, such as the Aquila optimizer, prairie dog optimization, the reptile search algorithm, the sine cosine algorithm, and the whale optimization algorithm, established the superiority of the proposed scheme in terms of accuracy, robustness, and convergence for different noise and generation variations. Statistics generated through multiple independent executions, presented as box-and-whisker plots together with the Wilcoxon rank-sum test, further confirm the reliability and stability of the NMPA for the parameter estimation of IN-ARX systems.

1. Introduction

Parameter estimation is widely used in various application areas, such as signal processing [1], machine learning [2], nonlinear systems [3], power [4], control [5], energy [6], and system identification [7]. The input nonlinear autoregressive exogenous (IN-ARX) model has been applied in different disciplines, such as multi-input multi-output (MIMO) systems [8], muscle modeling [9], data filtering [10], wireless sensor networks [11], control systems [12], and electromyography [13].
Metaheuristics are applied in different engineering domains, since they treat optimization problems as a black box. Metaheuristics can be classified into four categories: evolutionary methods, human-based methods, physics-based methods, and swarm-based methods.
Evolutionary methods are inspired by biological processes, including reproduction, mutation, and selection in optimization. In differential evolution (DE) [14], the optimization is done by updating the solution based on mutation, crossover, and selection processes. In the genetic algorithm (GA) [15], the optimization is performed by selecting the fittest individuals for reproduction and producing offspring for the next generation. In the evolutionary mating algorithm [16], the optimization is performed by using the Hardy-Weinberg principle, crossover index, and environmental factors for solution updates in each generation. In the evolutionary algorithm [17], the optimization is done by using a parent-centric recombination operator, along with a fast population alteration model for solution updates.
Human-based methods were inspired by human problem-solving abilities. In teaching-learning optimization [18], the fitness function is optimized by simulating a classroom environment. The optimization is divided into a teacher phase and a learner phase, during which students learn from the teacher and then interact among themselves for knowledge sharing. In the heap-based optimizer [19], the optimization is performed by using a corporate rank hierarchy, which depends on the interaction between subordinates and their boss, the interaction between colleagues, and the self-contribution of employees in each generation. In the human urbanization algorithm [20], the optimization is done by using multiple strategies, such as a combined search in open and limited scopes, along with the population management agent's concentration during the search process. Forensic-based optimization [21] performs the optimization by using multiple processes used by police officers, such as investigation, location, and pursuit, with fast convergence.
Physics-based methods were inspired by the laws of physics. In the gravitational search algorithm [22], optimization is done by interacting masses as search agents based on Newtonian gravity and the laws of motion. In big bang–big crunch [23], the optimization is performed in two phases, namely big bang and big crunch. In the big bang phase, a random population is created around the center of mass in the search space, whereas, in the big crunch phase, the random particles are brought into order, forming a single mass, so that both exploration and exploitation of the search space are performed. In the circle search algorithm [24], the optimization is performed by using the geometric features of circles, such as the diameter, perimeter, and tangent lines. In the galaxy-based search algorithm [25], the optimization is done by using two modules, namely the spiral chaotic move and local search. The spiral chaotic move performs the exploration, whereas the local search performs the exploitation.
Swarm-based methods were inspired by the behavior of swarms in nature. In particle swarm optimization (PSO) [26], the optimization is performed by moving a swarm of particles around the search space to find the local best and global best solutions. In the marine predator algorithm [27], the optimization is based on the Levy and Brownian movements of predators and prey. In monkey search [28], the optimization is done by updating the solution based on the food-searching process that monkeys use while climbing trees. In the ant lion optimizer [29], the optimization is performed by using hunting steps such as the random walk of ants, building traps, the entrapment of ants in traps, catching prey, and rebuilding traps.
Various techniques have been used in the literature for the identification of IN-ARX models. In [30], particle swarm optimization was applied for the identification of fractional nonlinear ARX models over several variations of generations and noise levels. In [31], a least squares support vector machine-based method was applied for the prediction of Hammerstein ARX models through a subspace state-space representation. In [32], a fractional hierarchical gradient descent method was applied for the identification of the nonlinear ARX model under several fractional-order and noise conditions. In [33], genetic algorithms were applied for the identification of fractional Hammerstein ARX models in both noiseless and noisy conditions. In [34], fractional-order particle swarm optimization with key term separation was proposed for input nonlinear control autoregressive systems. In [35], genetic algorithms and the key term separation principle were used for nonlinear Hammerstein ARX model identification. In [36], the marine predator algorithm was used for the identification of the nonlinear Hammerstein ARX model with key term separation.
Even though various metaheuristics have been proposed and applied to different optimization problems, there is no single method that can solve all problems [37]. This leads researchers to develop new, or improve existing, metaheuristics for different real-world optimization problems [38]. Among the different approaches, swarm-based metaheuristics have been applied in various parameter estimation applications, including solar cells [39], electric vehicles [40], image processing [41], machine learning [42], multiple input multiple output systems [43], ARX estimation [44], economic dispatch [45], temperature processing plants [46], and nonlinear system identification [36].
The marine predator algorithm (MPA) was proposed in 2020 [27] and received an overwhelming response from the research community, with applications to a variety of optimization problems arising in engineering, science, and technology, such as the parameter estimation of photovoltaic modules [47], COVID-19 forecasting [48], heartbeat classification [49], optimal power allocation [50], human activity recognition [51], and the control of hybrid power systems [52]. The nonlinear marine predator algorithm (NMPA) is an improved variant of the standard MPA proposed in 2022 [53] that effectively incorporates exploration and exploitation dynamics through an improved transition from a global search to a local search. The NMPA uses a set of nonlinear functions to change the search patterns of the MPA. The superiority of the NMPA over recent state-of-the-art counterparts, including the conventional MPA, is well established in [53] for a variety of benchmark functions, as well as real applications such as fair power allocation in multiple access systems and Beyond 5G networks. Accordingly, the research community has an opportunity to exploit the strengths of the NMPA, with outcomes expected to surpass the MPA and its counterparts in stiff nonlinear optimization problems. These are the motivating factors for the authors to exploit the NMPA for the effective parameter estimation of nonlinear systems.
In the current study, the NMPA is investigated for the parameter estimation of IN-ARX models. The investigation is conducted on various noise levels, populations, and generations. Statistical comparisons of the NMPA with several methods were also conducted to establish robustness and reliability.
In Section 2, the mathematical model of IN-ARX is given. Section 3 provides the details of the NMPA scheme. The simulation results, along with the comparison of the NMPA with recent counterparts, are presented in Section 4, while the concluding remarks are given in Section 5.

2. Input Nonlinear Autoregressive Exogenous Model Definition

The input nonlinear autoregressive exogenous (IN-ARX) model is a block-oriented structure consisting of two blocks: a nonlinear block followed by a linear subsystem. In the IN-ARX model, the nonlinear block is typically described by a polynomial-type nonlinearity, and the linear subsystem is characterized by a simple ARX structure. The IN-ARX model is used to model different engineering problems arising in communication, signal processing, control systems, and biomedical sciences [8,9,10,11,12,13]. Figure 1 shows the block diagram representation of the IN-ARX model, while its mathematical representation is given in Equation (1):
$$ l(t) = \frac{D(z)}{F(z)}\,\bar{m}(t) + \frac{1}{F(z)}\,n(t), \tag{1} $$
Equation (1) can be simplified as given in Equation (2):
$$ l(t) = \left(1 - F(z)\right) l(t) + D(z)\,\bar{m}(t) + n(t), \tag{2} $$
where $n(t)$ represents the noise, and $m(t)$ and $l(t)$ denote the input and output, respectively. $D(z)$, $F(z)$, and $\bar{m}(t)$ are defined in Equations (3)–(5), respectively:
$$ D(z) = d_0 + d_1 z^{-1} + d_2 z^{-2} + \cdots + d_{i_d} z^{-i_d}, \tag{3} $$
$$ F(z) = 1 + f_1 z^{-1} + f_2 z^{-2} + \cdots + f_{i_f} z^{-i_f}, \tag{4} $$
$$ \bar{m}(t) = b_1 h_1[m(t)] + b_2 h_2[m(t)] + \cdots + b_p h_p[m(t)]. \tag{5} $$
Simplifying Equation (2) by incorporating Equations (3)–(5) and applying the key term separation technique, with $\bar{m}(t)$ chosen as the key term and $d_0 = 1$, gives Equation (6):
$$
\begin{aligned}
l(t) &= -\sum_{k=1}^{i_f} f_k\, l(t-k) + \sum_{k=0}^{i_d} d_k\, \bar{m}(t-k) + n(t) \\
&= -\sum_{k=1}^{i_f} f_k\, l(t-k) + d_0\, \bar{m}(t) + \sum_{k=1}^{i_d} d_k\, \bar{m}(t-k) + n(t) \\
&= -\sum_{k=1}^{i_f} f_k\, l(t-k) + \sum_{k=1}^{i_d} d_k\, \bar{m}(t-k) + \sum_{k=1}^{p} b_k\, h_k[m(t)] + n(t).
\end{aligned} \tag{6}
$$
The information/parameter vectors are expressed in Equations (7)–(9) and Equations (10)–(12), respectively
$$ \gamma_f(t) = [-l(t-1), -l(t-2), \ldots, -l(t-i_f)]^T \in \mathbb{R}^{i_f}, \tag{7} $$
$$ \gamma_d(t) = [\bar{m}(t-1), \bar{m}(t-2), \ldots, \bar{m}(t-i_d)]^T \in \mathbb{R}^{i_d}, \tag{8} $$
$$ h(t) = \big[h_1[m(t)], h_2[m(t)], \ldots, h_p[m(t)]\big]^T \in \mathbb{R}^{p}, \tag{9} $$
$$ f = [f_1, f_2, \ldots, f_{i_f}]^T \in \mathbb{R}^{i_f}, \tag{10} $$
$$ d = [d_1, d_2, \ldots, d_{i_d}]^T \in \mathbb{R}^{i_d}, \tag{11} $$
$$ b = [b_1, b_2, \ldots, b_p]^T \in \mathbb{R}^{p}. \tag{12} $$
Substituting Equations (7)–(12) into Equation (6) yields the key term separation identification model of the IN-ARX system, as given in Equation (13):
$$ l(t) = \gamma_f^T(t)\, f + \gamma_d^T(t)\, d + h^T(t)\, b + n(t). \tag{13} $$
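For concreteness, the following minimal Python sketch (not the authors' implementation) simulates the key-term-separated model of Equation (13) for a second-order example; the power basis $h_k[m(t)] = m(t)^k$ and the uniform input are illustrative assumptions.

```python
import numpy as np

# Minimal simulation sketch of the key-term-separated IN-ARX model,
# Eq. (6)/(13), assuming h_k[m(t)] = m(t)^k for the static nonlinearity.
rng = np.random.default_rng(0)

f = np.array([-0.8, 0.9])        # F(z) coefficients f_1, f_2
d = np.array([1.1, 0.68])        # D(z) coefficients d_1, d_2 (d_0 = 1)
b = np.array([0.9, 1.3, -0.5])   # nonlinearity weights b_1..b_p

K = 500
m = rng.uniform(-1.0, 1.0, K)                # input signal (assumed)
noise = 0.0015 * rng.standard_normal(K)      # n(t)

l = np.zeros(K)        # output l(t)
m_bar = np.zeros(K)    # key term m_bar(t)
for t in range(K):
    m_bar[t] = sum(b[k] * m[t] ** (k + 1) for k in range(len(b)))   # Eq. (5)
    ar = sum(-f[k] * l[t - k - 1] for k in range(len(f)) if t > k)  # -f_k l(t-k)
    x = sum(d[k] * m_bar[t - k - 1] for k in range(len(d)) if t > k)
    l[t] = ar + x + m_bar[t] + noise[t]      # Eq. (6) with d_0 = 1
```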

3. Proposed Methodology

In this section, the parameter estimation of the IN-ARX model using the nonlinear marine predator algorithm (NMPA) [53] is described.

3.1. Nonlinear Marine Predator Algorithm

The NMPA is inspired by the behavior of marine predators, such as sharks, sunfish, and tunas, which catch prey in the ocean using Brownian movements and Levy flights. Its mathematical steps are as follows.

3.1.1. Step 1: Initialization

In this step, the population is initialized over a uniformly distributed search space, as given in Equation (14):
$$ K_i = K_i^{\min} + \mathrm{rand} \times \left(K_i^{\max} - K_i^{\min}\right), \tag{14} $$
where $K_i^{\max}$ and $K_i^{\min}$ are the maximum and minimum boundary values, respectively, and $\mathrm{rand}$ is a uniformly distributed random number in $[0, 1]$.
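As a brief sketch of Equation (14) (with an assumed dimension and illustrative bounds):

```python
import numpy as np

# Sketch of Eq. (14): uniform initialization of Pn candidates in the box
# [K_min, K_max]^a; the bounds and dimension are illustrative assumptions.
rng = np.random.default_rng(1)
Pn, a = 45, 7                       # population size and parameter dimension
K_min, K_max = -2.0, 2.0            # assumed search-space bounds
K = K_min + rng.random((Pn, a)) * (K_max - K_min)   # Pn x a population
```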

3.1.2. Step 2: Detecting Top Predator

The population is arranged by constructing the prey matrix ($P_m$) and the elite matrix ($E_m$). These matrices hold the random positions generated in step 1 and replicas of the position vector with the best fitness, respectively, as given in Equations (15) and (16):
$$ P_m = \begin{pmatrix} K_{1,1} & \cdots & K_{1,a} \\ \vdots & \ddots & \vdots \\ K_{P_n,1} & \cdots & K_{P_n,a} \end{pmatrix}_{P_n \times a}, \tag{15} $$
$$ E_m = \begin{pmatrix} K_{1,1}^{I} & \cdots & K_{1,a}^{I} \\ \vdots & \ddots & \vdots \\ K_{P_n,1}^{I} & \cdots & K_{P_n,a}^{I} \end{pmatrix}_{P_n \times a}, \tag{16} $$
where K I represents the top predator vector.
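A minimal sketch of this construction, with `fitness_fn` an assumed user-supplied objective:

```python
import numpy as np

# Sketch of Eqs. (15)-(16): Pm stacks the current candidate positions,
# while Em replicates the best-so-far (top predator) vector in each row.
def build_matrices(K, fitness_fn):
    fit = np.array([fitness_fn(k) for k in K])   # evaluate Pn candidates
    top = K[np.argmin(fit)]                      # top predator (minimization)
    Pm = K.copy()                                # prey matrix, Pn x a
    Em = np.tile(top, (K.shape[0], 1))           # elite matrix, Pn x a
    return Pm, Em, fit
```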

3.1.3. Step 3: Brownian Movements and Levy Flight-Based Optimization

In this step, the optimization is done by considering different velocity ratios. It involves three phases, as given below.

Phase I

This phase performs exploration at a high-velocity ratio using Brownian movement during the first one-third of the generations (Gn). The updated matrices are given in Equations (17) and (18):
$$ S_B^{\,j} = R_B \otimes \left(E_m^{\,j} - \left(R_B \otimes P_m^{\,j}\right)\right), \quad j = 1, 2, \ldots, P_n, \tag{17} $$
$$ P_m^{\,i} = P_m^{\,i} + \left(E \cdot B \otimes S_B^{\,j}\right), \tag{18} $$
where $R_B$ is a vector of normally distributed random numbers representing Brownian motion, $\otimes$ denotes element-wise multiplication, $E$ is a constant ($E = 0.5$; see Table 1), and $B$ is a random vector in $[0, 1]$.

Phase II

This phase involves both exploration and exploitation at a unit velocity ratio between 1/3 and 2/3 of the maximum generations (MaxGn). The updated matrices are given in Equations (19)–(22):
$$ S_B^{\,j} = R_L \otimes \left(E_m^{\,j} - \left(R_L \otimes P_m^{\,j}\right)\right), \quad j = 1, 2, \ldots, \frac{P_n}{2}, \tag{19} $$
$$ P_m^{\,i} = \lambda\, P_m^{\,i} + \left(E \cdot B \otimes S_B^{\,j}\right), \tag{20} $$
where $R_L$ is a random vector based on the Levy distribution, and $\lambda = 2\exp\!\left(-\left(6\,Gn/MaxGn\right)^{2}\right)$ is a nonlinear parameter that balances the exploration and exploitation of the NMPA.
$$ S_B^{\,j} = R_B \otimes \left(R_B \otimes \left(E_m^{\,j} - P_m^{\,j}\right)\right), \quad j = \frac{P_n}{2}, \ldots, P_n, \tag{21} $$
$$ P_m^{\,i} = \lambda\, E_m^{\,i} + \left(E \cdot AF_{new} \otimes S_B^{\,j}\right), \tag{22} $$
where $AF_{new} = \left|\,2\left(1 - Gn/MaxGn\right)^{2}\right|$ is an adaptive factor that controls the predator movement step size over the interval $[0, 2]$.
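The two transition terms can be computed as below; `lam` stands in for the nonlinear parameter of Equation (20), whose original symbol is not reproduced here.

```python
import numpy as np

# Sketch of the Phase II transition terms: `lam` is the nonlinear parameter
# of Eq. (20), and AF_new is the adaptive factor of Eq. (22).
def lam(Gn, MaxGn):
    return 2.0 * np.exp(-(6.0 * Gn / MaxGn) ** 2)

def AF_new(Gn, MaxGn):
    return abs(2.0 * (1.0 - Gn / MaxGn) ** 2)

# Early generations favor exploration (both factors near 2); late
# generations shrink the step size toward 0.
print(lam(1, 660), AF_new(1, 660))      # ~2.0 and ~2.0
print(lam(660, 660), AF_new(660, 660))  # ~0.0 and 0.0
```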

Phase III

This phase involves exploitation at a low-velocity ratio for the remaining generations. The updated matrices are given in Equations (23) and (24).
$$ S_B^{\,j} = R_L \otimes \left(R_L \otimes \left(E_m^{\,j} - P_m^{\,j}\right)\right), \quad j = 1, 2, \ldots, P_n, \tag{23} $$
$$ P_m^{\,i} = E_m^{\,i} + \left(E \cdot AF_{new} \otimes S_B^{\,j}\right). \tag{24} $$

3.1.4. Step 4: Fish Aggregating Device (FAD) Effects

In this step, FAD effects are incorporated, as presented in Equations (25) and (26).
$$ P_m^{\,j} = P_m^{\,j} + AF_{new}\left[K^{\min} + B \otimes \left(K^{\max} - K^{\min}\right)\right] \otimes U, \quad \text{if } s \le FADs, \tag{25} $$
$$ P_m^{\,j} = P_m^{\,j} + \left[FADs\,(1 - s) + s\right]\left(P_m^{\,s_1} - P_m^{\,s_2}\right), \quad \text{if } s > FADs, \tag{26} $$
where $U$ is a binary vector, $s$ is a random number in $[0, 1]$, and $s_1$ and $s_2$ are random indices of the prey matrix $P_m$.
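A sketch of this perturbation, with FADs = 0.2 as in Table 1 and the remaining names assumed for illustration:

```python
import numpy as np

# Sketch of the FADs perturbation of Eqs. (25)-(26), applied row by row.
# `AF` is the adaptive factor of Eq. (22); FADs = 0.2 follows Table 1.
def fads_effect(Pm, K_min, K_max, AF, FADs=0.2, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    Pn, a = Pm.shape
    for j in range(Pn):
        s = rng.random()
        if s <= FADs:                                 # long jump within bounds, Eq. (25)
            U = (rng.random(a) < FADs).astype(float)  # binary vector U
            B = rng.random(a)
            Pm[j] += AF * (K_min + B * (K_max - K_min)) * U
        else:                                         # drift between two random prey, Eq. (26)
            s1, s2 = rng.integers(0, Pn, 2)
            Pm[j] += (FADs * (1 - s) + s) * (Pm[s1] - Pm[s2])
    return Pm
```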

3.1.5. Step 5: Marine Memory

In this step, based on the outcomes of steps 1–4, the updated solutions are compared with those of the previous generation, and the better of the two is retained (memory saving).
The flowchart of the NMPA is shown in Figure 2, and its pseudo-code is given in Algorithm 1.
Algorithm 1: Pseudo-code of the NMPA
Initialize the population (Pn) using Equation (14).
while the termination criterion is not met
  Calculate the fitness values and construct the matrices Pm and Em using Equations (15) and (16).
  if Gn < MaxGn/3
    Update using Equations (17) and (18).
  else if MaxGn/3 < Gn < 2·MaxGn/3
    Update the first half of the population using Equations (19) and (20).
    Update the second half using Equations (21) and (22).
  else if Gn > 2·MaxGn/3
    Update using Equations (23) and (24).
  end
  Apply the FAD effects using Equations (25) and (26).
  Perform memory saving and update the elite.
end
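The following self-contained Python sketch assembles Algorithm 1 end to end. It is not the authors' code: the Levy steps use the Mantegna approximation (beta = 1.5), the search bounds are illustrative, and E = 0.5 and FADs = 0.2 follow Table 1.

```python
import math
import numpy as np

def nmpa(fitness_fn, a, Pn=45, MaxGn=660, K_min=-2.0, K_max=2.0,
         E=0.5, FADs=0.2, seed=0):
    """Sketch of Algorithm 1; minimizes fitness_fn over [K_min, K_max]^a."""
    rng = np.random.default_rng(seed)

    def levy(shape, beta=1.5):
        # Mantegna approximation of Levy-distributed steps (assumed form).
        num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
        den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
        sigma = (num / den) ** (1 / beta)
        return rng.normal(0, sigma, shape) / np.abs(rng.normal(0, 1, shape)) ** (1 / beta)

    Pm = K_min + rng.random((Pn, a)) * (K_max - K_min)           # Eq. (14)
    fit = np.apply_along_axis(fitness_fn, 1, Pm)
    best = Pm[np.argmin(fit)].copy()
    best_fit = fit.min()

    for Gn in range(1, MaxGn + 1):
        Em = np.tile(best, (Pn, 1))                              # Eq. (16)
        lam = 2 * np.exp(-(6 * Gn / MaxGn) ** 2)                 # Eq. (20)
        AF = abs(2 * (1 - Gn / MaxGn) ** 2)                      # Eq. (22)
        RB = rng.normal(size=(Pn, a))                            # Brownian steps
        RL = levy((Pn, a))                                       # Levy steps
        B = rng.random((Pn, a))
        Pm_old, fit_old = Pm.copy(), fit.copy()
        if Gn < MaxGn / 3:                                       # Phase I: Eqs. (17)-(18)
            Pm = Pm + E * B * (RB * (Em - RB * Pm))
        elif Gn < 2 * MaxGn / 3:                                 # Phase II: Eqs. (19)-(22)
            h = Pn // 2
            Pm[:h] = lam * Pm[:h] + E * B[:h] * (RL[:h] * (Em[:h] - RL[:h] * Pm[:h]))
            Pm[h:] = lam * Em[h:] + E * AF * (RB[h:] * (RB[h:] * (Em[h:] - Pm[h:])))
        else:                                                    # Phase III: Eqs. (23)-(24)
            Pm = Em + E * AF * (RL * (RL * (Em - Pm)))
        for j in range(Pn):                                      # FADs: Eqs. (25)-(26)
            s = rng.random()
            if s <= FADs:
                U = (rng.random(a) < FADs).astype(float)
                Pm[j] += AF * (K_min + rng.random(a) * (K_max - K_min)) * U
            else:
                s1, s2 = rng.integers(0, Pn, 2)
                Pm[j] += (FADs * (1 - s) + s) * (Pm[s1] - Pm[s2])
        Pm = np.clip(Pm, K_min, K_max)
        fit = np.apply_along_axis(fitness_fn, 1, Pm)
        worse = fit > fit_old                                    # marine memory
        Pm[worse], fit[worse] = Pm_old[worse], fit_old[worse]
        if fit.min() < best_fit:
            best_fit = fit.min()
            best = Pm[np.argmin(fit)].copy()
    return best, best_fit
```

For example, `nmpa(lambda x: float(np.sum(x**2)), a=7)` drives the sphere function toward zero under these assumed settings.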

4. Performance Analysis

The simulations are performed on an Intel Core i5 system with 16 GB of memory. The analysis is carried out for various generations (Gn), populations (Pn), and noise levels n(t) and is evaluated through the fitness calculated by Equation (27):
$$ \mathrm{Fitness} = \frac{1}{K}\sum_{\tau=1}^{K}\left[\,l(t-\tau) - \tilde{l}(t-\tau)\,\right]^2, \tag{27} $$
where $l$ and $\tilde{l}$ are the desired and approximated responses, respectively, and the approximated IN-ARX output is calculated by Equation (28):
$$ \tilde{l}(t) = \gamma_f^T(t)\,\tilde{f} + \gamma_d^T(t)\,\tilde{d} + h^T(t)\,\tilde{b}. \tag{28} $$
The estimated parameter vectors using the NMPA are given in Equations (29)–(31)
$$ \tilde{f} = [\tilde{f}_1, \tilde{f}_2, \ldots, \tilde{f}_{i_f}]^T \in \mathbb{R}^{i_f}, \tag{29} $$
$$ \tilde{d} = [\tilde{d}_1, \tilde{d}_2, \ldots, \tilde{d}_{i_d}]^T \in \mathbb{R}^{i_d}, \tag{30} $$
$$ \tilde{b} = [\tilde{b}_1, \tilde{b}_2, \ldots, \tilde{b}_p]^T \in \mathbb{R}^{p}. \tag{31} $$
The objective is to minimize Equation (27) by using the NMPA. The desired parameter vectors taken from the recently reported literature [54] are given in Equations (32)–(35)
$$ f = [f_1, f_2]^T = [-0.8000, 0.9000]^T, \tag{32} $$
$$ d = [d_1, d_2]^T = [1.1000, 0.6800]^T, \tag{33} $$
$$ b = [b_1, b_2, b_3]^T = [0.9000, 1.3000, -0.5000]^T, \tag{34} $$
$$ \tau = [f_1, f_2, d_1, d_2, b_1, b_2, b_3]^T = [-0.8000, 0.9000, 1.1000, 0.6800, 0.9000, 1.3000, -0.5000]^T. \tag{35} $$
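A minimal sketch of this objective for the second-order example of Equations (32)–(35), assuming the power basis $h_k[m(t)] = m(t)^k$ as in the Section 2 sketch:

```python
import numpy as np

# Sketch of Eqs. (27)-(28): `tau` packs [f_1, f_2, d_1, d_2, b_1, b_2, b_3],
# and the approximated response is formed from the measured past outputs.
tau_true = np.array([-0.8, 0.9, 1.1, 0.68, 0.9, 1.3, -0.5])   # Eq. (35)

def model_output(tau, m, l_meas):
    f, d, b = tau[:2], tau[2:4], tau[4:]
    K = len(m)
    l_hat, m_bar = np.zeros(K), np.zeros(K)
    for t in range(K):
        m_bar[t] = sum(b[k] * m[t] ** (k + 1) for k in range(len(b)))
        ar = sum(-f[k] * l_meas[t - k - 1] for k in range(len(f)) if t > k)
        x = sum(d[k] * m_bar[t - k - 1] for k in range(len(d)) if t > k)
        l_hat[t] = ar + x + m_bar[t]              # Eq. (28), d_0 = 1
    return l_hat

def fitness(tau, m, l_meas):
    return float(np.mean((l_meas - model_output(tau, m, l_meas)) ** 2))  # Eq. (27)
```

Evaluated at `tau_true` on data generated as in Section 2, the fitness approaches the noise variance; the NMPA sketch above can minimize it directly, e.g., `nmpa(lambda tau: fitness(tau, m, l), a=7)`.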
Here, n(t) is normally distributed noise with variance levels [0.0015, 0.015, 0.15]. The behavior of the NMPA is assessed for the generations Gn = [330, 660] and the population Pn = 45. The fitness curves are shown in Figure 3.
Figure 3a shows the convergence of the NMPA for all noise variances [0.0015, 0.015, 0.15] at Gn = 330, whereas Figure 3b shows the convergence at Gn = 660. It can be observed from Figure 3a,b that the fitness decreases significantly with increasing Gn. Moreover, the fitness of the NMPA is at its minimum for the low noise level (0.0015) and increases for the higher levels (0.015, 0.15).
The metaheuristics for the comparative study of the NMPA were selected after an exhaustive survey of the recently introduced metaheuristics in the literature. The NMPA is an improved variant of the conventional MPA, with established superiority over various state-of-the-art counterparts, such as differential evolution (DE), particle swarm optimization (PSO), multi-verse optimization (MVO), moth flame optimization (MFO), the grey wolf optimizer (GWO), the salp swarm algorithm (SSA), and the original marine predator algorithm (MPA) [53]. Therefore, to avoid redundant experimentation, other recently introduced metaheuristics were selected for comparison. Thus, the performance of the NMPA is further investigated against the Aquila optimizer (AO) [55], prairie dog optimization (PDO) [56], the reptile search algorithm (RSA) [57], the sine cosine algorithm (SCA) [58], and the whale optimization algorithm (WOA) [59] for Gn = [330, 660] and the noise levels [0.0015, 0.015, 0.15] over 30 independent runs.
AO is a population-based metaheuristic motivated by the Aquila's skill in catching prey. Its optimization is divided into four steps: two are used for exploration, and the remaining two for exploitation. It has been applied to various optimization problems across different domains. PDO is another population-based metaheuristic inspired by the natural habitat behavior of prairie dogs; its optimization involves exploration and exploitation based on foraging, burrowing, and communication. RSA is a population-based heuristic motivated by the hunting behavior of reptiles; the optimization is accomplished through encircling and hunting during the exploration and exploitation phases. SCA is a population-based metaheuristic inspired by trigonometric functions; exploration and exploitation of the search space are achieved using adaptive random variables based on the sine and cosine functions. WOA is also a population-based metaheuristic inspired by the social behavior of humpback whales; exploration and exploitation of the search space are performed by incorporating the bubble-net hunting strategy in the optimization process.
Figure 4 shows the convergence curves of AO, PDO, RSA, and SCA for different scenarios of parameter tuning. The tuning is performed at Gn = 660, Pn = 45, and noise variance 0.0015 for ten independent runs. It is observed from Figure 4a that AO achieves the lowest fitness at alpha = 0.1 and delta = 0.1. Similarly, in Figure 4b–d, PDO, RSA, and SCA achieve the lowest fitness at rho = 0.005, alpha = 0.1, beta = 0.1, and a = 2, respectively.
The parameter settings of these metaheuristics are summarized in Table 1.
Table 2, Table 3 and Table 4 present the best fitness achieved by all metaheuristics, along with the estimated parameters corresponding to the best fitness, for the noise = [0.0015, 0.015, 0.15] scenarios. For the low-noise scenario, i.e., 0.0015, the results of the NMPA are better than in the relatively high-noise scenarios. Moreover, at the 0.0015 noise level, the estimated weights are more accurate. It is also evident from Table 2, Table 3 and Table 4 that the best fitness values of the NMPA for the 0.0015, 0.015, and 0.15 noise scenarios are 4.7 × 10⁻⁶, 4.6 × 10⁻⁴, and 0.0405, respectively. Thus, Table 2, Table 3 and Table 4 confirm that the NMPA fitness degrades only slightly in high-noise scenarios and improves with increasing generations.
Figure 5, Figure 6 and Figure 7 validate the convergence of the NMPA against AO, PDO, RSA, SCA, and WOA for all noise levels. Figure 5a,b show the convergence for Gn = [330, 660] at the 0.0015 noise level, whereas Figure 6a,b and Figure 7a,b show the convergence at the 0.015 and 0.15 noise levels, respectively. It is evident from Figure 5, Figure 6 and Figure 7 that the fitness rises as the noise level increases. Still, the NMPA attains the lowest fitness compared to AO, PDO, RSA, SCA, and WOA for all variations.
The statistical investigation of the NMPA against AO, PDO, RSA, SCA, and WOA at Pn = 45 and Gn = 660 for 30 independent runs is shown in Figure 8. It is evident from Figure 8a–c that the fitness of the NMPA is the lowest in comparison with AO, PDO, RSA, SCA, and WOA for all independent runs. Moreover, the fitness rises for all metaheuristics as the noise level increases.
The performance of the NMPA is further explored through average fitness plots against AO, PDO, RSA, SCA, and WOA for all variations of Gn, Pn, and noise level, as shown in Figure 9, Figure 10, Figure 11, Figure 12 and Figure 13. These figures confirm that the NMPA achieves the lowest fitness for all variations against the other metaheuristics.
Figure 14 displays a boxplot analysis of the NMPA with AO, PDO, RSA, SCA, and WOA in terms of the average fitness for all variations. It is confirmed in Figure 14 that the NMPA achieves the lowest median and lowest first and third quartiles in comparison with the other metaheuristics.
The Wilcoxon rank-sum test [61] is applied to the average fitness values of the NMPA vs. AO, NMPA vs. PDO, NMPA vs. RSA, NMPA vs. SCA, and NMPA vs. WOA. The resulting values of the expectation, standard error, test statistic, and p-value are 39, 8.831, 2.038, and 0.020, respectively. Since the p-value is below the 0.05 significance level, these results indicate the efficacy of the NMPA against AO, PDO, RSA, SCA, and WOA.
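For reference, this comparison can be reproduced with SciPy's rank-sum test; the two samples below are synthetic placeholders rather than the paper's recorded fitness values.

```python
import numpy as np
from scipy.stats import ranksums

# Illustration of the rank-sum comparison between two sets of 30-run
# average fitness values; the samples here are synthetic stand-ins.
rng = np.random.default_rng(3)
nmpa_fit = 10.0 ** rng.uniform(-6, -4, 30)   # stand-in for 30 NMPA runs
ao_fit = 10.0 ** rng.uniform(-2, 0, 30)      # stand-in for 30 AO runs

stat, p = ranksums(nmpa_fit, ao_fit)
print(f"statistic = {stat:.3f}, p-value = {p:.4f}")   # p < 0.05 => significant
```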

5. Conclusions

The current study investigated the parameter estimation of nonlinear systems represented through the Hammerstein structure using the global search strength of a recently introduced metaheuristic. The parameters of the input nonlinear autoregressive exogenous (IN-ARX) model were estimated by exploiting the swarm intelligence of the nonlinear marine predators' algorithm (NMPA). The NMPA accurately estimates the parameters of the IN-ARX system, and a detailed comparative analysis with other recently introduced metaheuristics, such as the Aquila optimizer, prairie dog optimization, the reptile search algorithm, the sine cosine algorithm, and the whale optimization algorithm, demonstrates the effectiveness of the NMPA for different noise and generation variations. Statistical analyses based on multiple independent executions of the scheme, represented with box-and-whisker plots and the Wilcoxon rank-sum test, further established the stable and reliable performance of the NMPA for the parameter estimation of IN-ARX systems. The promising results for nonlinear system identification reflect the potential of the NMPA to effectively solve other parameter estimation problems of complex engineering systems.

Author Contributions

Visualization, K.M. and Z.A.K.; Methodology, K.M.C.; formal analysis, Z.A.K., N.I.C. and M.A.Z.R.; writing—original draft preparation, K.M.; writing—review and editing, N.I.C., Z.A.K. and M.A.Z.R.; project administration, K.M.C., A.H.M. and A.A.; and funding acquisition, K.M.C., A.H.M. and A.A. All authors have read and agreed to the published version of the manuscript.

Funding

The Deanship of Scientific Research (DSR) at King Abdulaziz University (KAU), Jeddah, Saudi Arabia funded this project under grant no. KEP-MSc: 122-135-1443.

Data Availability Statement

Not applicable.

Acknowledgments

The author Naveed Ishtiaq Chaudhary would like to thank the support of National Yunlin University of Science and Technology through project 112T25.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ding, F.; Liu, X.P.; Liu, G. Identification methods for Hammerstein nonlinear systems. Digit. Signal Process. 2011, 21, 215–238. [Google Scholar] [CrossRef]
  2. Deng, B.; Ding, R.; Li, J.; Huang, J.; Tang, K.; Li, W. Hybrid multi-objective metaheuristic algorithms for solving airline crew rostering problem with qualification and language. Math. Biosci. Eng. 2022, 20, 1460–1487. [Google Scholar] [CrossRef] [PubMed]
  3. Chaudhary, N.I.; Khan, Z.A.; Kiani, A.K.; Raja, M.A.Z.; Chaudhary, I.I.; Pinto, C.M. Design of auxiliary model based normalized fractional gradient algorithm for nonlinear output-error systems. Chaos Solitons Fractals 2022, 163, 112611. [Google Scholar] [CrossRef]
  4. Hanafi, R.; Kozan, E. A hybrid constructive heuristic and simulated annealing for railway crew scheduling. Comput. Ind. Eng. 2014, 70, 11–19. [Google Scholar] [CrossRef]
  5. Liu, S.Q.; Kozan, E. A hybrid metaheuristic algorithm to optimise a real-world robotic cell. Comput. Oper. Res. 2017, 84, 188–194. [Google Scholar] [CrossRef]
  6. Liu, S.; Ong, H.; Ng, K. Metaheuristics for minimizing the makespan of the dynamic shop scheduling problem. Adv. Eng. Softw. 2005, 36, 199–205. [Google Scholar] [CrossRef]
  7. Mehmood, K.; Chaudhary, N.I.; Khan, Z.A.; Raja, M.A.Z.; Cheema, K.M.; Milyani, A.H. Design of Aquila Optimization Heuristic for Identification of Control Autoregressive Systems. Mathematics 2022, 10, 1749. [Google Scholar] [CrossRef]
  8. Shen, Q.; Ding, F. Least Squares Identification for Hammerstein Multi-input Multi-output Systems Based on the Key-Term Separation Technique. Circuits Syst. Signal Process. 2016, 35, 3745–3758. [Google Scholar] [CrossRef]
  9. Mehmood, A.; Raja, M.A.Z.; Shi, P.; Chaudhary, N.I. Weighted differential evolution-based heuristic computing for identification of Hammerstein systems in electrically stimulated muscle modeling. Soft Comput. 2022, 26, 8929–8945. [Google Scholar] [CrossRef]
  10. Ji, Y.; Cao, J. Parameter Estimation Algorithms for Hammerstein Finite Impulse Response Moving Average Systems Using the Data Filtering Theory. Mathematics 2022, 10, 438. [Google Scholar] [CrossRef]
  11. Mishra, B.P.; Panigrahi, T.; Wilson, A.M.; Sabat, S.L. Nonlinear channel estimation based on robust distributed Hammerstein spline adaptive technique in wireless sensor network. Digit. Signal Process. 2022, 132, 103791. [Google Scholar] [CrossRef]
  12. Sun, C.; Liu, P.; Guo, H.; Di, Y.; Xu, Q.; Hao, X. Control of Precalciner Temperature in the Cement Industry: A Novel Method of Hammerstein Model Predictive Control with ISSA. Processes 2023, 11, 214. [Google Scholar] [CrossRef]
  13. Chihi, I.; Sidhom, L.; Kamavuako, E.N. Hammerstein–Wiener Multimodel Approach for Fast and Efficient Muscle Force Estimation from EMG Signals. Biosensors 2022, 12, 117. [Google Scholar] [CrossRef] [PubMed]
  14. Price, K.; Storn, R.M.; Lampinen, J.A. Differential Evolution: A Practical Approach to Global Optimization; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  15. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  16. Sulaiman, M.H.; Mustaffa, Z.; Saari, M.M.; Daniyal, H.; Mirjalili, S. Evolutionary mating algorithm. Neural Comput. Appl. 2023, 35, 487–516. [Google Scholar] [CrossRef]
  17. Deb, K.; Anand, A.; Joshi, D. A Computationally Efficient Evolutionary Algorithm for Real-Parameter Optimization. Evol. Comput. 2002, 10, 371–395. [Google Scholar] [CrossRef]
  18. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching–learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  19. Askari, Q.; Saeed, M.; Younas, I. Heap-based optimizer inspired by corporate rank hierarchy for global optimization. Expert Syst. Appl. 2020, 161, 113702. [Google Scholar] [CrossRef]
  20. Ghasemian, H.; Ghasemian, F.; Vahdat-Nejad, H. Human urbanization algorithm: A novel metaheuristic approach. Math. Comput. Simul. 2020, 178, 1–15. [Google Scholar] [CrossRef]
  21. Chou, J.-S.; Nguyen, N.-M. FBI inspired meta-optimization. Appl. Soft Comput. 2020, 93, 106339. [Google Scholar] [CrossRef]
  22. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  23. Erol, O.K.; Eksin, I. A new optimization method: Big Bang–Big Crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  24. Qais, M.H.; Hasanien, H.M.; Turky, R.A.; Alghuwainem, S.; Tostado-Véliz, M.; Jurado, F. Circle Search Algorithm: A Geometry-Based Metaheuristic Optimization Algorithm. Mathematics 2022, 10, 1626. [Google Scholar] [CrossRef]
  25. Hosseini, H.S. Principal components analysis by the galaxy-based search algorithm: A novel metaheuristic for continuous optimisation. Int. J. Comput. Sci. Eng. 2011, 6, 132. [Google Scholar] [CrossRef]
  26. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  27. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine Predators Algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar] [CrossRef]
  28. Mucherino, A.; Seref, O.; Kundakcioglu, O.E.; Pardalos, P. Monkey search: A novel metaheuristic search for global optimization. AIP Conf. Proc. 2007, 953, 162–173. [Google Scholar] [CrossRef]
  29. Mirjalili, S. The ant lion optimizer. Adv. Eng. Softw. 2015, 83, 80–98. [Google Scholar] [CrossRef]
  30. Malik, M.F.; Chang, C.-L.; Chaudhary, N.I.; Khan, Z.A.; Kiani, A.K.; Shu, C.-M.; Raja, M.A.Z. Swarming intelligence heuristics for fractional nonlinear autoregressive exogenous noise systems. Chaos Solitons Fractals 2023, 167, 113085. [Google Scholar] [CrossRef]
  31. Goethals, I.; Pelckmans, K.; Suykens, J.; De Moor, B. Subspace identification of Hammerstein systems using least squares support vector machines. IEEE Trans. Autom. Control 2005, 50, 1509–1519. [Google Scholar] [CrossRef]
  32. Chaudhary, N.I.; Raja, M.A.Z.; Khan, Z.A.; Mehmood, A.; Shah, S.M. Design of fractional hierarchical gradient descent algorithm for parameter estimation of nonlinear control autoregressive systems. Chaos Solitons Fractals 2022, 157, 111913. [Google Scholar] [CrossRef]
  33. Malik, M.F.; Chang, C.-L.; Aslam, M.S.; Chaudhary, N.I.; Raja, M.A.Z. Fuzzy-Evolution Computing Paradigm for Fractional Hammerstein Control Autoregressive Systems. Int. J. Fuzzy Syst. 2022, 24, 2447–2475. [Google Scholar] [CrossRef]
  34. Altaf, F.; Chang, C.-L.; Chaudhary, N.I.; Cheema, K.M.; Raja, M.A.Z.; Shu, C.-M.; Milyani, A.H. Novel Fractional Swarming with Key Term Separation for Input Nonlinear Control Autoregressive Systems. Fractal Fract. 2022, 6, 348. [Google Scholar] [CrossRef]
  35. Altaf, F.; Chang, C.-L.; Chaudhary, N.I.; Raja, M.A.Z.; Cheema, K.M.; Shu, C.-M.; Milyani, A.H. Adaptive Evolutionary Computation for Nonlinear Hammerstein Control Autoregressive Systems with Key Term Separation Principle. Mathematics 2022, 10, 1001. [Google Scholar] [CrossRef]
  36. Mehmood, K.; Chaudhary, N.I.; Khan, Z.A.; Cheema, K.M.; Raja, M.A.Z.; Milyani, A.H.; Azhari, A.A. Nonlinear Hammerstein System Identification: A Novel Application of Marine Predator Optimization Using the Key Term Separation Technique. Mathematics 2022, 10, 4217. [Google Scholar] [CrossRef]
  37. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  38. Migallón, H.; Belazi, A.; Sánchez-Romero, J.-L.; Rico, H.; Jimeno-Morenilla, A. Settings-Free Hybrid Metaheuristic General Optimization Methods. Mathematics 2020, 8, 1092. [Google Scholar] [CrossRef]
  39. Kumar, D.; Chauhan, Y.K.; Pandey, A.S.; Srivastava, A.K.; Kumar, V.; Alsaif, F.; Elavarasan, R.M.; Islam, R.; Kannadasan, R.; Alsharif, M.H. A Novel Hybrid MPPT Approach for Solar PV Systems Using Particle-Swarm-Optimization-Trained Machine Learning and Flying Squirrel Search Optimization. Sustainability 2023, 15, 5575. [Google Scholar] [CrossRef]
  40. Bakht, K.; Kashif, S.A.R.; Fakhar, M.S.; Khan, I.A.; Abbas, G. Accelerated Particle Swarm Optimization Algorithms Coupled with Analysis of Variance for Intelligent Charging of Plug-in Hybrid Electric Vehicles. Energies 2023, 16, 3210. [Google Scholar] [CrossRef]
  41. Kanadath, A.; Jothi, J.A.A.; Urolagin, S. Multilevel Multiobjective Particle Swarm Optimization Guided Superpixel Algorithm for Histopathology Image Detection and Segmentation. J. Imaging 2023, 9, 78. [Google Scholar] [CrossRef]
  42. Chen, X.; Long, Z. E-Commerce Enterprises Financial Risk Prediction Based on FA-PSO-LSTM Neural Network Deep Learning Model. Sustainability 2023, 15, 5882. [Google Scholar] [CrossRef]
  43. Gao, G.; Wang, J.; Zhang, J. AWOA: An Advanced Whale Optimization Algorithm for Signal Detection in Underwater Magnetic Induction Multi-Input–Multi-Output Systems. Electronics 2023, 12, 1559. [Google Scholar] [CrossRef]
  44. Mehmood, K.; Chaudhary, N.I.; Khan, Z.A.; Cheema, K.M.; Raja, M.A.Z.; Milyani, A.H.; Azhari, A.A. Dwarf Mongoose Optimization Metaheuristics for Autoregressive Exogenous Model Identification. Mathematics 2022, 10, 3821. [Google Scholar] [CrossRef]
  45. Malik, N.A.; Chaudhary, N.I.; Raja, M.A.Z. Firefly Optimization Heuristics for Sustainable Estimation in Power System Harmonics. Sustainability 2023, 15, 4816. [Google Scholar] [CrossRef]
  46. Mehmood, K.; Chaudhary, N.I.; Khan, Z.A.; Cheema, K.M.; Raja, M.A.Z. Variants of Chaotic Grey Wolf Heuristic for Robust Identification of Control Autoregressive Model. Biomimetics 2023, 8, 141. [Google Scholar] [CrossRef] [PubMed]
  47. El Sattar, M.A.; Al Sumaiti, A.; Ali, H.; Diab, A.A.Z. Marine predators algorithm for parameters estimation of photovoltaic modules considering various weather conditions. Neural Comput. Appl. 2021, 33, 11799–11819. [Google Scholar] [CrossRef]
  48. Al-Qaness, M.A.A.; Ewees, A.A.; Fan, H.; Abualigah, L.; Elaziz, M.A. Marine Predators Algorithm for Forecasting Confirmed Cases of COVID-19 in Italy, USA, Iran and Korea. Int. J. Environ. Res. Public Health 2020, 17, 3520. [Google Scholar] [CrossRef]
  49. Houssein, E.H.; Abdelminaam, D.S.; Ibrahim, I.E.; Hassaballah, M.; Wazery, Y.M. A Hybrid Heartbeats Classification Approach Based on Marine Predators Algorithm and Convolution Neural Networks. IEEE Access 2021, 9, 86194–86206. [Google Scholar] [CrossRef]
  50. Eid, A.; Kamel, S.; Abualigah, L. Marine predators algorithm for optimal allocation of active and reactive power resources in distribution networks. Neural Comput. Appl. 2021, 33, 14327–14355. [Google Scholar] [CrossRef]
  51. Helmi, A.M.; Al-Qaness, M.A.; Dahou, A.; Elaziz, M.A. Human activity recognition using marine predators algorithm with deep learning. Futur. Gener. Comput. Syst. 2023, 142, 340–350. [Google Scholar] [CrossRef]
  52. Sobhy, M.A.; Abdelaziz, A.Y.; Hasanien, H.M.; Ezzat, M. Marine predators algorithm for load frequency control of modern interconnected power systems including renewable energy sources and energy storage units. Ain Shams Eng. J. 2021, 12, 3843–3857. [Google Scholar] [CrossRef]
  53. Sadiq, A.S.; Dehkordi, A.A.; Mirjalili, S.; Pham, Q.-V. Nonlinear marine predator algorithm: A cost-effective optimizer for fair power allocation in NOMA-VLC-B5G networks. Expert Syst. Appl. 2022, 203, 117395. [Google Scholar] [CrossRef]
  54. Wang, D.; Zhang, S.; Gan, M.; Qiu, J. A Novel EM Identification Method for Hammerstein Systems with Missing Output Data. IEEE Trans. Ind. Informatics 2020, 16, 2500–2508. [Google Scholar] [CrossRef]
  55. Abualigah, L.; Yousri, D.; Elaziz, M.A.; Ewees, A.A.; Al-Qaness, M.A.; Gandomi, A.H. Aquila Optimizer: A novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  56. Ezugwu, A.E.; Agushaka, J.O.; Abualigah, L.; Mirjalili, S.; Gandomi, A.H. Prairie Dog Optimization Algorithm. Neural Comput. Appl. 2022, 34, 20017–20065. [Google Scholar] [CrossRef]
  57. Abualigah, L.; Abd Elaziz, M.; Sumari, P.; Geem, Z.W.; Gandomi, A.H. Reptile Search Algorithm (RSA): A nature-inspired meta-heuristic optimizer. Expert Syst. Appl. 2022, 191, 116158. [Google Scholar] [CrossRef]
  58. Mirjalili, S. SCA: A Sine Cosine Algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  59. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  60. Gharehchopogh, F.S.; Gholizadeh, H. A comprehensive survey: Whale Optimization Algorithm and its applications. Swarm Evol. Comput. 2019, 48, 1–24. [Google Scholar] [CrossRef]
  61. Rosner, B.; Glynn, R.J.; Lee, M.T. Incorporation of Clustering Effects for the Wilcoxon Rank Sum Test: A Large-Sample Approach. Biometrics 2003, 59, 1089–1098. [Google Scholar] [CrossRef]
Figure 1. Graphical representation of the IN-ARX model.
Figure 2. NMPA flowchart.
Figure 3. NMPA fitness curves w.r.t. noise variances.
Figure 4. Parameter tuning fitness curves of AO, PDO, RSA, and SCA.
Figure 5. Convergence plots w.r.t. 0.0015 noise.
Figure 6. Convergence plots w.r.t. 0.015 noise.
Figure 7. Convergence plots w.r.t. 0.15 noise.
Figure 8. Statistical plots w.r.t. noise levels.
Figure 9. NMPA vs. AO statistical plot w.r.t. average fitness.
Figure 10. NMPA vs. PDO statistical plot w.r.t. average fitness.
Figure 11. NMPA vs. RSA statistical plot w.r.t. average fitness.
Figure 12. NMPA vs. SCA statistical plot w.r.t. average fitness.
Figure 13. NMPA vs. WOA statistical plot w.r.t. average fitness.
Figure 14. Boxplot analysis of NMPA with AO, PDO, RSA, SCA, and WOA.
Table 1. Parameter settings of the metaheuristics.

| Metaheuristic | Parameter Values |
|---|---|
| NMPA | FADs = 0.2, E = 0.5 |
| AO | alpha = 0.1, delta = 0.1 |
| PDO | rho = 0.005 |
| RSA | alpha = 0.1, beta = 0.1 |
| SCA | a = 2 |
| WOA [60] | a = [2, 0] |
Table 2. Estimated weights w.r.t. generations (Gn) at the 0.0015 noise level.

| Metaheuristic | Gn | f1 | f2 | d1 | d2 | b1 | b2 | b3 | Best Fitness |
|---|---|---|---|---|---|---|---|---|---|
| AO | 330 | −0.8192 | 0.9368 | 0.9891 | 0.4064 | 0.7439 | 1.6007 | −0.2351 | 0.2012 |
| AO | 660 | −0.7755 | 0.9014 | 1.5770 | 1.1252 | 0.4183 | 0.4834 | −0.8811 | 0.0499 |
| PDO | 330 | −0.8050 | 0.8939 | 1.6015 | 0.8107 | 0.3455 | 0.5423 | −0.7053 | 0.0208 |
| PDO | 660 | −0.7903 | 0.8978 | 1.3308 | 0.8973 | 0.4988 | 0.7399 | −0.7047 | 0.0024 |
| RSA | 330 | −0.7326 | 0.8385 | 1.2840 | 1.7105 | 1.2003 | 0.8957 | −0.9871 | 0.7066 |
| RSA | 660 | −0.7421 | 0.8642 | 1.1130 | 0.9510 | 0.8684 | 1.1687 | −0.5634 | 0.1912 |
| SCA | 330 | −0.7186 | 0.8537 | 0.9150 | 0.6195 | 1.0084 | 2.0000 | 0.0005 | 0.5935 |
| SCA | 660 | −0.7648 | 0.9200 | 1.7525 | 2.0000 | −0.2438 | −0.2213 | −1.2119 | 0.2915 |
| WOA | 330 | −0.8466 | 0.9521 | 0.9878 | 0.6570 | 0.6783 | 1.1143 | −0.6393 | 0.1901 |
| WOA | 660 | −0.7862 | 0.9048 | 0.8771 | 0.6877 | 1.3408 | 1.8431 | −0.3522 | 0.0331 |
| NMPA | 330 | −0.7997 | 0.8997 | 1.1284 | 0.6982 | 0.8521 | 1.2310 | −0.5231 | 2.4132 × 10⁻⁵ |
| NMPA | 660 | −0.8000 | 0.8999 | 1.1067 | 0.6837 | 0.8861 | 1.2821 | −0.5052 | 4.7671 × 10⁻⁶ |
| Actual values | — | −0.8000 | 0.9000 | 1.1000 | 0.6800 | 0.9000 | 1.3000 | −0.5000 | 0 |
Table 3. Estimated weights w.r.t. generations (Gn) at the 0.015 noise level.

| Metaheuristic | Gn | f1 | f2 | d1 | d2 | b1 | b2 | b3 | Best Fitness |
|---|---|---|---|---|---|---|---|---|---|
| AO | 330 | −0.7942 | 0.8775 | 1.1663 | 0.7680 | 0.6776 | 0.9902 | −0.6280 | 0.0561 |
| AO | 660 | −0.7945 | 0.9099 | 1.5233 | 1.0325 | 0.0691 | 0.3455 | −0.7981 | 0.0955 |
| PDO | 330 | −0.7894 | 0.8941 | 1.4523 | 0.9153 | 0.4378 | 0.6399 | −0.7182 | 0.0038 |
| PDO | 660 | −0.7959 | 0.8993 | 1.0815 | 0.7327 | 0.8041 | 1.2129 | −0.5317 | 0.0075 |
| RSA | 330 | −0.7220 | 0.9108 | 1.1243 | 1.0110 | 0.5696 | 1.0461 | −0.5176 | 0.4078 |
| RSA | 660 | −0.7712 | 0.8883 | 1.0020 | 0.9802 | 0.8040 | 1.0051 | −0.7028 | 0.2483 |
| SCA | 330 | −0.7997 | 0.9340 | 1.8997 | 1.4184 | −0.1877 | −0.0817 | −0.9527 | 0.2136 |
| SCA | 660 | −0.8026 | 0.8870 | 2.0000 | 1.2379 | 0.0353 | 0.0226 | −0.9581 | 0.0586 |
| WOA | 330 | −0.8374 | 0.8853 | 1.0703 | 0.2489 | 1.0443 | 1.9562 | 0.0293 | 0.2150 |
| WOA | 660 | −0.7864 | 0.8912 | 1.8573 | 1.1397 | 0.0551 | 0.1463 | −0.8773 | 0.0053 |
| NMPA | 330 | −0.8002 | 0.8988 | 1.1805 | 0.7246 | 0.7452 | 1.1017 | −0.5584 | 4.7301 × 10⁻⁴ |
| NMPA | 660 | −0.8003 | 0.8989 | 1.1751 | 0.7222 | 0.7556 | 1.1140 | −0.5547 | 4.6813 × 10⁻⁴ |
| Actual values | — | −0.8000 | 0.9000 | 1.1000 | 0.6800 | 0.9000 | 1.3000 | −0.5000 | 0 |
Table 4. Estimated weights w.r.t. generations (Gn) at the 0.15 noise level.

| Metaheuristic | Gn | f1 | f2 | d1 | d2 | b1 | b2 | b3 | Best Fitness |
|---|---|---|---|---|---|---|---|---|---|
| AO | 330 | −0.7755 | 0.8891 | 1.9109 | 1.8204 | −0.1287 | −0.1344 | −1.0075 | 0.1621 |
| AO | 660 | −0.8177 | 0.9021 | 1.0589 | 0.6413 | 1.1741 | 1.5627 | −0.4029 | 0.1044 |
| PDO | 330 | −0.8193 | 0.9085 | 0.9650 | 0.6171 | 1.0302 | 1.5850 | −0.3628 | 0.0663 |
| PDO | 660 | −0.8218 | 0.8972 | 1.0634 | 0.5911 | 0.8280 | 1.3622 | −0.3971 | 0.0608 |
| RSA | 330 | −0.7640 | 0.9009 | 1.0093 | 0.9475 | 0.7496 | 1.2128 | −0.5217 | 0.2810 |
| RSA | 660 | −0.7809 | 0.9042 | 0.9844 | 0.9097 | 0.5374 | 0.8729 | −0.7874 | 0.3593 |
| SCA | 330 | −0.7719 | 0.9252 | 1.5129 | 1.7924 | 0.0006 | −0.0261 | −1.1125 | 0.4162 |
| SCA | 660 | −0.7868 | 0.9022 | 1.6056 | 1.1461 | −0.2537 | 0.0269 | −0.8898 | 0.2069 |
| WOA | 330 | −0.7691 | 0.9067 | 1.0330 | 1.1504 | 1.0278 | 1.1263 | −0.7600 | 0.3160 |
| WOA | 660 | −0.8086 | 0.9049 | 0.9453 | 0.6698 | 0.7969 | 1.4578 | −0.3311 | 0.1322 |
| NMPA | 330 | −0.8033 | 0.8948 | 1.9997 | 1.1901 | −0.1187 | −0.0161 | −0.8832 | 0.0405 |
| NMPA | 660 | −0.8035 | 0.8943 | 2.0000 | 1.1789 | −0.1210 | −0.0139 | −0.8781 | 0.0405 |
| Actual values | — | −0.8000 | 0.9000 | 1.1000 | 0.6800 | 0.9000 | 1.3000 | −0.5000 | 0 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
