1. Introduction
Photoelectric tracking systems, as a class of integrated optomechatronic systems widely implemented in astronomy, communication, positioning, and tracking, face a critical and challenging problem: enhancing their overall precision [1]. The controller within the control system is pivotal in determining tracking performance [2]; hence, optimizing the controller's parameters is a necessary measure to improve this performance [3].
In addressing optimization problems, intelligent optimization algorithms have opened a new chapter, garnering extensive research and application since their inception [2]. These algorithms have provided better solutions to complex problems. With further exploration, however, it has been observed that each algorithm has limitations stemming from its inherent characteristics and scope of application [4]. Consequently, scholars have sought to improve these algorithms by combining two of them, leveraging their strengths and compensating for their weaknesses [5].
The PSO and BAS algorithms have been used to solve optimization problems; because they involve only basic mathematical operations and impose minimal computational and environmental requirements, they have been widely applied to various parameter-optimization tasks [6,7]. Ye et al. proposed the Beetle Swarm Antennae Search (BSAS) algorithm, which incorporates a feedback-based step-size update strategy and addresses the algorithm's over-reliance on the beetle's random direction and the need for frequent updates of beetle positions and step sizes [8]. Lei et al. proposed the Beetle Antennae Search with Flower Pollination Algorithm (BFPA), applying flower pollination strategies for the global search and BAS for the local search to enhance search performance and convergence rate [8]. Xu et al. introduced a Levy-flight and adaptive-strategy-based BAS (LABAS), which groups the beetles, updates swarm information with elite individuals, and applies generalized opposition-based learning to the initial and elite populations together with Levy flight and proportion factors [9]. Fan et al. applied a hybrid BAS-PID controller to electro-hydraulic servo systems, significantly enhancing system performance and effectively meeting control demands [10]. However, these methods share a limitation: they focus on a single individual. As the number of iterations increases, the step size decays, so the search can become stuck in local extrema and fail to escape. Setting a fixed or larger initial step size avoids this, but at the cost of poor stability.
Sedighizadeh et al. developed a Generalized PSO (GEPSO) algorithm, incorporating two new terms into the velocity update rule and adopting a novel dynamic inertia-weight update strategy; it outperforms other variants in run time and fitness values [11]. Aydilek proposed a hybrid of the Firefly Algorithm (FA) and PSO (HFPSO), using PSO for the global search and FA for the local search, which finds optimal solutions rapidly and reliably [12]. Dhanachandra et al. integrated Dynamic Particle Swarm Optimization (DPSO) with Fuzzy C-Means clustering (FCM), using DPSO to find FCM's optimal solutions; applied to image segmentation and noise reduction, the improved method demonstrated enhanced noise resistance and performance [13]. However, the PSO algorithm still has several issues: slow convergence requiring a large number of iterations, a tendency to become trapped in local optima and difficulty escaping them, and sensitivity to the distribution of the solution space, where an uneven distribution can lead to inefficient contraction.
The BAS algorithm displays robust performance, high precision, and strong global search capability, but its limitation arises from its focus on a single individual [14]. As iterations increase and the step size decays, it may become trapped in local optima; workarounds such as fixed step sizes or larger initial step sizes introduce instability. The PSO algorithm, by contrast, operates on a population, which can aid BAS by increasing diversity and rectifying this shortcoming [15]. Moreover, BAS's traits facilitate rapid identification of promising solutions in PSO's initial stages, hastening convergence. These considerations motivated the hybridization of BAS and PSO, aiming to capitalize on the individual advantages of each algorithm and enhance overall performance.
In this study, we integrate the beetle antennae search strategy into the PSO local search, proposing an improved algorithm referred to as W-K-BSO. Our principal contributions are as follows:
We utilize chaotic mapping to optimize population diversity, resulting in a stochastic and uniform distribution of individuals, thereby enhancing the algorithm’s convergence rate without compromising the inherent randomness of the population.
The inertia weight is updated using a linearly decreasing differential approach, mitigating the deficiency of conventional linear decrement strategies that may fail to identify the optimal value direction in the initial stages, leading to convergence towards local optima in later stages.
We control the algorithm jointly through a contraction factor and the inertia weight, ensuring convergence while effectively balancing global and local search performance. The contraction factor incorporates an acceleration coefficient derived from the increments of the beetle's antennae positions, and these three factors collectively dictate the update mechanism. Treating each particle's position as the beetle's centroid, we generate positions for the beetle's left and right antennae, calculate their fitness values, and produce a new position and increment; the new position becomes the particle's current position. The influence of the antennae position increment on the current particle position is also incorporated into the velocity update rule. Finally, we test the algorithm on benchmark functions and apply it to PID parameter optimization in photoelectric tracking systems. A brief sketch of this update scheme is given below.
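To make the above concrete, the following is a minimal sketch, assuming a logistic map for the chaotic initialization, the standard linearly decreasing inertia-weight rule, and Clerc's constriction form for the contraction factor; the coefficient c3 on the antennae increment and all constants are illustrative assumptions, not the exact formulas of Section 3.

```python
import numpy as np

def logistic_init(pop, dim, lb, ub, mu=4.0, warmup=50):
    """Chaotic initialization: stochastic yet well-spread individuals."""
    z = np.random.uniform(0.1, 0.9, (pop, dim))   # avoid the map's fixed points
    for _ in range(warmup):
        z = mu * z * (1.0 - z)                    # logistic-map iteration
    return lb + z * (ub - lb)                     # scale into the search bounds

def w_k_bso_step(x, v, pbest, gbest, fit, t, T, step,
                 w_max=0.9, w_min=0.4, c1=2.05, c2=2.05, c3=2.05):
    w = w_max - (w_max - w_min) * t / T           # linearly decreasing inertia weight
    phi = c1 + c2
    chi = 2.0 / abs(2.0 - phi - np.sqrt(phi**2 - 4.0 * phi))  # contraction factor
    for i in range(x.shape[0]):
        # Treat the particle as the beetle's centroid and probe two antennae:
        d = np.random.randn(x.shape[1])
        d /= np.linalg.norm(d)                    # random unit direction
        xl, xr = x[i] + step * d, x[i] - step * d # left/right antennae positions
        inc = -step * d * np.sign(fit(xl) - fit(xr))  # move toward the better antenna
        # Velocity update: PSO terms plus the antennae position increment
        r1, r2, r3 = np.random.rand(3)
        v[i] = chi * (w * v[i] + c1 * r1 * (pbest[i] - x[i])
                      + c2 * r2 * (gbest - x[i]) + c3 * r3 * inc)
        x[i] = x[i] + v[i]                        # new position of the particle
    return x, v
```

In a full run, the antenna step would also decay by the step-size reduction factor after each iteration, mirroring BAS.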
The subsequent sections of this paper are arranged as follows: Section 2 discusses relevant theories; Section 3 presents the local-search particle swarm optimization algorithm based on the beetle antennae search algorithm; Section 4 conducts a simulation analysis; Section 5 details the experimental validation; and Section 6 concludes the paper.
4. Simulation Analysis
As inferred from the optimization process and the algorithmic model of the W-K-BSO algorithm, several initialization parameters require setting: the population size, maximum number of iterations, initial step size, step-size reduction factor, spatial dimensionality, and acceleration coefficients. The initial step size can be tailored to the application scenario. The specific parameter settings are given in Table 2.
To validate the effectiveness of the W-K-BSO algorithm, simulations were conducted comparing its performance with other algorithms on the minimization of nine benchmark test functions. To ensure the diversity of the test functions and the generality of the conclusions, both unimodal and multimodal functions with distinct characteristics were employed. The parameter settings for the selected functions are detailed in Table 3.
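The nine functions themselves are specified in Table 3 and are not reproduced in this section; as a hedged illustration of the two classes involved, the sketch below defines one common unimodal function and two common multimodal ones (whether these exact functions appear in Table 3 is an assumption).

```python
import numpy as np

def sphere(x):
    """Unimodal: global minimum 0 at the origin."""
    return np.sum(x ** 2)

def rastrigin(x):
    """Multimodal: many regularly spaced local minima; global minimum 0."""
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)

def schaffer(x):
    """2-D multimodal Schaffer F6 function; global minimum 0 at the origin."""
    s = x[0] ** 2 + x[1] ** 2
    return 0.5 + (np.sin(np.sqrt(s)) ** 2 - 0.5) / (1.0 + 0.001 * s) ** 2
```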
The W-K-BSO algorithm was compared with the BAS and SPSO algorithms through simulations. Step-size reduction factors and initial step sizes were set for BAS and W-K-BSO; SPSO and W-K-BSO shared the same maximum number of iterations and population size, with bounds on the maximum and minimum velocities. W-K-BSO used maximum and minimum inertia weights, SPSO used a fixed inertia weight, and all algorithms used the same acceleration coefficients. The dimensionality was varied across tests (set to 2 for the Schaffer function). Each algorithm was run independently 30 times to compare the best, worst, average, and standard-deviation values of the BAS, SPSO, and W-K-BSO algorithms across different dimensions of the selected benchmark functions. The results obtained by the proposed algorithm are shown in bold.
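As a minimal sketch of this evaluation protocol, the Best/Worst/Avg/Sd statistics over 30 independent runs can be collected as follows (the optimize callable is a hypothetical stand-in for any of the compared algorithms, assumed to return the best fitness of one run):

```python
import numpy as np

def evaluate(optimize, func, dim, runs=30):
    """Run one algorithm `runs` times and summarize, as in Tables 4 and 5."""
    results = np.array([optimize(func, dim) for _ in range(runs)])
    return {
        "Best": results.min(),      # best fitness over the 30 runs
        "Worst": results.max(),     # worst fitness over the 30 runs
        "Avg": results.mean(),      # average fitness
        "Sd": results.std(ddof=1),  # sample standard deviation
    }
```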
Table 4 and Table 5 present the best (Best), worst (Worst), average (Avg), and standard deviation (Sd) values for the BAS, SPSO, and W-K-BSO algorithms across the nine benchmark functions with dimensions of 30, 100, 500, and 1000. A holistic analysis of the three algorithms, on both multimodal and unimodal functions, shows that the W-K-BSO algorithm significantly outperforms both BAS and SPSO, demonstrating its robust performance. Moreover, multimodal functions possess multiple local optima that can easily trap an algorithm and prevent convergence to the global optimum; the data in the tables demonstrate the effectiveness of the W-K-BSO algorithm in precisely such scenarios. For several of the functions, W-K-BSO finds the theoretical minimum value of 0 in both low and high dimensionality, with a standard deviation of 0, highlighting the algorithm's high stability. For the functions on which convergence to 0 was not achieved, the optimization outcomes were still significantly improved. For two further functions, the optimization effects were more pronounced, and the standard deviations indicate the higher stability of W-K-BSO on these functions. For one remaining function, W-K-BSO improved performance by 7 to 9 orders of magnitude over the BAS algorithm and by 4 to 8 orders of magnitude over the SPSO algorithm across the different dimensions.
To further contrast the optimization efficacy of the algorithms, Figure 4 presents the convergence comparison curves of BAS, SPSO, CFPSO, W-K-PSO, and W-K-BSO on the 100-dimensional functions (with Schaffer set at 2 dimensions).
Figure 4a shows that before 350 iterations the convergence speed, from fastest to slowest, is CFPSO, W-K-PSO, SPSO, BAS. However, the slopes of their evolution curves are all smaller than that of W-K-BSO, indicating that in the initial optimization phase W-K-BSO is the fastest, far outpacing the other algorithms. After 350 iterations, as seen in the inset, BAS, SPSO, CFPSO, and W-K-PSO have lg(fitness) > 0, suggesting these algorithms have become trapped in local optima and cannot find the global optimum. In contrast, W-K-BSO reaches lg(fitness) < −300 and stops iterating at around 950 iterations, demonstrating that it finds the global optimum with a standard deviation of 0, indicating good convergence and stability. Therefore, for this function, the convergence speeds of the five algorithms from highest to lowest are W-K-BSO, CFPSO, W-K-PSO, SPSO, BAS, with only W-K-BSO finding the global optimum.
Figure 4b shows that BAS stagnates early in the optimization. Initially, BAS's convergence performance is better than that of SPSO, CFPSO, and W-K-PSO, but after 1000 iterations SPSO and CFPSO overtake BAS; although their convergence curves flatten, they continue to search in the correct direction. The W-K-BSO algorithm finds the theoretical optimum of this function by the 45th iteration, showing significantly better convergence performance than the other algorithms. Thus, for this function, the convergence speeds from highest to lowest are W-K-BSO, BAS, SPSO, CFPSO, W-K-PSO, and the order in which the global optimum is found is W-K-BSO, SPSO, CFPSO.
Figure 4c indicates that none of the five algorithms found the optimal value of 0 for this function. The optimization effect of W-K-BSO here is less pronounced, with limited accuracy improvement; even so, W-K-BSO still outperforms the other algorithms in both convergence speed and accuracy. BAS converges faster than SPSO, CFPSO, and W-K-PSO, but SPSO and CFPSO achieve slightly higher accuracy than BAS. The proposed algorithm shows a small standard deviation, indicating good stability. Therefore, for this function, the convergence speeds from highest to lowest are W-K-BSO, BAS, SPSO, CFPSO, W-K-PSO.
Figure 4d,f show that the W-K-BSO algorithm exhibits good convergence performance from the initial optimization stage, finding the optimal value in fewer than 100 iterations and demonstrating robust, global convergence. The other four algorithms show comparable optimization performance but stagnate early. For the function in Figure 4d, under the same number of iterations, the algorithm accuracies from highest to lowest are W-K-BSO, BAS, CFPSO, SPSO, W-K-PSO; for the function in Figure 4f, they are W-K-BSO, SPSO, CFPSO, W-K-PSO, BAS.
Figure 4e shows that the W-K-BSO algorithm initially converges quickly but falls into a local optimum at the 25th iteration. Nevertheless, W-K-BSO achieves higher convergence accuracy. Therefore, for this function, the convergence speeds from highest to lowest are W-K-BSO, BAS, CFPSO, SPSO, W-K-PSO.
Figure 4g indicates that all five algorithms find the optimal value before 350 iterations, but W-K-BSO converges to the optimum of this function much faster than the other four. As shown in Table 4 and Table 5, the SPSO algorithm occasionally finds the optimal value but is unstable, whereas the proposed W-K-BSO algorithm converges to the theoretical optimum in all 30 independent runs, indicating good stability. Thus, for this function, the convergence speeds from highest to lowest are W-K-BSO, W-K-PSO, SPSO, BAS, CFPSO.
Figure 4h shows that, as with the previous function, CFPSO, W-K-PSO, SPSO, and BAS fall into local optima and cannot escape, whereas W-K-BSO escapes the local optimum and finds the global optimum. Therefore, for this function, the convergence speeds from highest to lowest are W-K-BSO, W-K-PSO, CFPSO, SPSO, BAS.
Figure 4i shows that the W-K-BSO algorithm converges rapidly at first, after which its convergence slows. Although it continues to approach the optimal value until the maximum number of iterations is reached, the time required to escape local optima increases. Overall, W-K-BSO achieves the highest accuracy. Therefore, for this function, the convergence speeds from highest to lowest are W-K-BSO, W-K-PSO, CFPSO, SPSO, BAS.
Based on the above simulation results, a comparative summary is provided in Table 6.
In conclusion, the W-K-BSO algorithm demonstrates robust search performance and high accuracy across the various benchmark functions and dimensionalities. It converges rapidly to the optimal values of some functions and, albeit more slowly, still successfully identifies the optima of others. Even for the functions on which its optimization capability is relatively weaker, the algorithm consistently achieves higher convergence accuracy than the other methods. The stability and efficacy of the improved algorithm are evident across the convergence behavior of all the functions tested.
The Wilcoxon signed-rank test is employed to determine whether there are significant differences between two paired sample groups. Its test statistic is calculated by summing the signed ranks of the absolute differences between the observed values and the central position under the null hypothesis. Unlike the t-test, the Wilcoxon signed-rank test does not require the differences between paired data to follow a normal distribution; it only requires symmetry, making it suitable for comparing the optimization results of different algorithms on various test functions. Conducted under the assumption of symmetric sample data, it provides more information than the sign test and thereby yields more reliable results. The method can be used for pairwise comparison of statistics, such as means or variances, obtained by different optimization algorithms across test functions, allowing a unified qualitative analysis of their optimization performance.
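As a brief illustration (with placeholder data, not the paper's results), such a pairwise comparison can be computed with SciPy's implementation of the test:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
wkbso_best = rng.random(30) * 1e-8   # stand-in: 30 best values for W-K-BSO
spso_best = rng.random(30) * 1e-2    # stand-in: 30 best values for SPSO

stat, p = wilcoxon(wkbso_best, spso_best)  # signed-rank test on paired runs
print(f"statistic={stat:.3f}, p={p:.3g}")
if p < 0.05:  # significance level used in this study
    # The sign of the median difference decides "+" or "-" in Table 7.
    print("significant difference between the paired samples")
```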
In this study, we employed the Wilcoxon signed-rank test to analyze the performance of the proposed method, setting the significance level at 0.05. Table 7 presents the Wilcoxon signed-rank test results for the optimal values of the nine test functions after 30 independent runs. Here, "win" indicates the superior algorithm in each comparison; "+" denotes that the W-K-BSO algorithm outperformed the compared algorithm, "−" that it underperformed, and "=" that there was no significant difference between the two. Statistical analysis of the "+/=/−" counts in Table 7 shows that the W-K-BSO algorithm generally demonstrates superior convergence speed and search accuracy compared with the BAS, SPSO, CFPSO, and W-K-PSO algorithms across the nine test functions.
5. Experimental Validation
Figure 5 illustrates a high-precision pointing and tracking system (PTS) subsystem built around a dual-axis tip-tilt mirror tracking system. In the diagram, the high-precision PTS consists primarily of three parts: a target simulation unit comprising a signal-transmitting device (laser) and a target-mirror motion-adjustment device; a monitoring unit consisting of a position-sensitive detector and an image charge-coupled device (CCD); and the control device (tracking mirror). The laser serves as the signal source to simulate the motion of the target, and the target mirror within the target simulation unit adjusts that motion. The tracking mirror, driven by a motor, then reflects the signal light onto the position-sensitive detector. After receiving the target signal, the CCD detects the steady-state error and transmits it to the position controller.
The control object transfer function is as follows:
In this experiment, the proposed method is applied to the electro-optical tracking system to test its performance and draw conclusions through comparison.
Using the integral absolute error (IAE) as the algorithm's fitness function, the optimal PID parameters obtained from the search are substituted into the PID controller of the electro-optical tracking system, producing the step response and error curves. The experiment uses a fixed population size and number of iterations. The PID parameter optimization curves and the fitness curve of the W-K-BSO algorithm are shown in Figure 6.
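As a hedged sketch of the IAE fitness evaluation, the snippet below simulates a unit-step response of a hypothetical second-order plant under PID control and accumulates the integral of the absolute error; the paper's actual control-object transfer function should be substituted for the assumed plant, and the constants wn and zeta are illustrative.

```python
import numpy as np

def iae_fitness(kp, ki, kd, T=1.0, dt=1e-3):
    """IAE = integral of |e(t)| over a unit-step response (forward Euler)."""
    n = int(T / dt)
    y = y_dot = 0.0        # state of the assumed second-order plant
    integ = prev_e = 0.0   # PID integrator and previous error
    iae = 0.0
    for _ in range(n):
        e = 1.0 - y        # unit-step reference minus plant output
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt  # PID control law
        prev_e = e
        # Hypothetical plant: y'' + 2*zeta*wn*y' + wn^2*y = wn^2*u
        wn, zeta = 50.0, 0.7
        y_ddot = wn**2 * (u - y) - 2.0 * zeta * wn * y_dot
        y_dot += y_ddot * dt
        y += y_dot * dt
        iae += abs(e) * dt  # accumulate the integral of absolute error
    return iae
```

The W-K-BSO search would then minimize iae_fitness over the (kp, ki, kd) triple.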
The PID parameters optimized by the W-K-BSO algorithm, together with the best fitness value, are listed in Table 8.
The step response and error curves obtained for the electro-optical tracking system using the PID parameters optimized by the W-K-BSO algorithm are shown in Figure 7.
The PID parameter tuning of the electro-optical tracking system was conducted using the W-K-BSO, SPSO, and GA algorithms under identical experimental conditions, and the step response and error curves of the three algorithms were compared comprehensively. In Figure 8, the rin curve represents the system input signal, and the SPSO, W-K-BSO, and GA curves represent the output signals of the respective algorithms.
From Figure 8, analysis of the parameter tuning results of the W-K-BSO, SPSO, and GA algorithms indicates that GA has the fastest convergence rate of the three but also the largest error. The convergence rate of W-K-BSO is slightly slower than that of SPSO, yet its error is the smallest. Although the convergence speed of W-K-BSO shows no significant advantage, its error is clearly superior to that of GA and SPSO. The GA algorithm begins to stabilize around 0.05 s, reaching a fixed value at approximately 0.07 s, while both the W-K-BSO and SPSO algorithms achieve stability in less than 0.05 s.