2.1. Beetle Swarm Antennae Search Algorithm
The beetle swarm antennae search (BSAS) algorithm is a new intelligent optimization algorithm proposed in 2018 [23,24]. It is a variant of the beetle antennae search (BAS) algorithm proposed in 2017 [25]. The BAS algorithm optimizes an objective by imitating the foraging behavior of the beetle. When the beetle forages, the two antennae on the left and right sides of its head receive food odour from two directions; the beetle then moves toward the direction with the stronger odour.
The BSAS algorithm improves on BAS by expanding the single beetle of BAS into a beetle swarm, which greatly improves the optimization ability and stability of the algorithm. In addition, compared with the BAS algorithm, BSAS adds several parameters and improves the step-size update strategy, which greatly strengthens its ability to solve complex problems.
The intensity of the food odour can be expressed by the function to be optimized (i.e., the fitness function), so the biological behaviour modelled by the BSAS algorithm can be written as mathematical formulas. The search direction of each beetle in the swarm is randomly initialized at each search, as shown in Equation (1), where $i$ is the currently selected beetle and $n$ is the dimension of the variable to be optimized. $rnd(\cdot)$ denotes a random function that generates a random direction ${d}_{ir}$. The two antennae used to receive the odour intensity can be expressed as follows:
In Equation (2), $k$ represents the current iteration number, and ${x}_{li}$ and ${x}_{ri}$ represent the left and right antennae of the $i$th beetle. ${d}^{k}$ is the sensing distance, i.e., the length between an antenna and the head of the beetle. ${d}^{k}$ can be expressed as Equation (3), where ${d}_{0}$ is the minimum resolution of the sensing distance, which needs to be set in advance, and ${c}_{d}$ is the decay coefficient of the sensing distance.
Based on the concept of the beetle swarm in the BSAS algorithm, the position ${x}_{i}^{k}$ of each beetle in the swarm can be obtained as shown in Equation (4), where $f(\cdot)$ represents the odour intensity (i.e., the fitness function) and $sign(\cdot)$ is the sign function, which selects the more promising optimization direction for the next iteration. ${s}^{k}$ is the search step size, as shown in Equation (5), where ${s}_{0}$ is the minimum resolution of the search step size and ${c}_{s}$ is the decay coefficient of the search step size.
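The exact forms of Equations (3) and (5) are not reproduced above; the sketch below assumes the standard decay recurrences of the original BAS/BSAS papers, ${d}^{k}={c}_{d}{d}^{k-1}+{d}_{0}$ and ${s}^{k}={c}_{s}{s}^{k-1}+{s}_{0}$ (an assumption, since only the coefficients are described here):

```python
# Decay recurrences assumed for Equations (3) and (5): both the
# sensing distance and the step size shrink geometrically toward a
# floor set by their minimum resolutions.

def update_sensing_distance(d_prev, c_d, d_0):
    """Decay the sensing distance, never dropping below resolution d_0."""
    return c_d * d_prev + d_0

def update_step_size(s_prev, c_s, s_0):
    """Decay the search step size, never dropping below resolution s_0."""
    return c_s * s_prev + s_0
```

With $0<{c}_{d}<1$ the recurrence converges to the fixed point ${d}_{0}/(1-{c}_{d})$, so the sensing distance never collapses to zero even after many decays; the step size behaves analogously.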
In addition, compared with the BAS algorithm, two probability constants ${p}_{st}$ and ${p}_{pos}$, both between 0 and 1, are added to the BSAS algorithm. ${p}_{st}$ controls the step-size update: when no better position is found, the probability that the step size will be updated in the next iteration is ${p}_{st}$, and the probability that it will not be updated is $(1-{p}_{st})$. With the ${p}_{st}$ parameter, the search step size is no longer unconditionally updated and reduced, which improves the ability of global optimization. Moreover, to ensure the efficiency of the algorithm and avoid poor results caused by an improper ${p}_{st}$ setting, the BSAS algorithm introduces the ${n}_{st}$ parameter. ${n}_{st}$ is the maximum number of invalid searches with the same step size; when the cumulative number exceeds ${n}_{st}$, the search step size is forced to update in the next iteration.
Similar to ${p}_{st}$, ${p}_{pos}$ controls the beetle position update. If, in an iteration, $M$ beetles in the swarm find a better position, then in the next iteration the probability that a beetle is updated to the best of those positions is ${p}_{pos}$, while the probability that it is updated to a better (but not the best) position is $(1-{p}_{pos})$. By setting the ${p}_{pos}$ parameter, the disturbance of local optima can be effectively reduced and the global optimization ability improved.
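The two probabilistic controls can be made concrete with a short sketch; the function names and the uniform random draws are illustrative choices, not taken from the paper:

```python
import random

def should_update_step(p_st, invalid_count, n_st, rng=random):
    """After an iteration with no improvement: update (decay) the step
    size with probability p_st, or unconditionally once the number of
    invalid searches with the same step size exceeds n_st."""
    return rng.random() < p_st or invalid_count > n_st

def choose_position(improving, fitness, p_pos, rng=random):
    """With probability p_pos jump to the best improving position;
    otherwise jump to a random improving one, which weakens the pull
    of local optima."""
    if rng.random() < p_pos:
        return min(improving, key=fitness)
    return rng.choice(improving)
```

Because the step size survives unchanged with probability $(1-{p}_{st})$, the search can keep exploring at a coarse scale for a while, and the ${n}_{st}$ cap guarantees it still refines eventually.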
In summary, combining the mathematical expressions and biological characteristics above, the pseudocode of the BSAS algorithm is shown in Algorithm 1.
Algorithm 1 BSAS algorithm
Input: Define the fitness function $f(\cdot)$. Set the optimization variable x and determine its dimension n.
Output: ${x}_{best}$ and ${f}_{best}$.
1: Initialize:
Initialize the position of the beetle group ${x}_{i}^{0}$ and the probability constant ${p}_{pos}$;
Initialize the search step size ${s}^{0}$ and decay coefficient ${c}_{s}$;
Initialize the probability constant ${p}_{st}$ and set the parameter ${n}_{st}$;
Initialize the sensing distance ${d}^{0}$ and decay coefficient ${c}_{d}$;
Initialize the maximum number of iterations ${k}_{max}$;
Set initial optimization results for ${x}_{best}$ and ${f}_{best}$.
2: while $k\le {k}_{max}$ do
3: Generate a random direction ${\vec{d}}_{ir}$ for each beetle by Equation (1).
4: Calculate the antennae positions ${x}_{li}^{k}$ and ${x}_{ri}^{k}$ of each beetle by Equation (2).
5: Calculate the position ${x}_{i}^{k}$ of each beetle by Equation (4), and calculate the fitness function value $f\left({x}_{i}^{k}\right)$.
6: Compare all $f\left({x}_{i}^{k}\right)$ with ${f}_{best}$ in this iteration.
7: if $\exists f\left({x}_{i}^{k}\right)<{f}_{best}$ then
8: if $a=rand\left(1\right)<{p}_{pos}$ then
9: ${x}_{best}=argmin\left(f\left({x}_{i}^{k}\right)\right)$
10: ${f}_{best}=min\left(f\left({x}_{i}^{k}\right)\right)$
11: else
12: ${x}_{best}=sample\left({x}_{i}^{k}\right)$
13: ${f}_{best}=f\left({x}_{sample}^{k}\right)$
14: end if
15: else
16: if $b=rand\left(1\right)<{p}_{st}$ or $i>{n}_{st}$ then
17: Search step ${s}^{k}$ is updated by Equation (5).
18: Sensing distance ${d}^{k}$ is updated by Equation (3).
19: else
20: $i=i+1$
21: end if
22: end if
23: $k=k+1$
24: end while
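Putting Equations (1)–(5) and the two probability constants together, Algorithm 1 can be sketched in Python. This is a minimal illustration, not the authors' implementation: the default parameter values, the sphere test function, and the simplification of collapsing the swarm onto the accepted position after each improvement are all assumptions made here.

```python
import numpy as np

def bsas(f, x0, n_beetles=10, k_max=200, d0=0.01, c_d=0.95,
         s0=0.01, c_s=0.95, p_st=0.5, p_pos=0.5, n_st=5, seed=0):
    """Minimize f(x) with a beetle-swarm antennae search sketch."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    x = np.tile(np.asarray(x0, dtype=float), (n_beetles, 1))
    d, s = 1.0, 1.0                      # initial sensing distance / step
    x_best = x[0].copy()
    f_best = f(x_best)
    invalid = 0                          # searches with no improvement
    for _ in range(k_max):
        # Equation (1): a random unit search direction per beetle
        dirs = rng.standard_normal((n_beetles, n))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        # Equation (2): odour intensity at left and right antennae
        f_l = np.array([f(xi) for xi in x + d * dirs])
        f_r = np.array([f(xi) for xi in x - d * dirs])
        # Equation (4): step toward the stronger-smelling antenna
        cand = x - s * dirs * np.sign(f_l - f_r)[:, None]
        f_cand = np.array([f(xi) for xi in cand])
        improving = np.flatnonzero(f_cand < f_best)
        if improving.size > 0:
            invalid = 0
            if rng.random() < p_pos:     # jump to the best new position
                i = int(np.argmin(f_cand))
            else:                        # jump to a random improving one
                i = int(rng.choice(improving))
            x_best, f_best = cand[i].copy(), f_cand[i]
            x = np.tile(x_best, (n_beetles, 1))
        else:
            invalid += 1
            if rng.random() < p_st or invalid > n_st:
                # Equations (3) and (5): decay distance and step size
                d = c_d * d + d0
                s = c_s * s + s0
                invalid = 0
    return x_best, f_best

# usage: minimize the sphere function from an assumed start point
x_best, f_best = bsas(lambda v: float(np.sum(v ** 2)), [3.0, -2.0], seed=1)
```

Since $f_{l}-f_{r}$ compares the odour at the two antennae, the `sign` term always pushes each candidate toward the better-smelling side, and `f_best` decreases monotonically because a move is only accepted when it improves on the incumbent.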
Compared with the BAS algorithm, the BSAS algorithm greatly improves optimization ability and stability by increasing the number of beetles. Furthermore, the BSAS algorithm introduces additional parameters into the step-size and position update strategies, which makes it more likely to find the global optimum. Because of these advantages, the BSAS algorithm is widely used in many fields [26,27,28].
2.2. Variational Mode Decomposition Algorithm
Dragomiretskiy et al. proposed the variational mode decomposition (VMD) algorithm in 2014, which decomposes a signal into $K$ band-limited intrinsic mode functions (BLIMFs) [29]. The center frequency of each modal component is ${\omega}_{k}$.
Firstly, the VMD algorithm casts the estimation of the center frequency and bandwidth of the BLIMF components as a constrained variational problem. Then, by introducing a penalty factor and a Lagrange function, the constrained problem is transformed into an unconstrained one. Finally, the alternating direction method of multipliers is used to obtain the optimal solution. The VMD algorithm can be regarded as an improvement of the Wiener filter [30], and it effectively solves the problem of mode aliasing in the EMD algorithm [31]. The specific VMD algorithm can be expressed as follows:
To begin with, the signal $x\left(t\right)$ is decomposed into $K$ BLIMF components ${u}_{k}\left(t\right)$, $k=1,2,\cdots ,K$, as shown in Equations (6) and (7). The instantaneous frequency ${\omega}_{k}\left(t\right)$ of ${u}_{k}\left(t\right)$ is presented as Equation (8):
After that, the Hilbert transform is applied to each mode component, and the spectrum of each BLIMF component is shifted to the baseband corresponding to its center frequency ${\omega}_{k}$, as shown in Equation (9), where $\delta \left(t\right)$ is the Dirac distribution, ${\omega}_{k}$ is the center frequency, and $*$ denotes the convolution operation. Next, the Gaussian smoothness of the demodulated signal is used to estimate the bandwidth of each component. The constrained variational problem is given as Equation (10):
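The demodulation in Equation (9) multiplies the analytic signal of a mode by ${e}^{-j{\omega}_{k}t}$. The sketch below illustrates this with an FFT-based Hilbert transform; the 50 Hz test tone and 1 kHz sampling rate are assumptions for the example, not values from the paper:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT: zero the negative-frequency half,
    which realises the (delta(t) + j/(pi*t)) * x(t) convolution."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# demodulate an assumed 50 Hz mode to baseband, as in Equation (9)
fs = 1000.0                              # assumed sampling rate (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
w_k = 2 * np.pi * 50.0                   # assumed centre frequency
u_k = np.cos(w_k * t)
baseband = analytic_signal(u_k) * np.exp(-1j * w_k * t)
# for a pure tone at the centre frequency, the baseband signal is
# (almost) the constant 1, i.e. all its energy sits at zero frequency
```

After this shift, the mode's bandwidth can be read off from how quickly the baseband signal varies, which is exactly what the Gaussian-smoothness term in Equation (10) penalises.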
By introducing a quadratic penalty factor $\alpha$ and a Lagrange multiplier $\lambda \left(t\right)$, the constrained variational problem can be transformed into an unconstrained one; the augmented Lagrangian is expressed as Equation (11). The alternating direction method of multipliers (ADMM) is used to solve the unconstrained variational problem, and the saddle point is obtained by alternately updating ${u}_{k}^{n+1}$, ${\omega}_{k}^{n+1}$, and ${\lambda}^{n+1}$ in the frequency domain, as denoted by Equations (13)–(15), where the component ${u}_{k}^{n+1}\left(t\right)$ can be expressed as Equation (12).
where ${}^{\wedge}$ represents the Fourier transform, $n$ is the number of iterations, and $\tau$ denotes the noise-tolerance parameter. Finally, according to Equations (13)–(15), ${u}_{k}^{n+1}$, ${\omega}_{k}^{n+1}$, and ${\lambda}^{n+1}$ are updated repeatedly until the iteration termination condition, shown in Equation (16), is met:
In conclusion, the detailed steps of the VMD algorithm are shown in Algorithm 2.
Algorithm 2 VMD algorithm
1: Initialize $\left\{{\widehat{u}}_{k}\right\}$, $\left\{{\omega}_{k}\right\}$, $\left\{\widehat{\lambda}\right\}$, and $n$.
2: Update ${\widehat{u}}_{k}^{n+1}\left(\omega \right)$ by Equation (13).
3: Update ${\omega}_{k}^{n+1}$ by Equation (14).
4: Update ${\widehat{\lambda}}^{n+1}\left(\omega \right)$ by Equation (15).
5: Repeat the above updating steps until the iteration termination condition in Equation (16) is met.
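The loop in Algorithm 2 can be sketched with a compact NumPy implementation of the frequency-domain updates. It is a minimal illustration under several assumptions: the penalty $\alpha$, the linearly spaced initial centre frequencies, the normalised one-sided frequency grid, and the two-tone demo signal are all choices made here, not values from the paper.

```python
import numpy as np

def vmd(x, K=2, alpha=2000.0, tau=0.0, n_max=500, eps=1e-7):
    """Minimal VMD sketch: ADMM updates in the spirit of Eqs. (13)-(16)."""
    N = len(x)
    assert N % 2 == 0, "sketch assumes an even-length signal"
    X = np.fft.fft(x)[: N // 2 + 1]          # one-sided spectrum
    w = np.arange(N // 2 + 1) / N            # normalised freqs in [0, 0.5]
    U = np.zeros((K, N // 2 + 1), dtype=complex)
    w_k = np.linspace(0.05, 0.45, K)         # assumed initial centre freqs
    lam = np.zeros_like(X)
    for n in range(n_max):
        U_prev = U.copy()
        for k in range(K):
            others = U.sum(axis=0) - U[k]
            # Equation (13): Wiener-filter-like update of mode k
            U[k] = (X - others + lam / 2) / (1 + 2 * alpha * (w - w_k[k]) ** 2)
            # Equation (14): centre frequency = spectral centre of gravity
            p = np.abs(U[k]) ** 2
            w_k[k] = np.sum(w * p) / (np.sum(p) + 1e-30)
        # Equation (15): dual ascent enforcing sum of modes ~= signal
        lam = lam + tau * (X - U.sum(axis=0))
        # Equation (16): stop when the modes barely change
        change = np.sum(np.abs(U - U_prev) ** 2) / (np.sum(np.abs(U_prev) ** 2) + 1e-30)
        if n > 0 and change < eps:
            break
    # reconstruct real time-domain modes from the one-sided spectra
    modes = np.empty((K, N))
    for k in range(K):
        full = np.zeros(N, dtype=complex)
        full[: N // 2 + 1] = U[k]
        full[N // 2 + 1:] = np.conj(U[k][1: N // 2])[::-1]
        modes[k] = np.fft.ifft(full).real
    return modes, np.sort(w_k)

# demo: a two-tone signal (assumed) separates into its two components
t = np.arange(1000)
x = np.cos(2 * np.pi * 0.1 * t) + 0.5 * np.cos(2 * np.pi * 0.3 * t)
modes, w_centres = vmd(x, K=2)
```

The denominator $1+2\alpha (\omega -{\omega}_{k})^{2}$ is what makes each mode update a Wiener-filter-like band-pass around its current centre frequency, which is why larger $\alpha$ yields narrower, better-separated modes.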
